[Binary archive — not renderable as text. Recoverable metadata from the ustar tar headers:

- var/home/core/zuul-output/          (directory, mode 0755, owner core:core)
- var/home/core/zuul-output/logs/     (directory, mode 0755, owner core:core)
- var/home/core/zuul-output/logs/kubelet.log.gz  (regular file, mode 0644, owner core:core)

The payload is a gzip-compressed kubelet log captured by a Zuul CI job; the compressed bytes cannot be reconstructed as text from this dump.]
z=)<\15+QN}nxT]%?Q |Z\:rxbH]QQ@CiqnF O/.6?sKt5 HjYRD)Hd2 -%]A#-to 4`t!l$0d1\FOW)tWGcyeN@\S^WS{FdwD b+s=Z32|͂kYKu20V,[2L%e#,Xk:" MqlULf"dS]04<% K2 SRkkVhgYkOZk=}.{KJ%; lSkx lsx_b+/8na-Σ`aff|,{Xvӝޠ7)q8Nyl|i 3hQ <%UfNs<\ҍz_]њW7YWq߬}\k=2M+KkL~z\#;4+m&xgc֣z8ebSl[Q!&+|:8h Vd!POeuA$2^%/J>%2nOդ蟻[HG5Vv $agrZY`SFq8*IPd7=ܟ@=yi䲫jvߺ` >Ut:#w"\קj|u;f :p>%y.Y ܽ=,h/,I7`K5Q[|)Hߣ6knQer]^ ! -vdCrt_s Ug'?zݓwί@ {vc mig76l`X<ËIӪdOےa,"Ma=z/p/? zihם'S\ P֒n<9:[* F't2mci r_و*OO>u"?W=*Hocg^lrʇ1-1hh5yCʗ͇azvSp6 -j"ݜ=ҽatY8w@՘ |q lh貦\D/e's.XfGXp:NC$?A`U,2fP<;| ƒD,O1͘xxDg݉oW;fB]x[X`UoޠOfl6_&,*-{Wޗ~oA`x?H~`]L$;ђ fBS@XA24مш6lkJLtС%\zIs(̂Jq+΢P"&.]xœ\^U(>?l0 | -E3LhsC:03&Ry%] {x~.ì(ӞJDP"di ~]dHU|Nž-MܑnzeyI3 v-[ꊼv,\w.V ϱMUĘ . :j6觪ei: I3NF>Ո&[+5 ɄUyrTy}2<]9+u]/s ye*IC76'{,|f6s|"GrBPrg7pTv>Б5pӨޫXIx߲n1T}ĝQC(j־ Q$@bE'H'2|.6ߞÿҽ,ODG٫ӥjԖEmCG,QRTV#H b6B%=ܭg+%V8k`_6 !/hLrEDsL`UXl ,%kQŷ/A[NcW]{q=ώɭ96會e7^]AP1jW0 tz;6w9FIeܴrrSP#@Aq_%"/I 3XZ#>dPI}r=r)V2s$V07'˺:ذ`Bf9Z鍩`J/{5tQ66dCmk< &ѵ6rxUK4̶X]]KRy ]u3~ c|7+B}$d[7lF Dp.3Æ+R0h%cbІ.'!E$Lc<,xiVUJlּMh'UUʩaT̲[ʳѢauې oW kB*Zn}C6ʪ)碸Bɛ [/h[ đQsyT|rBBUʒbٰZ]0'MĔ1Bm)0Ȏ dU)V|yZ#i|eq4]FE`k5!9tRhnO03n]s )+TZU-_Utd _bmBM}ž"Ȉ`6%u5VM{^jZj"ſ}q*\-O!_! $7V*P0֚YE ??ֹD{ 5h0<+/b#F!c] 0- dĮcSG4LUvm؎ !U Ͱ驎`" 긊mDLlh 5%XpQ$ƒH.`<\9ţͤy_Ͷ@#i$*J,+.0<]ɀ2M\y7 h SQ K/j2Z0c:,jϦt Wdԟɡв~fL j< &4B EFmxi]zar!/+ 3-W[,2C^CgW_2 GӖ5"YaVi$Hv^X&Ն-h=Mo48zs*^9[Qq:k|nEnԊvf{ȎqG8(Mg bqlyJOT)ogU!DPhx4Gǔ3}.Hb&AG|uٝQry=~cwCa:^OǬ^w^#/뭥!Q6ZTt .,Ub3;S;Uz(La`uB..p56m2WX'lߊ%`r̖HI{jNHb4סt\>۩? 
\޵6r#/379@ಹw %M"[OqKl-tSlv"U-&mխn|(cZ4&+EBo n|S͇<PUљ\z_96 ]) xǩeRyĈ9BF#cLs4Qk!g@vNδmGg&~uV3;tȖi[rh5Z`(3*}7hQ9b0XJBԀDW>X|9x_;[|[xij%>#AQݞm1ކ GzpIR"I4E"?\RbC~Q^_RV@'&;sL(Qm/#6%ёaYqGJ-J)~{~OB@;@(-[NӫiVF|1\opm3l5bj ֖Q*:"B+3$Za`{`ד4j၄*Xbxhς8ҍzV?eafHg~ox'&Oe&#Q eZ30ZFLMVHKDo;2+9X)5)XFg' qf%*JY*o5cg;٪x9;CkN .jzj?3yUGJF#xx#d+H-&x#32xEr83Y{)!KFMoRDf'51vf ǟnz:*0GII%$@tY$hT:"K'U:`!z [$a iU3`L1(h L>N@`kAႱ>r_:@A^U-d2dL+\B23uS/IN Y$53Gc}ؖE,iYҲSZŽ { ׫:%u,A7xw-KEOtvQ䴇v, ePD,Q\J5.~WD5vJ׽Cs 0vLrM#./u7Ѹ^bxӟtS∜OYc vw6UtnimY/Z`tY,7RnCdೀ |gwfs2 т2mE-ewΒA2`f|:˞u-r+ ʼnb0e/-,}$SoP2}?@hAdmqIFNNY=QY.hqJj#*|WԅG7@!;S^yX^1Μ,;4{\ms71 i@]#ʛl!Pj?{'wmaVMT3xwlx0 uD$E_aFQ[QTw'=-6 e:a%Zta=ʪ㨺2m%JSQSz0"sZG.|[83JmJPFP/2E20p/0\3~bXKfsZZ k'jY2+hg)͈g є'@< 6a>{M\\V0c(1GvL0&3y0g##ͿuaWA:2&ZDpX&Z6mBKO@^iut'z=$9H#Cq4>ɰ'zU:;y}yNOt4D{R ,<1waʽ3øT0#4'e~7$a^dR|{wOjZ:5p <o/EK,* ZA J7RFVhvUt[}h{@!rFA#U X0YJ $>}Fg6Bj|{Ϋn}9ଓuU|1cLzz<3 |9퇥O=k χYe-ʟkG&Ht&;Bu7deOc#NJk@<;]fZMЄ#UF穷q1bPH*ؙf2u^<K%Д~g§HރJ3YaL$3|P???ịUh4BkM̨8( F刊`) NP]Kݍ,KaUӸDr+-_e r ]$OP7|"P*hf)}zV8eai/nLNS x$TVa S띱Vc&^ˈiDk45[!-mYWIͲKs`Ԥb <* 9N|0ar8{+e\|ՌeiuD,M_&bDf +!˽奚D޿L1!G=7\m飶ěUAD^ء,]L8)7I+D7Yw6mp843T` OJj-$* %^Fґ0хX:kgʾ \H"cfGat[ Bxo5'X$cZ@ QzHLdu4(&9bCƶfI˒%혒vvS-@ި^Y>f彍r߿7o+FPgri5̘Q@+cMrn?(RlQ80Xb֕y?ҵS\Ɔ7udn?5a|P'Y)]5s.^g'rMEe:RY9gjg*D<^|Շijnv55T-;s.aeӥn!)nq.qكc`)MsF["iIXT7_Tu%yspLLryAmRx3jg>L8%tZ+IYv^@vtW-y#xn 3G+KlˉMmDb#0bhRͶ"",8+/鿔%V|^Y|h$Sj=<)Mie3{,]Ρթh4MLF{(^? ~]Lpokfn?Fm9GNmv/!v75%ûE{=/Kw@K-wCJfݾЬgLkE.!.uSZrk^tn}ԡ64^}ɛ&G kov1-H7 T TqKfԂs;Lh;#8Gɞ5Q%F]rݓe%ŎbܮqZYK0X`BID&pI+KFMsF}fMr9J.@=]k&Y52C̡Bl8}@D) ^Fi52_ ԲQVK{|ZMrb27djr5V.]o5X 9ܗhOWzLf1fLǕ$*l**v.`ϽBY%VB+PFJ)BIզ Q7SQeM@ %Gi Cm3E XmMR>r7Lzm&o9C;!qstb+sf-j>xQRi(e0c]j&_4?"'B58w޴_oɪxM_%9,xFr&.Y7`ROW]Ao?rEk6KL}YccN3?7Ż ՜R>]ϗ~ Xa3+p$kwTSb'.;<#uvZ?JI|,ŷe:qj=OzqÄjq ]k? #V{ɱ(bHKΐS4a-XA䮂jg. 
1д ̝nDtC;(˦9hD7i[?I4+vvx]\.:>EaWƮpXή &u,l(%;1;˜D[]a0s-!J]~#@ŻsxuzW- b?Zb7鯓Mj{nCK^׍X7z'FOQ$=\+M(IOLQRj[iIOLzJHJ+,t0tp-4ҕ V+sl¸x 5(ک~IQKy=A}k7l >]]{7sn W£uj-΀`'b\htCq:kO߮Z]_h~O4R}JIE2NpTC{S QJڛj4$Blw{sb$NI<kطUv#-M'80mi`Ϙ v+0+>n^ pu־nhx6E464^5.&<yv.tJnZ`+C.=Q8Ew¢s*#Y]Nv9$BlCɣS.9e%ly#d`RrW܍>l\mņȔ1$U4 JE$q<4lHL&$Ce-8맏۪۟'} DUZ8ognZ#:=A*ܰBᄥ}Xa)b!EѕfT` 2L~ZNWR螮HWZ B'? Z ]Z =DI.ҕBQZPRť>(F+CY@;'%rΉhq Dit.V?KT9A+K5 ,VI/J:^^élߺضFzhYKaz(d =]*zl.l@t㘪)BRBWVk QUm*BLBWV Q*U[͐A, 4*J{ JDOW+a< +K]!Zn}+DY:m;t%&$ 33p3ERѕVpA+E0v@ΠZvҕ!\r]`Id„&|+@Do]u,†DW `Stp ƺzCWQPjio,ۮY[nˡzh/jf]Ci=~.xy3bdFJa eCZˁ ,:Q1>]*?<1^-\H䳂^ɝHZɘVTT@Hg ӯ>HpERSi2E$}leb~XFb:BG`["D|4ؠs#-ovXFQ!S~MpB&@߰(w{nӕΡeE\V٣.udTÍ 6Иk-EAhegcrlFk䗄̃;P-ee~{;wKGgTˮܹ$0(Cv؊_7fr\Wo @]kSԗ9 ]!JCz ]q<P ;]!Jz ] io±5m]C{몋t%DDW8P;}7]Ϣ+@k 骓t꬜KWښ++H0wBxkOWݡ+p+l± &h5dQVAg+!YWX]!J{ ]16CWWc]!Zi}+DiOW+n2BJBWKXo]uVـJXCWWP j;]!JǮHWX^yXL8Vu5: Ʌ4Z5.)y0qj?-elyPK{ ^7T؀zՒrٲydd%! `Ńqf؈xDV.v]le !Esm.o,O= TF _E} |s;d1Χa<_7r/o\,U6^O?_RfgEr` LsshyWQaWJd Y6AZ(L.̗G_c5^ppgGF1꘏E#/QqpAFսJj5|?}]tOF_ c}Ãz}ͫOhul0;4:shԊ/+>fiW}6nVc)Q8uҔv|9gU1/ p\nьGew*={Ӭa ݸVZ%*N PUy7UH:hF.vI.r<[fuܢG+N{(_h>{amȩ{KކL|xYGCԼM1885U0b ZLPETCLs杚S6\ި˗#jSh;s=ͯ"m,_k(UKhS>1ZͯKL "{f̙ & `24R؝RDɐoUQnC+@K%}y.ҕVy+KW+ ]!\Ly@+=]uRZ +KU(th&UwbvH!\L 3yg{:5ֻ0lYU=Jж'&J]tE{:UT+JY@t 0EJZ;]Q,@{ ]1m XWP ^t(骃tű  ǺB&BNWUJ. pyWj ]!Z+|+ATOW+))uX ׻x-Ar?Kep/o:lېery|c]RM89À=uTmvk9BW_] |G\9˫ cYZ/;]D_˛~˵:~T[_ $E[1[ywzO%.**4gͲ>r8^d4{x=MfqM-nb&՛CiB05،И0ӂOT4Ȩ!i} g (~}(wGxeR?dwo),&Mj}ROqo$,dQѯdiٻ6dU@Z`,N%$kyX`S"-?ws$Zcô4죦Wգןׇ>y#n/Ί%z6-Ӣ-Vy >Zrj僦f52Baż;9ʼ0=]P䲵3K͗5-SAI亦L7 غjD ǷU<._:}YoC @@lh%6< wY я]Fb<'u! 
f'AsۋUУ\ 4*?'DB !Z+r6; rKKr K5UT"e(ԁ͠T H렴, yҿPdg)҄ #r;ĢJhy32=nc ?g;n#"/Uqƣ>,4玙(sjl I@S)qRId8ʩQ"|Ns쬗N[%<63W-3P 80p9m|P/c"0 _!)x#}ԫ|Z/z^w XOs~7]ۢ#{K+PL6 ,bpxseMg|<65,&Wn9 sF7.91.( V&$c k`56*Dƙ0n$Ηx ?ih *& ੒:`s^% ؔIV(hAd"Kg1cPapOFVº&D9)AkJ0)W9CEt, (UO9b폵4#HȔ 0y(ÖY4u{]-iA) E'H R^Õ y*Ħ b$L}6߇=.N0%<H]tPE߃5z#o},"bT̕0Lǿe̴{]K]R84~\e1֌=͎(~B4@(Ąaz2' PaN nO3PW'q`PUV%g%xze;Q 7AC\UDS科 1Ty{9-IE%ȳ{z`D0< ܻkAu071\ZqKHqK }oeNU^Ywck,(zvQ=x iy̕A-3 .9Ն?_ϊ7v8' _sz*o쉋=]5uCQefypT(`b^GmMUmr%{x{$zm x1iRG"CWe)9!QpGAT-J-/O>|ϟ÷߿@oo>[Xu7͋,YX}?Wz{ӷ]Gjzv:[<ҳ W`@2?0L_'pc0;]ϛZYufZ+ѯ0mN߿+Iy _B)_]o1^1Hh"1[ShRF27k0zmд1:rp؈^v;i4 Fϔ^QG*Qpi$4`*jXUct,hc$ km\إΨ{lH Q!꟟"*ҜM3B橰YeFKV2$ ph5&ޚmB 5ٷqݺc5Xq"x) r}/݊b6X_;$tPHIH  mCnh 8h<È7AI9#>_"0hS%Ŝ]~5;'OUH,t` 'm#BcDԂh5D,=&l_{!~+9]z3kꂺ;ijll7Q<1C*&HxQB\/O8<hXO ։~bt B[Tiox(@y eYa r?;357Z(f( Xz4icI%tQ24^/1yK㵢O7>E7b1 VCKr4rZ$ ~F+L^K'1p$#ML'FdkmϫbzZnki-f|Qm),:GTCZ.f tmy^Nz3pҹ6ۏFS~+/@<+8=6^h,I-X*cHD!AJkιά b v_^T3 KST| I]EPm^J{W> {=(쵟>f|y~v7D$52B1ȸp$KS%JQ*d-ylېʬ=)FYj:'5/;ZNQ Ean1:#p(}#0 e|.qO8H[0 ="2TS(VĦ;C3C3Og1!eLXό'!JʘĉL-C6pQ7E\\Hi&co#]BV(u3oir܅l:AR& I\h'7-qrXWTwsAGrg;@=Ө[u =4y:b1Jibb 悷yNWC0*YìN*Gek%[x{K󹅓1;ǃfwj"]9#+(f -Gei )o|7u6-5$_+e6Wf9VcUَi;ּ~vXtRۂZZXP.:$F(qDq4tJܪ6"_:Iמ5Q:+7Ė.|۩6`08?T0fj9h L*/<(jJL6rDŽm5۳*IXRY 1,j,*4R%1օ$t2W$cFP$aAq QƜ4jRU{b?|HES?gcs()G_4{ +6ÇmtL鲼(Xm$GFZ8Γ p)[y"ERQ$k|A{χa`?K4Ks9Mƃ6U8P"p@F XCu *y)C$6=^Ctvlgc+8ٝYZFWKigmD<0v#8R )ZVA?cTs#DH L0K!;dk2tI/b,ʹd4 ~+> 04.!zQ+E'//~#r :Yd>k8%TU(G/t9;lz4QW%w/9 ̒.䦓g)7DĤp bcDQ),)qDgz8_!|2$#A ӅD& ҕrI($X,rsԩ>rPOSHQ%An'O1ϡ'MS2R xe9h6禎?N 璗k-yGfJbJb^b%C'WN,Io7$'/ DHZ(;>Q%un|x gsp͇ bՋ>ܭGtO'^n7͢6їz=r[B^M]37|aHMvZa/gnɻƱ`2L6x>}W<:vJ[zCzڌU Sg!y]syr;Ӄ|]'/兗'|ZNv GXX{aWݳnLo˗]\; qW06JVKb?A<<=6^ev~ >a$I~$?^g&jlsĤ[ʅ)5g,?aw:j:w9Yb_]>H9i/[f05ٱ]1a*WVz-1Y%1Ӄ.+/?5_~Dn|7gLFǓq;miQ/^ة<\R m|s魕h<M +j9=i`)t~P{tMt 0TZ ] !._7Py*j֋RW[ ]E_L3;Ph]ziGA6뇪3Gqxt4L3x{W2u6Lۯ}9eh}z&釛htPfOv7wV'gyg ⿭ƭ]+4͉%rxyÕ5. 
gͭ OEJOoC['==MG_ūk aQUpdX*Drz?aX&p?-{C}V&sC1o#ǖj?}C["=d Лvdu]Mgtq+:ƯBʛ}Ͷ?~a7w Y{wxYؓq~mux\ۧ+RQwW#гn*d UIYFR:ښ']?#YfTk NBm#K֧+jhmتUuX;iQ1'Bk@ Ԝ$=hZ쨦Zv,igPJJAu@r=H}FoI0'mZtjShіrePƚo&d,BZZ V\j() AԤ^!pFc\=D@$XvZ)Vէ ƐIu) NԍhMɀ1!Kut:ZN(!i׆R XJ2P4@ZE36XvckC3'XEA)1ٽTTlPtA[w Z y9QsM+`2~0P8jeWPb${1lt XBՕ)Hĵd 2Y6:vWgnM x2NU֣(հ֞ C* "ջV8VqW&)^b ~N`K(H JjV\)0 Nsd Aq)t7J%Ce:|=Dzgb2 V̼޴X =ಮ ++eq2FN[@h ` ("2m Y1B:BѷL : 9 q& Bl nKB`@Y\O 4XrPVdbBѬ A{s m*3[(Q@HseԸIj gYV(h ;:=63ROƮ+QWa.5}Ϫ J H[. PϨ`[Qzj yQtМ`d!"1۽BHy V*КeM236>l<\۠ŌTUĬsDq2|cL±yUC:iF&},9n?Xluu8UA7d=2[˺t4M;d&xs:p4(6B[0uy _t2&@ݚ2@vTm~tXtu~V% 9ՠQI5%PUUɃ d-їةi!́qzwGT>COe$XXktdD !;KSv9hP)_C߅^c A=n;H Hq3M=`Dz*a3m>%t MIJ X}t kR([|PS`y&˽eg:V٢a:ZA" td!3vBgV*j<(ͮɀ!JP CGRE*{0 !뮒BiYfJu;Qb5\,ԚϽ'hV', MGȆ4YICxnڀJVڎWoU5^"`!-#6HªL͠>W3Q%/lmՎ6< cs;?y?t9?\O%֯iO5љ$; Հ ԭwH7-4@'eJac`ǿNd2Y]ۆaMƻ^QzhЛAi;Fbz6l}YeFbaj4#paw:H9-0 ᧀLFe*1Q`2z hNm#hc1O+ҢǚAĦUJMm3i^'&X:lFb\Ρ]Yg4ϠBB5oZ [sE OU>Κ`*G0hMEkaZ3^F\hRčrzGzޓ0j318 8XkW$ / p \t*N6fTS.\;8 "ˡꘕɱIБdpIԅ]0p$d Jk|A;oBxb!h92$R~r7\l:QUJVKY*@+7ٛNgֿc$g[Ԫ A;J}Bw^qEW pηPuLgy~8og9&4?^;O!.[k+෼ҫBK?ǿ?wk$?ܸCJ>Pn*~7n%~(CJP?%~(CJP?%~(CJP?%~(CJP?%~(CJP?%~(CJPz~(PKCpCnx~{?P?[C>oQP?%~(CJP?%~(CJP?%~(CJP?%~(CJP?%~(CJP?%~(C}~(G.P #\C-Cz~(0?%~(CJP?%~(CJP?%~(CJP?%~(CJP?%~(CJP?%~(CJPo!PrPlZr{J?ԛCQP?%~(CJP?%~(CJP?%~(CJP?%~(CJP?%~(CJP?%~(C-? 
1ϰD!ɾ[g} p=-ƾ!}+jX[ߍ}8G5:]=ptLWkWǡuq(M#]=uuᛦsjEWcٹVKք}R~HWw +cEW.Rj5qj0$&[ɮKїI%@Ucdk1WX~#% V:b3=tόz䚮֐J fJu fU jt(i]#]uL]!`è ]!\Ӫ ]է+D5]!]鲡[VJMB UV*:]Jf:ҕ(b+tpYef:]!Jݪj kUZDZ­re+D셁t%^_5]!]Y*bȬ:tp2sWVW~eyTCW쁪g2Uj!=Tbp{WBhmep1Zt+VUQcgK]` IV%kttE sW 6}bpU*thut( 5+M]`(+U+Dk)NWxTttTjhJa2G2tp5*thZ5]#]i*UfQ PP.%v9ѥa9B0*K#pH6 n6ٯ)I4L}|d2#0LH;{ t竆it3 3sVTY\.v;(Ӕ`8alGs,SLƨcpQ|Tv/4pD4}rx#i<לwk-amb8` 4q O+erl4w3 6c?,c*=j3-`Q =\u|YGM8 iYL<[ŦPAҳ[6{ms:btefjIJ=﹔1j +`]ʄW1XUBl@*؀u u2Y0R:.c뭀Ҫw%]L5*0+< *U+@֪ӕgh5]!]x2^!\ *͆hՕ_oEttei23TZB2sWH*t(ڻqJ{ ڟ߻ZN;\ sW՞C]i ЕVUQ+>z kZe *thMy QZJMWkHWT3 L/تwp5E ]!Z:]!Jժj JV)DQlVJW%u+}/5Z<QD=aBUoY:O~.@J+| (K+'< >JdEKyJ z+7<"i:LڭV/H#gÍZyH' rZTUմ)}EB]O%GZՋ 4@qkx*kena)Gݬ BO$ eR*ۏW'|~x~DƄFPA x'\94h_M" Qyon'g^cђ<:B"Q56`w|. w' O*r#T%əu_rWjXP4KOE+rj&( o- L7 uTsB#.Gv͆fo7ޏtF Sׇizҝn{!h` !wyA"j"1vU Lwo#|'k؃ٺ7,'kLl,Hd&`vG.F2e{YY{Z6DO8A6l;xٻ)_[q/i#K!9qM]ю6hFMY; MI?s9wv:3%ɱpQ|qˁ'q} FGW@s);~;rw|.83 m uc>sF7}:9@^pEOowt}QoCy#)4D`4ʨiʲ$O k@Win 6~q_}3{&< nb|s" jbB_qyy4Ru>w8a)Ńp0$om1G߄OfBNI14fwH0k/{$> 3@p4Fj, *P2]_nݸ޶ W2{uyH;/Ж88ǽL ] Šmn%UeGGQ_Nr[o|:p|< ʼnh(] qGzE2  lCp+vwqeյXO370kAϹ$o fCqtYnȇQ=]g2h3X﫞aB-ڌrdݕMeE\.rG4 Ms*No 2tƹDX76NU;G'ߩ-󳴕i31$R4:twBy=1TIp0cB3'b<]UYho\g e/xf ˤ[G]Q+_%]~Zz]w[ݽo;;7U,  {]4bӡe;]m*iHJ\ zdI,IvK^ % ']M  >:'ŷ,dh&Ci.nȒ?PObzWd}tEEU,`P)M&clf$Z-'ꊭ,|U$p2`$cٶpp*a n0CěhwVݓwݭqWf-pޚRrEaXsL[zrbk[e;s4>Aes҃K g}W?𞰌\oiF({ܬsQ2έ%0wON?ĉ.I=qM8`g@`b]@^AD@-Lq)/JeZ,%h.+< şB}nse(q{EHܸ+ ffR4CY9zGt#zIP]cmTyxdxEn i@ V=?ApWq{=n c渴AڧDi1YSzؾy6Ԝi.c'a+7Nh_):|0{κFAMFF]͢m46oEށAwz`%ٷRi+p+(+^Y!–f9# Vnlnt [V'YUDha aTB 0׀v&rin(H \YRvaHB$82TRJqcf28@0 Jn/'Av "iC0B4JI$C"ىtH7=SeMG$k^=; ,yBE;}8$-ۨ:7W'0!JAOzqqX+5E70=Zr;ژmkњl܄wX_neMzbE|bjܚf;ƭ]mk.8\8;ZZ_\;)UpwD/焦/76`v`ή~MGw%wy ]fq,1 ˆFUf[ 㪼3C=t4z[n-w+LQ,IV|o+w,$>BDbClxBm 'VMp52I(UNgV躩WCljJeJ-YU+@K`(ZteUD0\! 
Z9*d&j 4)Bt ZBOp#䕧+D9C5]]Y&cR!R25Y֫BWvٮky^]8t?Pla>;]-ؔ3bpgRyj1ꊝ/@WzMWEUX" z,T1BWVd2XՓ& 6µXU {WttZ`|J JuAK+CWV5W%u+M6tè]ize RK ]ZU+@fMWHWfQ PP.%v9ѥA|ŵC<ˏF^4ȳ ) %hǩǷ|(N^Ttv/4pI:SxJ.=SG4 b1ƘiT2_N|㦌_l8CB.Ђ0Y6JX4ژ(eJD?([=M^0UehIm[Z)%+Z+ g8Wxfs8v-㻲]Jku.΍|ͪ ˗h҉e{>﹔1fՙfKh6ѲsfjB|\"X5긌:7hYLV&-o$04 T.i UUvUkceL}`TvGLt}8!lv۸QNsxe6=7^j3s<(#ME3ޡkIqHd7-#F w ,s :'ѤKFB2)M3$@V_69<2$٘0FDZ"9BZ%$L,IUESqgg<4'{'G~R:<~6~Ol 5zd %z fez ѐF@ݏ3> ➬Nn}\*5jEdږ 4,W[^Nt8%-#ٕţQ.{X*,TL7KGF`i]0y3lg+ob=fyۘ-u 냡z6dr,Ӆg&TP9d&x(k ZrRo=3دUcKQCRK“3[3r6%`ݣ#=9l_t8Qu,c\.\㡕d]2 ,{4]E+;*ZBsu'+DtEWObKDWX qY  %#rEWO0 X"/up%[3?9]!Jr"]10\"!YBsM}6]!hAD90oLWCCB>$! nZ(#֡V@hL5$RXcYGu,;7tfj}-*dv-U+ksW@K+l 3gW#4Ϋ6Uj#㿙TsH?Z ]xM]^Z^R$`9Nl럺?EA&Eӵ[N4jӧw_b W:*V~-vK@d P9,Ͻ#V(IXBQL@1uFjf]qr@$I;'Ho2VyTÀy\[F*dT$KA5NiauQBaLr2#=B`̻rioXt?˜ǁ"4c$f[., C+ "a+!Eu t0 *{'?Lk80^҂$P<$a:rVjB8Ӊ<m`rw=+ÄP6A4؄PDFX0QeD!*L9Db\H>rqm(E=9\%IA1(F2' Q1xWG pZݝ '\DĒ&bN'΄. 3:*A/tlEZ'x[aiow{*& LZ@m`H L|$Z$:e\ g}si.X c-@-A@Xr )0,1h&fD:Jyq&cd4 Yb&|_ T>~sLW[{gh | ~==+W { C/\.UR*OI "-b"`Jq~͂`ڥd0E4J80vu_<.nÒM*!D m@E0#ИrCE@NJ8 XqFX-HBOOxsxY0RynNv*.y L"Xޜf#LT< fV  " $5 "-_L5! {&S" g*V$Kq A#>` L( 0T+NψN+ V G<1 ceYs ' I|R2OAS 0lBrzi34ڰThҀߵ:sӲ_ΓUc夰T䘛+>138%NP@0…JK3p?vRs&BB%L42lhs>n.'twN.^t(0{M/nZ/iEW;^<pp~/`ԙSVUgKDm |TyW|N8st\?UNs\Gg;z'y.~F>>Dϼ}yfRF+9Q^I`i8'33M5q}-b_o ]zPahF}֙n9ڗRmGuah{r?ppC~z|S#oK 򺛗G(u*x9%퀉t ya y^_˼oz0)xǠW`x̴_? uZ[ZeE[ FZK@`˙F萄/<vL/xv;9FX4XJvV!ٙ3z/6^up:t_a39Rj=Z9h҉|6袎۹x1ʔKMUu[tp(}؇ %Q}>#t_Jj [bۜ_5lS PbpӜm}ԡ!y\-ݬuξW)\j ksPI;$&VQGTJӀG4N{7?5E7'cW2#%(CS"!1 *L3ncHY&7س7Vg:sgsK6elC}zLjɻNFmvΛyiO'}"]AR4a.xvIuF//>oon^XXH^m>Ͻ ZV[=hXdч|N`M72'`烻g(FzǷn ʜ{6:go+PT&9 p] À+εΡ/N!|7J̺cЕ\Uޡ'j,~*>amm\k65pKR(g c.!'SCO:<|h,[, ЪMJX!W+IJ+ OR>Ar(a.BW6NWrj.t xDt' `i ,th[WR]=Il`2X7U1F=u6!Wg.л=㽏DL*Wך< /„,W2Y'8㚷qRqB;?]M^kˀ~Iȵ;ze/d"1<q_1(ΫMspP"aѕ4GnԜjcd6yhaLO(T4||6mD! 
DPq W s 7Ӻ߮e.>*U2@gPi$]8H&}:lާG!g|CCέQAI5f}8$pi'C$9Nam3v~/ |1 .w-Rv¡1F'oL0]ansZɪs(fz}X$ڋc`)~޻'wOpQSuMgS]8%o[*ԤyGKȅ3JYiLόI3i±2.J[gEg>rppg<16&q]<[1Z~iٺ8cR#O$XITzʦm'N.Zq6z9^&zXe} J"[" SJ)k\Jv|Χ/#̳k!2e"(RaiZDǝK K9ʹ{IR.HVY=ejaq43A"41n|`0wpDf~>I2Ȳ c\GeO>J.2/Ϳd"ePd4Q@ .Y]Co#T3My8c{rE>C[MȀad\ 4 K ,AkΤ8/sIN1x2'V9=8r[:Rಟ {>\)OIotfu_@[wWPi|2OONp&L>dS%߬~b Lfݵsc1ٍ4 MkPjS_i:x߾Fp?p0|s 7պ}lro9f^/=|yL40^lsz7C )jl_FZ-.G_mG0yPJp? & $傩2,:#5(O^srN  ~ w='9N2ƙ4IU'WLzwD]P~zū)3|_V/܈/_F>!g^+yrV*uԶTZN2*gTBܞxDk(3'GeH. GKzYUt?)w7\Mg_/Z/(pJ*ډorC00&|Q}Y5Э'^a}9Qgekzd]^'h*ʻTrxdܑ0nHv`[ibĮNo=V<;s ɭoַhC8U^8n!TVeu -@EE埝<[Q. ڠ/֍ ͢rW)j>h}(^:0u#`m fۺ>Е1Znxߕ6->؊\֞(i{!~IY~lN в+DHKWHWZ+KtVp!X3jnOWZ1Y-] ]mD+,xc PDj!JHW8^o xW9Ɯ"h9J7CWvEc?>\|ꥠ'ZOCW4zʮAWMEOT4~ɶ 2B;]!JIZ:@q %5m]\K[z QJTr B0T]!\kBWVQt(Y "] +% +8o ]!\śBWV?Tz(j ~i]`Ku{BC+Lg +llc 74]!ZF%g-] ]"mFysM+DlKWCWU ՜`Zޘ XB޻B $zN eB=9] >qnJDD+fjM]q:]e[DJ%M [tp{NWA=]Jg+-]=]1j m]1 m ]\fCW ccNWRT{KWCWDWXRJ٦z+@iD]"]  +7'Di۞%-] ]IfnR0(%p]tp9m ]!Zl FʢtGœ\읱t{x(w}PsscHΠ$xdS0tu":T)N߃e~ =~u1mCf=sw\ՊoTuUOJhs&YX3,x&~ʷLUS43yA1旒B>&giʹi V3:iktuW̢0$R%2I$K`f)sZjJʫa)Ε{lA;vJԛJ(!d4eۨ)T[5ݭLlڠ%i Y"8 R$dL8*plP o•͆hPZf;[sŌm]`.>9k5{ (-] ]n mz+V͡+kh-{ QVtu8teqE>mB4)thڄ( iꛡ+11~Kb/<\Q /5.r7}İH^Მoz={~[Awͫ掑Qxvܰ˛k|gJwt'i?˙:v~^HSH ]Ёs]vnK{ЮHJ)+U^T]w'40"v㘄xy2R/1N#MPHFBx`O >3*XR"sI`|_)o|6x.}߻(EΟͺ$ܟz }i/CJ;u~gc/@;OC',Ͼ<(B6|WŏC]EW 8͇ٵqa0x o{*V\Xֶpc^rrЛ^Ŀkx伬ҍN`?G2Ƣ~ނ݃O$nM=z7{mWtd4zҒg& ~Zh-y^tb8̆`z=tD|0D/q:űy3@O& e?jOEl0 2"[fܸ;/gTS2]Jfjc < ޥBD2Y.KT锩:fǛ%] ҒgE6Ěw] -#VTfGK蘧>TZ3.1TTkb|ztkVGWAwj.B]%n><ٽja08J)G2: !O'UƄSf=΋ٔ2t * Pp|-fCuh]%"ldMNC*n⦐, Վ~/@z^'詺OSW+*4Agc3&lVe>fkS܊=[hGo2TG )5{5S+ Z0BҐ@<8Q;WݙimoƬ(-ARpj3)Z ) 7"x,;M#백%󣭥Z.-(W^lOENQFU]=H{hvG%~I%|]KsD=&&WYdY½҉1&ii£Zf{s{Jy{S|[FDtUЊv~Ƚ 7?oFAϬc[K,]+U)?& qɴfAGLbjS&RY鼌(e *5NKOԶtJ=I2F:NX/RgI9QȤ0A[]yo#DZ*H} ,lv ;A ЧĈ"iҮ6xUK"G],ivWwWU zxdr<{vBy5:Y,^ܦ_o%J.FH?U _ ;k0rHs.keFs ʊ&sk$H:muims;#)8RjGLb)"QH/: Sb)*yb$#¼`%б ,SŎ_jiGI@*M!sQI -2]qd4*N qx'H R]Õ!M!B6 Fv~=g \m"o`ͥm#rfѧ:Ja@-*)S$ ӱqO̴ϻ[.՛ܝ.j:yqýQqJ@AD- 돯~7}>u/~kXu#0 .U^fni\=ޏoS5?Vy̴6ꬸ h{ 
ן@5C%}<$lgMsjګnZ`dߣ]rMYoW`Mi@n_x՞6W6ViM\lu̖T6PRyIV1;cd~V;G`s'5S 3a4([oۨQ' HUɠ`U R K˕^ ;wG*MwHN6)ЯpՑ( $08DKoi4y8&!Q;Cw!4ndu#{[-]f~p\ f$ȿ|uGQUXm=_/j?jܡ6CiB9jjt7̖PHf2ɴB0/W H2" 1Hiƴ1KZ']WN4 qx2|ߛXK~1B=0  ~Vr lY+냉 eDDL Fh#2&"Xtaߞ_Y9s5Rk 咊gǟN ,Rg2DO Y"x,.)Ŝ(aHT:O.!PV m^di荇]arr?;357 |7|nwۀ"glPY?,%& |N%[*!$ \M F΋.&r`à Gk.V(d-j=]yiP:%_xgu:d=BX }C;<5Fr3aߴNmgv3d%7ۏXbQqW %=מ{nFl X2:Oght&biL; ;|vD)O`XCI04P'6Ѭ Yz4w|fngWEhnj/?g:y-Xhm GetۨQ1,%! j@ڢCݨ:|r4źw7~Xc_amlR~8-Rۧ>2,t~jÈ8$`q2M M 3:Cz Dي=xJqTQ( Ȣ`;X+\$JGF v87Қ窟f)K`vÝNEUXaؒq^am#sHH:%=řfY)Dfq!ʐA]DfOPB˛̈V^+F)3Lsn3.OHKI,RDDٔBD=8q^N?y~S_]YܷymPˠPB'xRL)ŴTQ`q̠H#0"Q8ɂe*8b `#xR!Gx8 Ķ-:Dm"bvign dzdO33tǑ ^rÜQ L!lf%iAY\Y^i'Nqf~:PSml b<۝_~=G_.}\WPfa͘Nӕ"XQ%ħ w0 RI_x~xW["O3_oc><1܌&"F4&TvG./8? <"zauN@(KJ@A K3Cw F@( YXQw*zZ |XJx hѯG? r}<l@5Q^)3 n| >}0oVt܃둍穁zdNjOyJ?S/3A B1"]Q<*f[Ƣc\gZ\6`X:f1crHxlM JcֻVmb2V{mF#$*8ʝ7\ 3f4j13#,RFXp4Έ}[q/gu+7;˒O \~MziB[ϳ jLﳕn-,tP++{I!e7x}^Ą\X"jiqg)ig_K(B|hЧ x$ )[X1(7L"^ˈmbhnD4v[~ߘ_}'&ӰYty| di35uVQ|#gs},d؝ɯ:өB墨\z4u%ܿ>廽JtC,%I8I^\<[>|x3(?Ί۫NwT:/phV>B%_}P3}Wj^ku}}Ң<-YC~pR$f辉.ʼKw[[l+?!^ ]NT)d=ݶ\_.;Nn8ʁ{Z1xҿ+ElY<0Ie*tI+fCʠOxʴ`2NZ~x4Ufbhtj "OjPԴ;q86q't}-:a[ sWC΅| J Қ|Z\ڒO+Qѩ*9"]>7OK E)jJkuRtNuUN]Au%MJ`ۃE]%j>uu+RWdϩ'g LB룫G,:zW]=ZEBy"zcɱ-RW@05*p[PK>uuĬSWoP]/(kJ+uղ- XJTݩ7BE =*E[UVSWWJ;uSEU"X֨DmQW@-A'w\rQW- z;ҼC4z; F5n3{5sJ`ԦDjy+KpkLDnb~&P6`$[\t[U0;o}JjDo`-qkԕ8&j:uu䝺zJi*mf5m{H"6Q+OLT-+ҪE nmJ8&j59uuBtը+30U0>18}`zI]=Jzb }zS&ilr5jL:\ dSWoA]}6܎IK/3? RA* c3sWe޿,ޫJ]NeSwovYOl3t 7] E0M F`RP-w)vڼ% 1._yy[fl\v_:٣eEJ +ZÛ&HG0a]ȻoZ-w2+^ћVfjS@RalXU0-Oh@y P$ *adPO㚟ӏC:WYI\_!0:ٻ8$+-)X`gз}"jBK K3ל(tʦCJUfar껟.5(Ԋ6$ 5S[}- l?n6}?= ȣ>xu~jww_ߘ_>, =HFGw=&ʶd9;CsN'MzRɦM5Z -{SAos8Zk;y^f4=sg`[v``? 
-dKD(cB]2&%_{ddK4a2kɾd!Vbhӱoȣ/Ҝ̭WCl5<ٲPR<"e0f5(Wҗ X̘\fh-d٫~ԨlbA~_hC>|C=}[8}-Ɠ&Qgk*0CRm`ؗ>g ŀc ߙ4GJ5 T!8>F]gIMz=o/J|&Tu=t`#:WTyKZ8Z)0<'16e7!XYUF3zieȰ@+iv̶>'3Clkqƶ4XlNka˖'F0bTJ*XhhM`"Q%֒]̼&XjDQ;=EKBFSGqօ#D@Ϧ"X'_/6ߚOdz,6!93M&| vcR<ȄH #k-ġ;T{|>]z};aYgJD+MlV/ML!rM+z3l1ٳ;<쀶 iaW.8;U AR::CV9e(a $cm JkkCK`o$l Kn03`{$7;Ƴ0~pѣ I"qሦntgF2XC*5T@K-qݻѡqTY\kK` Ŝ`' (WJ*-j@F.'0 84؄ WkYRpPT{B;}tU`Ro&x,4E'U~%ٕ=[Pm'fH574ZL e C-'P ]p,ʨLhD'0i"n4[SjUZ]A.ŚU kǤ< !.A D-i4(I0Θ \YxnL>O_ [2:l*1Phq ,s4eEp5(H,ΨRX6[g>@PS0yqWh"uɭ?q JHoGHyVt0WGȧF7KJޢ5ض( ݚ` dw!*Db"Ѡ ҈=>4Vcf#*, "/A7ޏbE]n1ќ<48jG4M\}?]@|/va\>St>S Uf&5,:f2X0_*XX芺`FR|&QL&zZ't^װ`TNRrg&D _u5}MANtm~9puqU% 95-QXwl@F(e.CIvy߯M,~c%6Lji& 'Xk tdDrhc.m,51X!/V90hT&%@_141TnV?y@R@"t2πEy8gW\!n[v#ڱ,|Pt ˨)+Xі20fQ(Zg%/t&X!9)k@92XtY$ fz!k5\ u"N77 Ҫ2,j)차Jf;rj-'|MπrtՑp4DF΂Zm'ڀJw[7W3\Ds,e&U$g  |v|OJ`F. q4PF̃<{n3hw7is8r`1ҍ';_yhb|9l`;{$w12,fGZk%ef+0Ţe0ڽAY^ˋ?,gз]UfY\69x7,JxKtÄK~A؍:z\3Lt.JtVcNvp l+ )aɵ\ѥ'69R|, qh VǷ''84`[r|pp#,9NIcZ 0pR sb+|K f2- &sp pEC!?}-.l}~sPiRvrȅ7 ynz*^Y36(# @Pļ`A;̀~1 8.dnVhJ7a#EvU0jzm( qL:ہ NeSj͖^j:ti ZmDЬ#6C|Y.X,vSVZ ]\<qHxT"搂13 R Ż^˹ۭ7{diW vuļK&ȩ0=雟~wwT #7[< E-~yyvI Q_7ړv}\&DFoyχ/^X/3D ymwwxho77p=0ߘ: uX}3A֜~P(ՠS DgkPG:Ѡu4A hPG:Ѡu4A hPG:Ѡu4A hPG:Ѡu4A hPG:Ѡu4A hPG:Ѡu4sA4 ?Ӡf:ФN>tubP'V:Ѡu4A hPG:Ѡu4A hPG:Ѡu4A hPG:Ѡu4A hPG:Ѡu4A hPG:Ѡu4A sNPpLP'^d!A'qPQ)jPG:Ѡu4A hPG:Ѡu4A hPG:Ѡu4A hPG:Ѡu4A hPG:Ѡu4A hPG:Ѡu4sA͆: Კum4:J:Ѡu4A hPG:Ѡu4A hPG:Ѡu4A hPG:Ѡu4A hPG:Ѡu4A hPG:Ѡu4A z;/|g~r7ǧ{g6F.h\:7l'Wɜ|(\S{J'9o:] J=E"rxEf7CWi+t%hɧef'HWroA7mȧNW2;'HWbJqCt䊄 ] \u%hٞ:] ]Ek+Fw ] \Е'JP&5O$zt)f p̕lɫ+ATS1nJ]eN;Jm72JWzzn$ qp\:m~$uuhOAWtSo6DW8nw݆ 9:]eVztBH'(ٹЕ}eւ+A+R6qCt% ] lBW>dXڏCIV E]]=vJͨ+Aөӕd5O;v;^*~v].{rSoZD!w96~(Tgo!˫]hwxWgמ=`>I>qz_۫y#dWcM5ʭN德;\G'Io_;g]lBxͿ]>??OtQ?p?L4;}QaK;yVM9lq_WوU^=dZ)ٯ:5:t]vL!Wyƃ32R㸟#V+fuyw92ZJxM't0s]bF&׳?4kwr~S7O㟯juݫK ,o\0 5Ĕ 0fZk̩3zi)2kLֻ-my3t1`>}hCJWO= 3+w6oqW.t\UePzxe. ?7KH>[8mHKٚ;{z!vҒos*! "9jOzH=lIs~$ë4\>\5*I{IJCcUXK1n \%i'&)yk+$FO4zxz??ܟQ_?o*&J%(@HMa?ށ}x"g4`[($T#"[ec?_ϳ;0SIúq6FGbX"3h"Z18^إOiђ9(АGehTĨeRi8U @Es^֥s\V? 
ge]5<)v}E;TPq:X3$Z}H_2fp3"/@&Ekt"0aB6"bI_b!;3+QCh*g=GLCUp>g䈋u?~W1Lr!?K5 8?翎WolVoe8%2,_EOfOyPHYᶛ',rVAO뢚'.K7/}n}q^7 =|MA+uVm![e׋ٳ6SS }YTVզwڶ1mŵ1@‡=* 1Lr\(wp)8'O=cF6y'j$~ўRgq#$|X9pڻ m*@-m z0~{V&aKiC?te_aЙBTO+Kt?*}wOOfEPo`Vwƭ>uJ\]sy>rdYf51/Y0߅?Ǯj]LJn wO8oR:Z?ǓCiv׈u>[L/ʪAɊnv՛j85 :CЩ3~}?uJײ yRKjWq8BEt4ps|<@B{&".3uz2bt%n>?Dҕ[bbVɧAKD-- E8-^B,Sqہ?vn,#n6jPRd3?m}}4ܕM7>i'׸~&K&bQ+~<av&cˇ\_ ΖA@է߳[ Vfw^nbѧʹ?YMZ;mM7%]^`Yl7i7XL<KI SDpCJ,s$#$+oi y<-?LVl^Nҹƽ}fU"U{7ZAlwNʠ?xSd&~=ほͫ8ZwJ+: ;깹ۛݎge`@y3,jgNg 406zvy%oֲfއo;РϿxCb7hnJ#SS%cB~Y慂+\Z> oM䔧cF1 )`i )&95+7U佨%m C!fH |sVk/;^s5>c+9o(:ǯEf[J0LmMN׍;n^Zi_=SK~2@F. .L0+Q 3i?H)EВ)Ej m<ӥ0yZ2a&{ᆞc0O5dWJ3g<v|fsWIS+R[0EE90;˜4mМZEZa͵+LW̉68[ƉVm&B[L4NJPMǛ)d B# +Д_B ȼ_ԛ8%rf:Wx k4|OGCJo=N;[B7zf!% 3Bs46uލi6X;ԧEeV "\H.: 1U*tUTHJHRzѹ>Z<} 09BB,ZXJL!0"}fB;AT4tsiuTHq6FGbRFze!a wyiØsgQ v43&ӂK9oP5djK~^AMȧo%gbN?3c6mpnw"*@7:X3$Zf99܎˭d@Ilo`i(nY䎅^QNY|>ngOyl\CK׸3]OGe&#QHYu2LwZ D佖豉hj4BZ"/޵y6>OoIKymy5CR5jxTs`.0 *- MqPZH&xIcGMT>}[wJ4-ըrqeMϓ.nI?{z-57 )MA %b bf\ZH'}Ԗx"(߈Y?눓gLSrm~j'W%.w@a~W^(PE# S5 j-$ fh/#St$LEt!"'U{f^ YXBg 18S̀5;$ r|-!\0Gi$hNlcoƴ ɀ(QHLd$&C:łKs@g H5ac[GeK˖vJKk(:;QU+FW/;yJSSLi%S1ۿ>J:~ n8B-_Nww"߿ \3>ƞ?wm$G % #}0wkw`CSbDZ -ECڀ%j9}LUtWW wiFx ~ǃ[sxZK~[;nܭl![ZW-BR- euPM^K͙9=o}d.T۞p!MoX+Le#I}YlXnyY*(&hޯ}}j-ꇟL|ം鋚-n a4m< 7#x`*s\ (zh>/ u<55 9(&௯˫'{3JӴĒ:!KC(AFmI˒+X#t>Asϑ%괱 ]LeN,Oy<}02wѭW{E"B(%2G V[ˆ\hnD C8a!5;HXG"\JlB Oc*5(K)µ`8zfl zٽj̥n[cPy"W6 F0bt:;Ѧ)X+Akvƾr}=0ՑT1{ңC_HkzSzda^}ps?zx:}7~Ο`FqMѬF͌^3J??~?t.];<7"-Y-Q~ehӣ C%y`~~Xn6 Q?u՜USnZ`f'E\zZu2! 
0@Ӻw[r 9FXW1[Se?Ƃd&c8cכ;PmӦ?Gf gexPQC{;_\Oߠ&u8 #EIU*̪!X1]bi dc&u:2eqnNݢ(nP:fNDF#WHjcrd@B&T+P0_(]FZZpcwF.H}[QG#v˿8;n+Nk0`\9EC?4N2Y)JɴB0o-H1"_ 1H iɴ1KZr] ^tl':BC=<9Ľ'.:U.7[keB]rγW_LA,XV-A,$ӫO#oNM.63hL5SMU*\<uFIBPXjd}ɘbbRCdF3`yyA:xPGC* 4‭iU@-m=3{ DPf/hlN8C ̓΅A[هYy?E{tSLd Z]A4S(AzP* S't#qڅY_&m 6̌ڒ-Jm|,jdk`j Xoä(ٷbnEhZS@gƌJa^iGjd>9HysQ bwFgM~K.uN3lr=sA4 dPt.UrKS3,|#bɢ'+]Chzw58\s,vԷ ZN G;W|-55m}' !Eqqr\'qqr\'qqї'89.N89.N89.N89.NI]89.N<w(M0qq)|1qqr\'qqr\'qqr\Ny{;4fk-9dK*ai)1H4fr yKދ/n[ ^ưyqIGocN`y&̰WP;7Z1ju|o8ꑛs;z8yޝzs8QW32|hh5$xdة@QLk!P1KT) f=.+o;;WAwGxқj-y8[O/إskmkNǥq`';s89^M+y--9dREKc PL@>3ZH:a{\y^04o] :'9ՈPl;_uayΗf<:l>f~P9 .1VYE 6: #2:ڀmT 5m!ӆ.5fYGo}vJ ߅j 12 l }oX'dHDTÈ8,%@8 &pI&^&zπKJH89&r6ȹQjEw9lVJHH!xU 8p i 3, sCΫkƝNЎyreXzl-/]}\*閮u1}~>j#İstdpKBCp3ېM7 27l, -]] \X.t#n*hSeAq#:Am =cxp[`B<UX-zg՘>ye4zl5ͭK6 Ftɇ1:#Oځq0 D3`HOA+xn!I Ç(E.(|8}]`FZHZk*8C滋:\WQbPb51|ɽ[u8ѾV-Y{;B}ʣ|&RJ%bdF/H3A\svm]5x7ͽÎ n/~v_ρz6*0d J]yI0R:"@N6l v ϲH"8(3 r6XGplHYDz=# #$$8XG!1z "wA9p0IX:ְ,5-kZִ}jړְ=(Ezޑ_, BpY2aV ;ٳ`gax GWTn鷔IҒ9kK%(-#eȠ0f-NyԽhb24Gֻbie;"E:JMED=8%n#7Χ+6*t;zZ74)(TBrmSL)ŴTQ`q̠H#"Q8ɂe*8b<O[pwE™ 6kBݙDPIMrF:WDv ŰnKnӢ$J2ȖXR?-8˲Ős骜Ǖ ;ɶѓ|&qLi|?W_a>:@fCTQr#(/fXPGEd1NM~7.z/Ȑھ.7⪠=^6ħuq6n9Dq5}T_9>r[I-ڇ ַÇ0VH %}w46Ĺ.]B[Ƣc\gBx ,-X2tie$<+0oa l [S@&EMZGK!zƌ0g>NLj9+j$>Qa=#3wzZ%a ?igbUʟ~5Ï;~2ޭUR \vi/Z\Ժj@Zj(VȾ"K͙Xo^Ԅw*y,Y4ALlM?A wS0$;QLѼ_'R#*ꇟL|&L_n1 FAoXx]QどM [@ٕ)uY u<55e'9(&ëW-_Ѡf\ b@/ 2v$#aqoy!qþ*j0.9ڣUˮo'@'Lkݥz=( ^+k)L@yM>jb w MX5u 3ʷ!-/  Ch^(̥lFg* RJ/mPARL2x2GVE{}ԓwP0vY4a#A2 wmI >&u ,0Hl /qHJlKfwuOUSuPJx->cΞ8{b&.ʬOEᵆ<{?b_zG3.[iF<VxstkZ_-@6YPP)˧ơb$ .29O"ˢk|&QN^vq7Lr9X/U8@ 3Ja6sxq I'Я٧׵ QBUIzo7:wNh}V6=7osc8FeYn`"{&CD{fhp%J7bi1g;Ja8ҊY@s0wE|2uא$eLQ bX3?uŹQg\!Y0`N0,, ru~`BB1;27vjcI[& v3uVlimC%=T^;&-t5,_6I(_./`Evcz}C-Lkv?W"zϬ*iUiaJmEigo'Uae4a*!X{~YuvfhdTU\g3,57ޖNZ8}ry/E',xl׵u%, 7 u7wG7vSDE+Gb#Q'|l:0h*Aa3;eUwLIOyTAX^|N8sގ?;e"NeTS&j=씉JsvWr\U',OF]%r:uUR謮^b\)DNH]%>uȥTUv߉wWW%*W@ǯRJ䪓)UǮ$Q] U}ZѪ9'dUvߛ*Qɲ1Օ }B &p7ЖJbz**QK^]%*SeujԕS2`NLr**Q+رDe[*ՕƔ^XKĔ?O>|sy1\e_[0w) u,QgHͬƕ,xC$: 
jQ&H'V.Di#"Vq*Mh%rZxb}`p;]|nWx5~"WzOxcT0}u5߀nxD^_~xތ |KQZOX;%0<_hÝ+ rGc NȂ wǒ+Xsƽ5sZçҷԂ"Cֻezάxќkt+jHmT=V~-D$^ &P_0>c_|v]|8>`1%:cDIOQ7S֏Q&X鸘LvAw_/h6f;^79G%h$Fm+h6Q 9C&zpg0 }ǠSN?'u׽v}ӻӿ~Oo _}[u+0.jfu2Uz(۾r]]%Ūٙ:sv$G}W`7 SziX* !`@~CG}CxTC ۜu[uSno t\wKC< 4 OsռF:Te̖T1PReI5qk b, *0wHsNj}s 3J6pTAFl"pF DJjz=`L@XZ$}٘>5aQcx>EpAul"4BRK]Q;Cj)YcطqzHkAs.O @i\EUi6_*"|$D$DKNIV` ItJ袞PnuJTL?i/` ߮ZmEcؑZ2$7,jss#2۵RmY) jW2Bs qU07ԓ./99,vb2*D?o:lȖ $l˖]RW~Z|~Ӷ`dž2[($T<( K)FDOAsvS'DUfSl$Yˤ1r'p68+Qliپ9}#)t2'r9?@ZE=-P)9i^cZ |cne6 ހϱ҂ -T4J1 X V RgJdT$<9-5_hͶXq0wI.T۷{߮ϣịUh4BkM̨8( F刊`) NP2mJe%jK`"N_}Q1؁RUܺ^ R:`IbDRh$9 &bqI&-lIKXg5>BcB)7j<[֑aYqGkZ ؊Έv{1AP[>*ҁsk[CXkUm ^B`c~f< :uuz׎ucB҃Η;GeC_|;4aG6V3҈P x$ )[X1c)q^ˈiDk45[!-]KoTTvO.(f/] ZU'$%"jxT~8!,ȍ$49NkR"ZvRn".s]QnôaԻ?DCS -hhאn__9xr\m)2 F5k7x^?|~*jnFbZ!8>jK ^Ȍ ^`,9L_VkKR6K!( 䝣|TV/Aaz^w4:`r6hd@aD%$@l %^FHHL;$rR.$,r! lL1h L> 'H kC*t{iZzp e, 10BB23uS/IN BJ"sg H!c GYҲeIۧA(Q|ChnJUԣ?>nb!=;[sPs)lKq2v$zW $ZJ -T/(CfM9$*MCSeEKwL[TT3!V^+F)3LsxǸ@(,RDDلDSS =ނ1LΣC@KzLzhŌr]t]{^eY4*Q1(g @ԔbZ*(08f0jK(d21\D؊Uux =^Oк&ԝ]Dyjmuo,KmT(R`kd k,)P-P˲{21?wOZ杜ۘI,\{JT SwuqfU{᧻/u p1K>Y6,rvӫ/Lh_U#yW*m#,͡8.Æ[ϗv64425hL a u 0}T][Ƣc\Z\#y ,X0taE$<]+8rٜ;c ldEN`IMVQR1q)1gz4‚Sgع#lßM*.Eg(!0ɿg逭Xx޲ǔN'U6_`>CA=.{΀!QK#.p4`,BN ,8.x)'Aϒ!'j;=;34V[s˞]q5 t:zFg@ul/.u9Ժ^=,gi;GkWKun-gwZyޗb-?LOc9o[P/g4Kg:1X~tsmfmշrOQ-YBPLe aMu~+I. S U MPḧ́u]lN6\,R.v +^.x~WOŝF~_lZʄ2a^&>jb w MX5u 3ʷǫ5ɉsvMu602a$( ` ,s_(hLၱXhT?#IB/;1ieF)RMR-IKZU_d VV$$[4gװ2giSOPlJa^iGjdrVk/;^sXeY3f3GW/~Ţ6!4]L_NvL]^ W~0Oy7;?j< nC/?~R;ALQ䢤L @rV. LC:7P"7Z KYpo.B |3Y RPꆱCR%hU0KwA FAa1Q\׻/<_g]Hb>ݎ7HMڭҹ8u3nV ;lz6ifO1fNersr%T8 y,h/%%D )XZYh64 k{e٥Y~s[hq5hhA͑@ tXӟc18ry:rY,+$3 f e%Fn,XY(,,/xyqaamW&n"UKjӰc78l= M$F/7v;%-5R-_F*a^0V>cx}C sO' p? 
w3d--]e #^;B::kGQǘQy?v{M\o`RGTPJm6>IcaLfwUm:$9+GHSfnwpfmFZBZ ZmM97=$Y<Ȧhq^hJ}?B 9Gz|)+ѹhre1NEĖPƽuH ÌМ,CCm 6ų|z#Ƕud\AB *;,XadAr1W%zD2}Y// "QsRW`y>I[Ю DVJDuEd̑<uȥgsL'*%`[3RW@0lU"W1PI0)=Qio MHgze#z+h ZO\Y릺TTcV(nq]Ѿڥccީhj^_`~k9s|3*>w'CCv6ߴ_:i0XSmՠzHǏ}10 .f`rH8+i5ymݥ'3n34۬V@m,3Z@PfBe:b"O~dJДQqF&v"X1\Nع؉Z&NNT M 4ͩD&r٨DZduuJjmZt> ȥj:D!4R3RW٨+ QW@-!'TRDu[i+ϿhbʿVZ=I~VZC5LXcAu:CW_\H'kb8Ai-a1*] .zEpE8G% a<߿B%uWa|%[&xcX0}{;@|niwo3#KIsrkګj\{5^5$s\{5M\{5^%\ګj\{5^͵Ws\{5^͵Ws\{5^͵Ws\{5^͵Ws\{5^'jk jk#^AK9[U20fBeNF]y{Ԅ:bRc6(SX8S{͝b aBo=h$ koPs[|DmRsZfH$[B*}k^E0UN*rBͤ*j;RP)̀SȤ_IILJ;LbOo;ad~e0[Gfl#.وk;83q~͆:P0e8*(vg8<ߔsLcO5Hs /SRɸ*ܚM7Kw`hX)Q; ZDl ha[0 ^L6rgvmtw?fs1{}hOqȰL 4;Kmq>'*<2= ŅIYyx6CQ($T#"ᯠPX˯yol οT4W`&}p|8z#w#1)J*̰WАa̹3EsYG&g38HA+ھM.l_շ _v` $hgawhCl:#pS2R{ͣ24*R ˘M4T)cK `{SZ79\TE+. (vaS͑S7w1Yuo]WM<G\0GhхT1`llwٿR \Q K37`{ MWV{9q\5 ќat;زMAi/ߧịUh4Bk FFm6*GT KIwiBV*,QIԠŚ ٽ ?./J+s#YZڼ{a ^ T{O Xg$,INC"0X\$!dK2[!*,D9&r6ȹMiLtdXDcZ)"V"D46j`gDܚA"wGg`9-š ^^q:fg3&"ȕBKD|b$N=" ǘ"݇.ܾ~Waej|ӷoo-ܡ 淲0\ǂuDM^w)JyfY9m OA;X>rKJSt J#`6}c6d&#QHYu2LwZ D佖豉hj4BZ";z^;g]y}Ų1q0 D3` HOA+xɁ! 
bB <* s]`F{+%Ra,=(s9%wU3H~;* <)ᓅd,hU*6/V ( Leժd o}MQi*|Ɗʲ; kx['7lx@Bsm&9'#7,mD^#F0ШfJF#XXP=w飶ěUDDPRMk٬ף8㭺P5 Uօ{/<-;zHAB5:CgR;=AaSs<{؍*PL3 JFIK`!#a:0 "rR5a-m{DMHX89030  D9(.#ȁiIe,cHƴ Z(QHLd$&A:łKs H:քdI˒%mv:{]6ZUgqp1(QSr+"{{$u0O7ʡkg/h $ Y0Ba .Kb^0{:eҷ?_Dկ`Ymo~<[{V؛ˉ(JIL+aՖ0%QB(Ő5NXe$wH3F kP (WKM)0pZFe)A@Gg  5FΖ-S>?t^3cDI`]f\[2r i~fZ'>;}6k$K:muim{ ;#R XekAx.0LK uN=@NwD @TDfEwZI%HD0 AP$͙NcP)es:3w%º]H m8 M$JQD1ʃia^ K%б ,~iAD&90KCX -2]qd4*N qxf-7 |;#nhɎKkD?ܲZ$5ҎH;Z2t+0IZ=0)FIƅ?H{0ֿW/\&wʽK~\&/90""[kɘL_Ůc mNA\ߣT6aub|o[  ΊqkBr VK1Sr3 H}.7h{ωBoX62^x*wnֽ)x`d?N>WjfLƗ )$RkU+xFVܦ*9Z }nU6#|,^·%10DL3wziVcKll6z{5vC#CMUMob|^C_#ik~`(>Oz8{g3Z׹ʷNֽY_!Ld$Oңb+kW/<(x:vuoן>}~>>Oޕ$"egW;ˀntocze5MIʶ<)Iţz,YQYq| ZDh=]fi\O\r/Cos$,<ܖϧ u60>0 >iko|5˛Y 1@[/{ZLk5ovka%M>u{W&~_I  ш!%YcH>*8]3u̖T1L HVYNg`(/?%p "<[/2so_Lu;u4uɋ|JZEV5Fǃ1&K I\ꜺS4aQK!(5{&,Y5IYYDxMQ4$@GjB5;jj'wO<'')k0'i1߳{UVVS9s6js ['ފKzRσb fͺ)UD]b%%0PnBm#oԳG9 ;OJD]%Pxg\*]+-stmeqWA;7GkuwڒfݗK9Z['Sj2K??noOk&;^ޫ>Lt|YgװVWQ*u$d7᣼i)Bf-<1ɜsh78 x[>dǥtN5Hmlsh4khIrf (?*7F+“bݶʎ9jӞ7_3fJ-U@ՕW !]]~}mW?}758~g7rԓm |^_?yyÉs^BSVAyk}zgz5AAN&MɆ B3 BVt;x)Sy'i퀞zk~^R9g[L+Ze\Pqg3>ud?2CYGϟGۧwL:#M綯f9!,p䬦n[G3/j'K#Cnv_?:g5NÔ#KO%{Κǟ 7fg1ެIZAwuY΅9 TUAY[jeKv~6T(U ZiY8`>FɁXҕnIkxE*Z^Eig`F#2lo4FFHp9.I(xYGg&.DS_\; l: V\4`4ZC"(/(]O\%D:zfV `#xQe-xE ^؀;LNLjջqa.qT@ (e,kY g+HŸuӝ?gOڛē Aznl $ TR$2A3⩣FksN?"<.TuF;_9vE{֮-YPW5FmݑKF+q3|@y@#ʰE}de r{Qo ~V:| ӝR/ppJ_C;?B&0CFQɗ墤L廠K9+ 9s$%~7o*J2d)Muh26Ih5t֏D?<_73[-R7Tn\~^*WUVfrn0{QOh0*wxjOۇ]%T{W8v"N+ƫhNh[`T_ٯ3JlB)֨L.@[UVKWWJi:u 0.hBUU&=*S+.]e*+TWI E*, frekLR2Zu+ _J|u AWܓ:+(tudSWP])-RW`u5- Ք]T]Fu9([`MUkU&L*S٩W -D_=)^on)kWVF“"7p,@TXaR{˒3WKﯝ-7^(xH!sU(8Jރ&^ƨ=")r@9Bޓ<(y ? |X<_|Wֆ`~Y4ld&Fn|s|ͣ~KxsX|Kln!pyMaMJbY=R@I[bg<+n-ޥ1$l4" Q@Qv22` I5 0ecYQܯ-r>z%ZH-6Z2nPej_:PT.U5AOSZPt_q3I d,qeEHZD.j[8 9a2*NYԚui} KjD  4qo$с>ip#N.cC'e,WvW2h s݀j0mM'/< _>3/]&y2h2Y~-ёdto^%NBakxVteœRiɸгg{.Ic4\KftNm5:2 hR՜۔5 A$Q6庭@ N2/Sׁr |U_nO7EYo?|r.F꼵'i}on2^/Ej35hrw5?u.t4^"w9Zڝkb{-f5`Zoi6n{.}6mMouDYetrsrC$.}b"F 5A)gRXRrT@ dZ鞃uM10GMnlhm)][ly|o+ oe\ f'5Ϻz{fBd/z5av)EԓLjӰv8k? 
1v496PcՑj6[ J%Dol;* ݖ` dw!*Db"Ѡ ҈=>4Vcf#*2)D_)^_^E{7uirlRFs| (Q4vXN2_0v0`wOۏWrOwO^NLzêʏg]`)0Q GA*}vrW$FHVBxZ*3d`ayR@ 8X0_)XX芺`FR|&QL&zZ't^װ`TN9ږ~πyILPc $ku <oĝ 1|K`Jrj@K̀?oɍ,oz2\ldoF#AtTq.CMv}_}oO ӺKm1f,MXuOnBȈ ǡA]Xj`}Qr}\PGY _I aʠvݴKpm9GKշbhǎ@HA-D/;PjbPyS 1cuD8+1<>3M^nFx,ClJ35ȍZ. py:Z'xw؛GD]/piSRXQvXY%3 Zصh{j M/rtӑp4DF΂Zm'ڀJw[7W3\D(Xy].mjw$a7>[` p_S+㓷%CGgim%`%ACx;%@75ǴWo./q&=P`0u Ɠ/<4z1Q6EQ/= ~v12,fWZg%ef+0Ţe0ڽAYo F ʌ<&E y8r{ɵ5BG_s~E.`,wЩz`m!%L +4作69R|,m qh V77'84`[r|pp#5r-ӛ)6TyǴaxD 5x 30xד_Alz l5(? ޢo{*>]NnߜXwfvP_w[{wF?/:|ͻwvɧ!)s[ im7m~#VaM?th)#ˡ4?`.ߎOWوu5~Jɂ%y+_|ð4L>T$8J[簌\n-hV[lAII-Zhb]ptqסUJоKNNCΑL!+/CW-C @JWgHWٲsi!`o2t%pu|DWwMPF|?pGcS/8p6һN_x7Ih!3xʽm芕;TWe ]ɹ6BWJPM])]= ]9 ѕ ] V+'xt%(s+`++V+'ѕw(]]M '.BWJPFtutŲ,DW٘ ] \W+AK7U]}[ѕ ] NW2U9vt%.p2fP?lV9U"GKmd`a ] NW2f3Ll2l:t%p2D] JwUxԇ EiKzx$־ЮP= 'UPzԣ+Δ+*t%h_uhۉ JWBW]+Gc+k*t%hNWro'2(]= ]yf q!`6vwJWgHW)]puJ2t%hJPBJWBWrN|/O󷏄CX'Jz{lDw=fE+NdNKYf&]IOv! Xl*[w lZ3b)y!Ĵ ] ܸ (U3xtEgz74W`/#딁"r`t϶H{zʒee2mZDMQKVSabpE]=BZpEj5qJ9Sĕv}_+W$X-BZ>$ :A\uUqi[ HWֱ Ur&\ gmEmվJ^ H=H%W'aWܢ`B5 VT p ,z8gLw[ sɅ#/J훯 WTW0ТUEB\A5" kWRW'+5[[ HsJpdԃ:E\I*VW+W$WZpEjAWkַ4W+r^wLփ+WVHw\JiՀĕ)SHJ X5Jw\J=<:I\Ia|Ζiw]YՅ^ŀ*4t҈aWiUk0\*4J){V bXEw FhdE!6 6FVb\fC]ClR zO0Ķ^H;,H&ԺާJ`Cz) $ V+|+]Z Ra I{0q`#DrWV~C0YIkfJGU+G8J\WnաE7*-/(U\Zm+Ri` J(dM wxGsW"\Z }z J*T+|vr-Ԃ+1w\J>SĕqVH0jp`Lւ+R+\qE*6iջJkřW$XjpErvj;H%W'+5 WjpEjUJ\ opS\`ɵ\Z{T!qF[cW(Wjpj3}4C"IYf{zc+`DGV4óoW};gVcV6G}~kYmq wZeIվZ;ꐢ־=w\\k UW+YYVUe5\ڷ^Bˣ:\I ` 傮&D( vTgNW YpEW$\7z\J1SĕoecɄW bɑXWxb~b3$emJ9gm˔5Z^ذJK9-Yg%[:>bZ!>&R! )k \Ԃ+R+uqE*rB֔ ]\Yx+5"[9wNW1joEG~E[l6WR2;߳9cZ㻫vSZZqv* WC3`* RW+\Z} :A\ .$P\\j5Tn=pu:ܨp'$k5WRW'+|-P\\U" +%cz)LJEYQή' OTQF[r-ڨ5+OI Xjko_שNn$~}=}lO7su# '6̱/6MlߚmN,5/T?bu'k总?ó_y~o o?}Ѱfetע7`cqy:j꬗g{"m~FɊ#O5Lzzp]SkSsV,oW 9.10GE|{1ף[uFXhÿ/2<282Ud$Lr:pz1?F *@<1VдڋK6X67ɓUޚg7tG,҉>_ed7Gs>c?9=5䆂ۿL~U63c:<.~겶sſߪWSeKm[5Wrv^<,Vv]z;ͧt6W\Eh[>Eߎc1CB?'w/,7om܇ V~~XMrf^ۇ;Ztxij~Kgr9]uO&d9>p5q9_=zt^>Ϩt\^/./. 
*y75r-'|9ycuMgWخ߯x5gxew-x͕oLS~2W1 +ient!h?lMg:U"gHZc!5I2KzW<1 vLȝW$Ꭻ+}}OWG;@S~XZaOG3l2%!CJe-QN;ȬhKI\ ɂ4?Wzx=/q]䐝)<w!)t sNF^r9̧L֊O-LfG^j~oW:gkGtM[b|>=I ؅/ӇBwy~4ݺ<;G܉sn sjA3Y;H=[D{5gN8?oS%v:؇fe^MWW:wYo=(Nw#宮C2ڕs( 4ݫ{=rEU" ~H]kǸʳEtF@xYsBYI+cԽڌh0]) #C>Ǖ\˒/;a:vyٽ6YdzP(1xÁ Ui\eat}RD&JU ΰeM6 s0@wPb L[@q1#7$Q.Ǭ6APe;ny揔1-QO(ή}|@ʻԓ5쩱' oS>]72;O;^kؠwvܱN.LC}‘(.-U=}T6X 3N;0jε` :50%K6GpitJFT0,T$(l{T\(os2P줕/ؙ؝W錅ՌCY) 7ږ7/F(vea'ƾ3C䅇"D5Hؘ3P uB(*` Q1e%K)%Ȣ6P`2!aSb$F (:l5)c{Y@4hQTtblyMP?Ǧq 3- #Oiﱑ ggt2*KVT k RIa|rqmR p4ValcLVB )DPdíD+1Joj2?_e,\\WuvVK"|j|Ių,8ţcXřRxxE&Jv! \| .ՎCy(:8o _#_|X0x4&q$!M6‰_@iq}߭7&WTGZu? &%  UyHG`R:VO'}I.Y'IU$*}sݩ]?N>(OH^lf2"L2VPIiSJJ0YQs˾X]F*sD!e ]BXJTgɝ#O ڗMfQ=ݰTc(]#ooM ꎓ~)P;Luf?ܑ!@@"14TPQBt2sZja?}NYN1H9`*Y]࡝]X뱑KVhbT))kiC,Ja2W=+)sSO&t;Y )c+26wEs,:&MhP l%Nv:g;k?fT**~mi5y:3\^:밻A(7&R U"&@M[wރGHS硰d@G%Xk|A+:U ҡ;jufrҧꯇ8D୰Hc8Z–Dg.KRYq AJMB;YHK]w&g7mhڧGdg/v˃#tڍ%g"B 1(AcSb0RZX  J cSJj^hsɷ:n͂ށƝ~9߿Z5ۢ(QYVE[,pUtt`1=6l`/^k EӼ(n&~5֣\T +t.yfL)ĦV%$gR9`z2 h ١/LP߫k<\ݙ*ppUFA[, !yjn`J> ۆt3?tV+$G4\ecIb_NH}Cl$ Np6їjGԐ!)2pVWq(JJ<6,ۜfe"A Wd$c 8~?m~Lʽ~6h@o_sKCpU1 L L2T˖37 H0$RmANq 0_m__u0R\X5qۚljxߛ傂2Vw'3ɗ 5HQ;H('dDN41ÉLnO&ST'0ElQQ <:ۛ)[~P]1,>Yeu+W1y"ĞpSYN/7r"G Gjy&e;@%`r*Uo'H/ob _^q#?,T5s~[S]7+kLɏՃﮦoV;es>4蟝ƖخHp\Mko] Bj&!tT kFjV';\ap(|Xn=YٿOwuZ]뼓M6UR$ߓK>? !Jl >'7]P(3zPsϙ~9{ 9No|w?~oû]8S9aSr4+ W_>~WCe*st/,d+c/ο=]Oz00\ @n Se瘟4mUͫF_^GҶzجWQXq@m>z\ẓ#$q I49He9,g@ h4ܨ*SǓ0W.Gxdnt8rWPsI; i:z9ڌ40;3g;i\\_[1;/Mgag \wuG#! 
bfW^^PhaVJ`B:om4(V'h`慰e vٲai5s;,=-ȕŔarjh˴ @i4J v (F u`yj(njeI>B-L%ʬb)^΀HM$*6ؖQdkٓC<~=:9.എQQf;/% mG7O6PgGDpĦ89O3bM>921hRV1"qrH$҈Zar5GN&$gJ:kSA %' EZs_ U_jT s^+204X^o>L^=Dž],Oݏt1Rֹ֗rzoo$5_L ϙ+%Lpo[CNE@Q%N L ذ$)"AQ\44 aT 9Â2 J ]LZuro }pY݄]|OGʨ1 ."Nl'Ȼ\D^ϵQ+JQ(uo;FoaNiݛ*@V<FVTgM]p{D">8GòɿSRQ:z!> {*ʛ#%,xuE(9%5#бǎ= {GQG.*Iƥtf>rR8.0B#U^r!s nbrle<ȭH^'m>Rd@=Ƙ04Ӈ֚s?D ]icqk4Y=*r>ާc?~ⴞb|&ɤU/6>;ÖBs^}aQ,(.#]"x#s~T[Y ',F&\0QŌ&% 'Ό u;j8QCS_]i|^Kb +>H% 3\jM,kM<2,xi9i4:x[#=Ӓ!xLD!FZ*!/7|gmjW`nΎ{3h`>%tnkg&Ag>f}p; 0\E%}TWg@/D B,^АL ٫*R!I|z CC]~ȼPi3NLz7y窱6z:|ƉE:.]P:@ۥ_^P \VyY ۘ3L "h%TInѣ͟E9) :"d7θ-pi Vןæn󒏊yOz`Uٚ^b]g?Oyl1Wlerm";| ifE^ BESjO-hrBa1g): y*9"Uru"L+ZIPN!B#8)Pd#'F8TLdcժO8%3xC-@,R/%Չ".yg+jZstFcƽo- G~Տ 'bq˛NpYޛx6{z_p4d9ia*P1r^ 6tKs ,]?.fw,']߫ehV'l_#yY3=q$bi}<2-P-ͻH8+ޑ\i15ݟ7ne<\L;TMW[͟=7y0Z8գG=j_;02Y /w\\`G1q8z}=9ŏ`׺G;nȱdZϥpXdp.A!ژP2xLy&erít^]A!6vM§ϠSv).@Q0(k7?(Uc )ϦmD]ATXgL! Dϲ-uSӳ'󧯢˘pv! F=#y)X5D%K-%΃>ՎHgX$YTQsG N#D#乼7)N1:j%";ӊp݅Ag{ 8NYsU)&yC8q>Q…u 9'JX5Y%,Oa[ uZ4}G! P68#[Km(BwE:V+WZY:hg;@9.8l^'n$c{E]ԝL"ӺK~tPK|LN!V)Dw8y0b~K\ܱיoA?:;{9_%W;m 7Om5 b){m7$\1鰒)}{7Fjn%z_׽4^r'aJiL%'fTf {x3~E46BM CDI޷'A ơ;| kbag? ˥:: i+gbkkzdsq氇;$V4VJmRKPJiu6 {ԡoV6I\.ŏWM.W?njIkkAq&-s4gKm4cw/ ͍ǻpQ SQ251o6{}nkM!{ſ,qo萱{WX\885\U[/r t>Fu sTGVlbYu! o-1o4#f:=p{˜xi״Bs#ڰft֚Zkꑛv{Y vQIqAx5c^{nV4 Rt9R^U ,uVV#KsH2h D:< &J"EL&OM Ric;|Y " %oK R:"@dH!yguOdq=dgGUg> ?\O)jOuz`i;:|X?N]/ی+ .0ʋ7u 좾]^ԃՂ>r - =2Q2 _sV'42X?.-$Z5l+̮_//g6ǪJʴud@VY!4߷՞ }521w\g37e=frfXS !7Gy ؏尕f(|'nFVgg|lbc*md=B @hvdc*=ty0ar2ۢӯN=vjԲ)iO˚\ G8*V1`"W'x-NhtxQl]6 o@͢g`~YǵvXo#׫z8s6%^>99]9bUJ$Pkڇ\wkws?/dڟ4C٤nAѯTjp6--` 8{v'JO;Nsx凇^x~}6 6; >4oj&Sn U rm+0/u+Bŧ9S$M %@pXrΒ %z@l$rGu r̆QdU _5/6 \ {V61{wv~S)wT[&u]G}co_",Uyzk풲 BLo_qt>\lKy3:9fysuц2X'yv[DgGxr*kw?88=x5^>d;PՖPVn-Q! 
[+0nqo{O-F3|sBGs?noH3]~kR&@IV $+hʬlj.$H& o@ɫͳx:y{.gWH/NrՏۻ4.JmmIN$I(+вƋժ.[Fڨ`)%f/х2"ܦ>\]-4/KtO1v2u6E#;ӝ ia Mi5 UAd`t.* U*>t+u+2Nfr֢kQ:2]yRcԚJ;߬f+LVg )[J-FI;%H5Xݔr i"FgR`!C TQNI8.;„GUqsɳ57.Z(-X7"P-A4Z9)!)x{ :U2)!EiJF);IM 5ZJ!a##+$V 3_' ]FPK|S+\&D5#(xKd2 Yk[]F/(5I:'͋t0$Mʛ J'[jڗHR`榤1KeXgM!(Xޞ9TdeCގ0pP|*'@nj ָh)4".(Xd@< `-̡]ßuqLOH$w1WˮpWaA&Im֒p, Մ򘱘d40B̆j9uF%S0Ue]`xxXa=Ҵ> Ki9 *@ъ(X_wIAkAt&ƌ"(4`klBWJFeFŀvPh\;S;U;:@=-5 !pY *a q2M!,@C9@B!d/HLsWɌ<5[s沨tXY7C7a+?%1+!S'@gSO` bXr@f̤c@Z*)@Sѝ Dpi7j,X \I{Hl NJE Bꕶr{FC^vOJA/+Y`3/אzU $dA(DpP(46 j"| $ü aUG+B>84iȀgm| h뗃u+k7ᗢ+fD$' nj1!D bh$%bB6!>dUa,2_+)Xu_\p v01Ս5\$M Q 3  `::pĥ8 (}t,*&{@:JuhTUF@bfS@yCs ~XA̤ƂBgs$2d"̫ k2a8qcMxH}辬 kIt;q2pzX:?)@JTs F*(8`ȀU2p)qcU0~ v?>nQW}0$Udhxe1x6F4= RobRnD%%0Z5%!;@r0]Zx dɗ ;fa3ڄ`1btJ߲].7؊D4ciҀ'WW(3"O< U 2P(Ŵ1"&WcD2;L ;&Iנ-й'`҅Ȋ8FQ۪wZfnfǚ4*EiH;^&&#U0> W%.duN^y~5JS!*^_^Ǘxk L F}pk 7VP8+E8@.6 %=ʤx_H 0#"Q ljrR9=6&TR&XKCd1ˁ2!;ff5$ o#Ɉ!:.P,=uR@J-0 Y輭\];S9`j?ʯzA/Y|yyUSSjn$%&4(OE>U'4x8Y>R?x*qo5&uur{ |9 Ns?xWӮfS\S{ғ/aHOr!!O߮}˦?YWVܽMB].nw.os "ĵw)Jjq~"X|r+/& J`qܵ#Ɠm/ɐVT3*./o, ~.ڠ{q2^\P\Z]\wu\wu\wu\wu\wu\wu\wu\wu\wu\wu\wu\wu\wu\wu\wu\wu\wu\wu\wu\wu\wu\wu\w_Ws-t~rppƒCG)/;o&PX+mͣeW79&H}SWX8-ߡ9oQ=IVG/n^Oe:5EF`.0AEʷgKZRȿ &2P5Sp07*x5dg2<0@+>R=ܩӚ[/]|Z| ޼zq+?C CdmkV{4z~/ e׽\uzo~Mf߃(+҅R9`ͣSnYkɁ{"ڽJ=%FwRƾ brUJ ?:C+\<{:*^Е!A\\\\\\\\\\\\\\\\\\\\\\\r݀ɕ ,X:c+!7,RE}-U3yO18H[||kjBvH:!;pHhȡupMxKp_#PuT[>Ů҄y\[13a{_,ј3#q7o}]sy:s's_}|Vj,|\+z(EV5TW_0wlJU tL:Ϧ80<̜=0'f^?b.3,"K[%k9G^&tٻ6$Wy]`K00gv7a0򴸦HH}#$ISId_RVV2"Ȉ/tt1~5Z]U*pP`Ycv+">ODלA3@1jGS kTw; A0Iek8#w_]yP#[W1epGFB)LaX4Ѭ2&HzSks_v1z3ō5ѹ4-+ =LHϕlB$/v&lˡTSbhga0KI+H;+ҴVJ;[ig+lvVJ;[ig+lvVJ;[ig+lvVJ;[ig+lvVJ;[ig+lvVJ;[ig+l~igx8 a*(ω@! 
6]Rіhy%YOhwAE˔"Yk4 O%o=Ck ~ Y f#wA!lX!+KW"d}zOkMDĿ[JΈ6cVM@`} k}rIQЕ&DD>Ӽ,dO ?5$4b;M !y^xVFE5!8TiEkSc9[ *Lx<.e}tnFD]TVaTe=pof:$Xۑd%Nx;`i1Gi0+?(C L1ֱb$} &ۢΔuK2OojcoW.__zjX^ic./12D?=NxQCj2q<`8$vvEcĞptGϦ30gidŵF $I`Kۧ)ϰ JQpn0NypZCX%Xٿ,D^.7?j8isa:gᠻ'^ wk(Jowiq`~;m ?.g')S\Oq _זnH@ƿZ nZQE9' I]aݏѼ11QOƣ˅9vOedS/wuuZHK:7S_yszF ʌ1}>p<<ë ?7~|ן?w?e|?gXY 1u]fi o/7~ǿa^2h ?a|ڍ#/r^>b?za~=|YN|GL/?ZJS壹,֋?Җ|tZzn 1!]t -אJc$A0h6IgsXF'UJG##6lX`4 hc,\8,K{,='Opny`,zĒAgC4D[M3VB!48Ns^oӝO{陋w $ϔC/&l!UbH|afT%H|X{ƊgNn7+&F RB(ه$6 y#s} هGP x@ S1;M~ h+q~o'l%3R&xs1geϘ,.I r,B ,)jobWtzM`εwK ?}Kf-GB> |vPnqYm\HY>54CL7pL>w Iŧef}H6Pg(EMOD`Gsh.YYhe7 -Ŧ_Wk+^o@Q;60%N=޸iIv ،]tO_>װu,mX0ڞ(vkڀ6 hjڀ6 hjڀ6 hjڀ6 hjڀ6 hjڀ6 hjڀ6 hjڀ6 hjڀ6 hjڀ6 hjڀ6 hjڀi@0^ٽ,)b#vQ\Hx;A[tKUo6_*SЮؗ\((-+Ykcԏ*E?Z>'.Nj4G1&/3l󙫔hMQAjR>M,R 6zrۯNud}ve8?KD7_>u}"<8&rG+[\lsL4@RFo*o]oAwg|6-?BG1-OW]l[J!4E7.\ЈϣKqs%<CFLoLv!,S=.u\;$F3**LKg_PQ%dב x]8GxDQ纺.&ۗml[;]ظʃї $UhB(,iTX„ ..KA̓(+Ŧ_o@'Y,bh`)Zr_Q=g,U,x}FXl,=R‚Z7y䩱0X,唪XǞxG#ev\a.sɌ 4bibN$RĞŦ{nX>GC=K q=5.>Uz~|TL+.UË\>ո8G=nԄ񰙓Ma/.fPgPH0>7N0mE'̢g6p`AEVlP&21$/)1mKKCQY@ ,6,vo Y??fz+ T$d2*(HCfH[d.頲ӑ),6!e3aȢZc~EMչX Ʈ{Xn199dGw}Ts۰:{__29k:\\3R;2^)]ZQK⬈:^C=3hL!aswȕƕSm ⼂t[rX-,&BSj Oj W3w 2ٞ_lOlW-;~Ebş`t[lh>sapB!BG*Pz0@A b$'.u,:{.2½gh<hfc36 f#e$+}Qj:-ㆋ]L:ڶնjWK%8a2L^Ԁs#wL2$%ɔ{Iـha{3 !2x`kb}c8Y"1IiŦN c_,b18" R".ƫZ.G:JάX%bYpU 2G?8II&<)m% Zb9"W.Nr)YLJ].V,fGO. -9s6 TGDX?#9Y]<]=,&CV Lت%o7"_rMF1/ٟW)Q`'nWڀɜ`9D#7"xmT )).$i{2}E!xN!3=WwkElЅLNbJ|m fR{뒮1ˋfZ:j1k9B`:eEn8~8nh y?АI`ثITFb/m2.Md*z .a %Xͽ}'fnR qx5Nz.dZjsz{%0ҚĦe.ZF!XEL9S)2~*ɭ5 ź@QWNr-Ȳ:vUZy q"37ʏ'_? 
>A~>0|wtpvJh=(GZ ^5\XyTe{WusgN0>g}yai2 J7besQك˶BgzOLهq ՞fb#D a7TǨL)IQ1\ G |,(9 AJ'#* Y1&UJ3()כJڬvL -=C/vÇ.ΩM'C+@h83=i6riFcnnws<3Y-;Wpy⽐ÇݵF^L;#ol=ni·Aws1}<[X/o =Oy.q{;RܪoLՌx݀}= +Ʈ1XZK^&3k~'Ov:%g^FiOjN8x|'X'˵.q bՅ,u݅/%R/!P\"LD1[  yMw٧( @p&䅋s%RYe.0NDF/ʆF=v2 i ɸ+~ȹűgGOĎg>;#:!&"Hnf>aנY5HF%$,6(hZA 'I/~)Ki[uБ!`lwͥG$i.9-7֥TODSdfHn;GL  dKlكto7c< <ˀr=]ߏxk_.w}6.iu] a:;d"=GkBɶ#Ȏ$G*]ɬ%&r{DxD(|o1ﰓ f1\$\R\ʠ \3`r ]LCA.U#YQ0 BhS #VQ{Zsv8 1p7o)dznOMOKuB7Aܢrŧ{NM.7gr٬R_8 ȴ*S6I032+#b+ n:3-b=G</aQd=1a󊇬e*1O9HR2FCL&y OEOop 9K.2|(م&ŧMD:謬 :";'F=X~ޤtvWQ82t_նFbJ| }Y=bld*WYd1XB B*1NsƿIcS2V[I{$Y UB hNw`YƖqvp'qB>G@q_e:A,3]AiѥvY۩wVvCeo(7( \8Kyp1m딜>u1=?/~PW];6~g#zis:&y:fawK\ [3lgv [C,6Ti<Wh8Vt/N'v:Z]W8pY[Y57<:Y^PqYm`7CZ`kK.hŝ?MU51k՛tsJyu6+ǛawޡM*7cSZ^O\xSE溢s]Eq6ռ+n_c[݈Px'jtlhu;>VNnxjȋ{[t\LFXt:pf+8.;U~ >ph8;ݔ,t5:wEeŚ\Qc w'RG u8jADZc'-ˍO:H+Q#KF9fl|FDw4lLϏ^Nc GwY1)/EHLhrt$p(!fsO[/ԃ^+9dc췮t4yc3}P*<~`o_,Fv˛>8y:I ?Ȥ VH~0II# Up͟ <3ʈ.ꁳ4{d=Hά~r9Bh~_֪~Q(Ky{![ 2? GBwk Zz-O-4ʭΛoZ1X.̯W\r7d?4XnT\|V<Ȅz*{+ZW&L|`y6ڟ~޾)BA1ӹt0{5˻>d5~kVKԵ&sON v52h0.FJn)ƺ=jG.^Q%^gOO6BNg=W4&5u>d.n_+zϥ i" aK%Pb/pw`]zp2Xjysr$!F0뵩s(h1 ,YU*x'?=y۸pJ_mqp/<}}HAgb# E -=Z}K6ӜSA;[^GZUj[7)w죗t'_5}p7v} NA [_}>~]~g,b۟m;kS_yLKpG@{:/fܠo&@^%RQ8b`T㤌>hˈ RjK(Xpi1 1$wPkeSԡx-YZՍb.rk*o'B&3B]Aŀj^;%XZd՚K0ɍ{pMk0xl,7i](o]xl'./Oq8}b7 ̒>_U{ls>SFkluVGXlSECˆ s=qY鎇ՖD \d-YrIduq ?D#"uB\uLzȽD0N3@\F!Yrr>9E `/7 5d]5M^2CnowiۮOdjk{02rPnMYXmL o4g'Z!coزfelGJ6>/|zʱ8ϑBS I#13*))RH #L>rLDNiU\ Ifbx]c3xz<[KmI>A$z9[4^Ik= 5UT{季{{ aכvWd ^ o ~בJv'o,/gFtQbb:}>^!o2FlʼٻX D-dH\ɣ86 Kr sM&U,e": ҡLH1` \O^yd9x.linc/g]]꓾_X$ci/B "9*2 j0:J1trk4*4MZea b.CDʙedw*h:5g3?G8F:$P9$U}q(l$%:{q)VUU8\dpCg˧@ )!VZ hfVk|͌h)0Y9;?Zjjrgjug]>.mOK2h#w[ެmMmڦJ7x}ܥF4f)YMAGnƜ0G+\j]C[mM1YT7hkcDHD11 O ޤ. 
j#c܏J5,&bO~x{qQv{ssseƏOGyzd3Y7#l01.TR@N"0a9I{aeb( AHQ`S!d1l}Q9#vN⚉y4=kIǾ*Qg$^IT FHJqK.&eH2* B'%ye<ę<%ȁ,dbȊF aMLo" #a}C*2$:_Zs֨_ݢ*x,XM>Mw*;f="U#Sޓ gNEHrFw1ۤ$H}tq_7ZD> mH6\F" 1FA ,ykG՜%@8o::jRbٳ$ċ883J F 2!*d.'bǂդc_<P쇇aW)d7Dz<_yn87mZ7Śz'~f HƆR#(  '=$sni^`ҽfWڧZ$%+ξmZk~2u/G'-KB6QI,!k#(:3#2d \hR\fAm !JA[}bLǘˆsHNƱfhDz,>E2f֗QNo”5~w浟3LnzwyE'hMEfv!Qh GeQhTĒXHJiѕ8^I%-@}IY/S L[MsW_z64ƓiE#qP3⧏DeLD\MĹ]jںDže/B;IIݡ8&25DdcR#,lS2V$NVֳgz8_MΈ}ȃױOH NYA+#Ԓde~3CD iV_ֆP Y.FO!K·RMFʒWͭ!Jn԰|$ $=E}tK8Wu4)wKE@O?3h\1YD- * *^sֆ9 `O N5,VkP͆52b2Hr\LA$ 2*Pp8d\)ʨS(iޜΕxcä<6\"7٦R(΅ "ǃBQҞaI[{$)S>蓹*|e.7Bhʥ*Ha ֒0҂Q S08HqN<;̭qȡZ5:AoicufXMI^z$6) HM)؄TqX)f+Lz뿶⹣1S'd| 48Xq]J{;j)NN'Ac*P(JILV[ˆ\hn%3dziƨix (W$6!`BL%[@;f2Rʃp-a#^jFֆ0%B~~浬߼Mqy^ 'X:mui0r X`ˈ謠T"|Lf(vV :|(mW ")`FǤ"9̰4N+ FHD0 AP$@2?I X00n?҈@W%%np5<Ƀd^)(FRy炅jX|- ^`Y0r?( $tF)X5KlQ̈́&1I;Az T-9fNE -2M!2p8Bk'Rvἱ׏d*+ C6 ?+OC3Dh63{;HrݎU Y6&!IM}KWQq705_'%bx^}mx (w/.Og [{ΰc 9e`SDnY_atl:%x}:/U*aY6`MH9;tÅåp,]wFEa4KX/VDZg&Cջ ٍ)IJU MMov}[( `NB";sZAmS囘^qRL.P&ϳ9y^_ 7կ'?.ޭ`ͧ(R\ ۤGwfXa}ڞu]7IkYe >ƅLvhP-tz059k{%hw$z+h9i*Is`Kob<=|&gz %wω FǙ\;?^^p LŻ[u+0 6yj-^13Jϳn?Sr"^k(!ˉ{Un6+(>\;ϧ7KSXQtd~gs,`M>}D v0C,g>*3$q跀x ,f1grHxJ CC_z?$py~0nG&UÛZ:y+?gkT.N9 ﵒ$&VQR1q*(ӝbĜETZI}-{lȚSŲ9|]lRZކ,]:0f.m'Px_βl۰ȅnz9^>G54mg*.baC܇vo5pX7wjy)mAQ`t3>>KEjkɾp-Xz96g=\5;jgok'A75Yv΄y&6A~k.AJ2~cd0n)&NIO1X1Y+Dnq!ʐA]:s#sc=h,_E:LF(ei aƥNA*ƀQ(Q6 DS v: ?pmB:%ndQɩ졜).>QSi<FAF`.EpTpp%V-/_UA;[Pwgt=E%Qǻ?%{q4ϸSb c̱|6/%$Qs-9˝RZhȻӤq{g#{˽D^9G1F: lp6HF'g)Oc.*5#͇>rwS oOw?<{2+zVAgT\H5h%[m@', (R3CnyJﯽFNpAAeN7e/נKB9c#!Д{&sB,%I kRR(3Ɲ(~QOЁJ>[sl0T@s5H{G֦/&ROlJ;s!549wJhR EC{I a;pxy3S bx:5MIÍ[&A"8 @D0$18+ 3Ηԡv,Rg2D'Hkh./%% Tc1g;Jaظ6љNl^}tAzwS@YL W&hjkgt&KOx a' ( rbAX -iڲ ) O (u3.}zjUOy^1eR6<"9X;<Ȇ 6 R*e 9 j 4O+'uㆉ{ LqYsxO%AT Qdl*Ic+"WZ'aM:k1,)F{RzOI a]527G-?* [Km($eI2kwdSQ-8 X+>U'zyLw%$h-yHe*mw5ʓ ̈́6Mɧ:4 sW$;WV`}Re͸M)2"Åc[r<`?0Xwڂ3Cdb a0F!cHCvRuΌЍ]!؈bm Ոy0Q-O@"uV@Z(^!PR h9?6USb,`ܝ)/,xrq=*CTT_yzq'٧layĮQ>89vt5Cattx7AhQq$x񋿪e嘺  ߭{hWwPftjǖ}F8,{̓ƄZGjovHm |۴/sVߙ̦V.!J9wjp%t&d16h <y'cSD)|Z"pG*g/e8Wcr"*(dp^Kj 
i(uaDYcj./=_Qe[;zȢ~xyW~fdx^cݏ~Zq/~PZN PQ&vR%re!R ّ*U=nw_;~}b's 0V,P2+!W_cH!Y3~%V. Ɇn|V EXfA F >8Quc?k֝5l?7[Twc(:.`!p(,mN*d3 GRƍ\CZb] |Z"P %("e&Ό4|-\rO9[OĹSPԂm2o]o9$Ge8= [pc26(16-zSBТd :+^Y̿wki--}Nƒdt2ZEV|^@JH"璪G(S֑6YcTȖp]]#r"rP{Vd UJ5?eګ~68ĝ?MXiPlq#q<'wa/1(M;  ȑdtS MK|W.io/5,k.W/wuO vg~'F]^'uQ_PH-zϳI lG씃k<ŵ L-B`N(cj)㲻ލ'&{#&W_Zwqo^^~S}BpzC:Cv;]ܽ3.?eί4^t7Wwܲ3Gc^Y aSrv:gO{#Ī]ْwtW6sX~eG,89c`:O=l reJ%W]:Fq#?ٌԕ]ZϕފcP=ul?`2'lϿ?N޼x)BY^r4LAo|3rI *{b_f\.l~ t~1Kp~sa΍с[Mϛ_neiZ4-<ܴ[_ W:ղ]8*c !6m4IjJM&>+O2_@~yd&QFT0x a H4 I{NDʔ0ZRcU:7t &HKurg$n2Lϔ:bG׾;?>L=7[a \j␵JByg ]IN7J@Σ\nprF΂i%0Qpr7Z`|s).&cvMbd Ηo50詑L>]R&C`\)lupFԎ(lL>YB*p$mq 5)^͸*"z%`t0x5{Mr|goA4oqGǓZ(=]%9PzC00X䜧g8}gnKIؙJ$^bГburȾ V}g+ߙ38_wN.:qǗ{k;2=k}}m9P %"qyxYоA"JdMa7)aS(-qHaŘd4d0BQf a+eNE Z5|' xz||J ^{VcfyO^3g,CDoYl.a]^iA(=ZCTA`Y{س-er+Gm)iGVikSU,"Y"m BtjcjqhC>qҵEAfO>r{QUa,cHZ>4ؓ#p-Bo|ĩ fgtSO?٬/1bj!_ޝ52G=p;G8O[,1Գ d# Je;6Q{ۿgAF+BQ02T$o̥h@G~.:K*UcN)($Ij*u6*0u6Κ{`=~4@TbpCРQIև%_KBH>P9ئF*wt9nhS@N9!Qug i5)R`[%XN6!QlhBG͞NGާEZo|}3l[%dB;'|x0wMR:6g'\1 ΚRs[Fcd\cj.!P a)Gc7m]T qZlI  @P̪$,$IWs'kqju4,cXcw;=xEc0|9w!<<~;u3:( (&Rh C59 bS mntdxj:gU )^s;")DHS[pǓr%VֱiԎvQ#{ŇEL %-(V6tlv $kmH_e`䒡A8k ]l?Er9d8$E %ҙ]Ħ9͚~TUP l^[և0q` b2 Cv4H5>@CJ{ҴvZw6Ҩb<jW#V:irlBNFi Q80Ua1TXW*>p][=kc)(f5:W~$nx-@"R3gmeą (CfMFy ?L]?R2!k_Uލcw"Zy098 nX8HekO-N5D {IPC* Eۨ֐okLޠoKj~WF943"LqRLKq0& irVn0ߗGw=7qErIM>}@B B] O_U 9Ƣc\J\x ,{3tny$<]h+sPsn*]vZpfXnR;fIM.Z˳kTk.PHx3L|sWL#,8ZI}#kEM8KG,U,ہe3n6dQ|vwڥ^-&}ۇMK?[aÁN+a#ǭc!΀-æa};E7lޓ^Tg:M»>cnv\gs^8+\WMb-vԓDJbB-`88GsƂ4{axuy;$: J0>Ȉ`kE)!P(,HqA]SQDF$ JmgP:stzzŖ4_n ]ۓnkVUcz҉*z ނ/XhCru>k|8KoC*L~@6YRE@!' 
Gь) SaT?BS|$#۬ZV_OW@ՕW[h~UJ4i@  OSfz33?d߽n&&,ʠ탃^W2xٟd-`wߟ^r"# $g\Kr1mEP6`_N8_r30a$ϰ?FO&go +{X}Xni0ɫ4;;3H` wMՌ=>ʫR4ر^jK/0'q,hpÓX7EXv|I:J/s>Oh;(-'"#PXuQݶx9YG?Maz7{O H&uXneTxK{u{/ø̈́u_Fz//NM*@;`k U cJ#68R$#Y@SJx->cR =3>[78D> ڭs]hm4& 7mAp!YnF?-ۀNYPP&!b$ .:S s~b( PcN(VG:r{@J*EsFѸ貗J YJ:k1ۆCUY  F}eӣ M0n("HD5 !AΝ6ZTe5p$jD^RrXN#E#$^~/)N1b*Ly+$HǔZ!(d2gAaf:.TE*L<ʆ)K|Yrbㆀ%0l\MZN_iIo @rqD!eubsf jx0-:L&}Y^X0A`%,NQ$@ZȵeTR^A.PO;g4^N4N|i2ǔI:`h 6(hH0c;0| 4O+'u/u*8yeqRO1*Z(26bjKk+"WZ'a iThO;oeh 6CC:$PԶe׷;X߾-6xcQdSQ-8} y.) QrA0Ŭ KcNw.TC<"}әָ8eU#1cAaySgk atĻ<r0ɭ&WH#(s(?yZec-0G Bb.DyIΜb1JL Z-Pq|5=Ɖ}Lzs%G㦣Yٟ.[{˝viz&oC^Tj !=pi&ڳ,(%eq}XROB|$ c_-AB^DK[xg7Q`R]8(VJ]3Ӛ3J6'D˗襠Vu) ip^WE5Yg?ܶqW(ƌ"ޔvZwǝ.&F~ljǯ&L =[tUՠ-W4gVSWa)LLr]E [\9k#`oYe&di`fh ug׳j3/No<պ W%{ 8bNSPK Ǣ7;"Q -ݙM+GŁ/Ǩg?c0,I7؀D9c"221+p3kE%t"/fLh?p5X:G[/=h!T.E\x+>0R.ubcwI hkGsfqȕ:/2Zbsbω39bGУ}ڗ\8 `#) FҘ *co0aJ@@H`v}*CI*N/Tj4vCq`:A0WOpPS,e&Pq.\`wS"JȸE6P9&j"@k+7#p+^d&'Y>6)d]r|8":5Ѹׯbwlamx7=ޤ艹̝A6,=#YnD@|HB,Syi(e>|2C EE6"#wWWR!{uGw ]HKhq8cdg?Sԩ1ft|}'zḐadg띠o9ropkdQ:UTܵ fdS J\/$MKqtwv:# 2{W,i_bEz[0lY{g4(|d ̍ q\RL#4ml)l$\.e&y$ViǬQ&A_b^jF[Ζ)8+C{$@I4hZ%{<"< k4Zbd¹Z"!$6-xkiF=7WYO2CH$H&nږ&|Hlv-WU!raXu{[%./W ^ R#dPM/\VQFگ^W` -}t~:)\BYKvV: ce MN:@?إxd׵CWZ:&G39hM\SY-)~Xy! vȲDGImD! S~ 8j3%z!w;yNQm:9awi3 h}(XhhMˀ}e1[Γ2pJA[=Ty2/ܝ-}`^A^K#JZ4zpViP7L(|9C1^}! M|!)x.s4\q +@Vei219}Tak)h΂ݰafMpvBAOobΤ$&\׉M|6{dۉ߼ѠLұq{|6\_~q1V*d;x_*!8, ۓ3G!fv/O:: ?IeA?=[N$TS\.ڹC4\%(qFUQsB &NW1'CJQ) @T3\ TF1EQJ {'՝ zz|y~bk=;[)Ff 1KݲwWfYfѤT&i~[YbI!02Y>5V=NqOأ#=Z_цŸ\5|VkdZt:ׂ\d XbO6hrQܲH{:TWkPP59Gmbއnٌ=B,>(e#COtq?~X)>&B%Bթ>\5`Y<k/UPgYiOjuBTc2)KhXzeB_!tb} A ڧVɡ*ӪB1-yyFKuFrՀ :1ͅ0礝cm*%Ūm-Bj3& Qglxd82_Gv灴cքMϽQBBw+UZ1m[=NRGYT+ JtZvXjʦ ath;bz[nlandj ->-Q!51˛vs%9]<<gyzfIWu@-nƂ >4"žTY 9PRl]_WdbX̦mJ4*NA%vIW%TWw6[|>+w(XnұNv`-c=+tPĪ<ɰD-xh(Um%d+ tg{X3! 
N CE(a8 KEj;[w6ÑԯNQ5ø/|hpawM9Yį"EM!_.h/T`޸l5KIcl|D'ܘ:-љL.qࠣ(B¤vB(w=n(1˧bõr^g7).vQOvq1+[(H qW[^DAAm@ObRd.=&CX LM5|6w7|f X7E?PcrjG?UjA9v4`2p)kOJ֦' k9UKU]>7ӻԚם5] [j,\v;%kI=U"׆z ;_*R;TUee *+&_>snJ(4AT6 BN8AnP$~I/0(OjsфbP!(O%kKV*(H-h2] ǜWsdL6l6}7#mit P1z($tgyBP db5M]]2ˇޛ3JDdW峝ou8|s ƶ\A *W#-?;E,N&B.[S Z!6Uԣz܀:MBvY;׸ z5o +`ydV19r.]2 HM^(7{*lv @)leSK!0)I.v>n @׿'?ک/_E AMc_ )gR!yXn Lw*#ƞ=ž52YǨYuQ&qD.# dk=Ƨ( Y.1Z*cug˔X7tXW>?:\Y߼Ey%Y))$$ r6*QR 2!7,Y 7E?XWoT dVrLtkUHS($L4QM S!E>UG=YU R'\UU"լsb6=GWZE(D廢kAPahZ,m̨"iAgֺ!(L>ǪlU\zcI>QvTˆǬ"f0GD^吒1lUyxxp{Nso/}NNrDլj;G88Mf+_X{?)~n5e_ }^Vw=*Wgu*Ӂ(yY۾\F)#T*B9c,3_<{çibt"YUFR3r# wjn[˄л]"-Z?ϱp96P.p4$׳$37Zo9\]޽4TheVOI?44^~KJ֊-%şm{M;nScyƷW o;#ԄΜ'zzrt[n쇫fgGoqv3vaA/ǎYM sc~׼P>ʑOIno ) +plYɷdƙV@0 DY-6l- hKPN<7VeO6<X,帮-v cɶb.%E][' Aiē__gB_+ }ŻŇ|0Bw^&t' >j?7#><"/3ݝ/yX( ˋf,0,Qɪ8ދN{ilCcNG xc&F;+ OY =_m]Z~i]^)~R8mNLpFIHliʓ ZFgۇGףwpdvv|8e|:;Q4d&A8žI<*8"o=(HgLJտWV'$D-Kom4Z~P5J8IC3j${gwJg7plјVj񿓆iλ~U]xf9ۤz .Ś˅]>*_l!IؚvlGMl+P+FX J,%FSaEt->GsrBcbq1wW9~96?zwthe62iq[EY_CWT/-Tqix;(IW~a0]9»>/|/y?\~ӗ7岍I_*-M\RP~gz\Oͻ0o)nzЭL58;P>Zg ܦ6ݑz:hSWk)~<4!W}W홛?7dy(nՁfNN\DչA塙'iY3W `S1-Ycr+>> 󗇝y7Ǜ2dL&kTQJW_6&S9kY1U ӿPnSX2WЮBoeWa%EKuh2TޖQFK?H_&)s,sHJR]<!Rp~ [_._KwC>=88Lr jk"C B&-)Va~a^;\c,aI؃%ܗpz֪Ԭ6[}sX}:>dqnM!Ƙ=|ɶ~eu;H)Ģ+'U)d omi>Bf.}{vYI[&}˷woxسu?9IٓtΚ|.[t/;c%6w/ϸSOCZs]7w''׊D[ژ^]4ZںݥFo\ j/u}*],lw?-]^ˈayh{8/|9paA?Opc?~~SVĥ SV! 2h'/j"La:-a-vj6ƛJpf ʐ!cF={Y^kDvR{ y8Fwy}6P_e3K/vŞ~ev0ϚOK앾PoT(>4|/*zptpP~5+ְvI/tzVWQ+;fR33'WL r;>׶^(kK?JUӫHTF-?4Ls˵'價@]t4Nlv+mV3?,>#3E92e[EƟ%W|8z .O>\tTE-:wʇ"r6yO]Zֻib1Hżl_&nxs&_^<5|ťAzdw}jJ&p/YGz[WcKV2I|B:B(*  )w+jZ&iݬDm+o?3S0,[9Dg@PPl1 jma4*Zٻ#k5[W1 R~1~eY؇[Y5i>~yyZ[?.ZGJ؛:+t%QM䦠$h=Y9K%HtdtV$ACBnsEŚ>Y xaPs_}L)[; FNʾi@\ TA17I%nJt>3A!fRZJQ+)ķJkz);u֐l5rc}hNi\Y7&W5qUiY;-$BΌalx kl.Xɦ"@t)SV5?7Xt@k}9YDu 0h(耾{ ׂ n, k2֡ -0T0 Oo!%W 4Y.rj)xٻ0FںyĨt _@! 
ZT%A='."gm7u@N+D$|}_တ2ާ{UAUiM̵4✪H%uhfST{Q:!MZ;S3]v9jcM.sd8:Agxe z6i3!9㮃Ÿvּ,T uɨ|U͆V\$* "R`bR uqDg()f5$JL}KPly/2苅0LՔM>˔p :j PYX@SH}0a 0&U.~zdR4XM䘣 !{}3q78&)a\Ü6 q F+V s5TMX\&Eeۈl0`A&z9EBVTEܓlA#Yn$,& \a_3a=KP`O R B;2-l8X;喪a=;8>;3a׷=T\sƲ׈f5+C][gf1赫P+gZ9&mZfsHY0] `LDyuhgIE^wS^" uds \Π Ze$:YY.supwz k!% Y!kt4ba!wT|Jcwm_bWC]\E5:rv$ۊ/#7v;M3uWHih4m9b,Թ85rB(L.R߾m|??SB9 5W$` 8wV>cᰃ=%:.vd&)1gt@9;&ˣgA+h!a)hL d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&зJXSb%aAs=&PpGVjLo dQd!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2]&Ӻ0hpgL hnr:L hI @ @B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 Lo ,q''L (4WSa&ؙ@a 1`Y@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 Lo t/܀0^\_'vgEwp? Dr&\O>( QaY:Tjxtfr=z?}%22F\Bgi;S)7?7(rBc-S!r&7}T1i؁;vu=_D3FØ'a= &Cywܥwbt=q>a!@x:Kq >i)ulIwZ>&RQNBS|`oQ?i>K] Yh:4*X3_\iicሉ8gJ~uމR~w4s!,ԥ)H$M"EfL }McBCmĹ8x@6ΌbaT:A4d8>D<윽%I nV 5*<9K΄H%+[ٓB)rZ UT_<>~,ʋ3;-,^bX@g끟{.Z6YB8[s|0{0f=gP-y/޸ 읤,YR>Ad(䣆;39 К.X8@ KJR`^}z:B\F+?x ;nޝo\dBXƁ@`"k&V;WyTP |WKQ'asw 8f&ٳ;ӹR:>Vئ0$6,O˦şEi*R6arcL6L< (b_anrg sՆVr)M# g8H908;hmTާWaTn?}AӠQDm娰猔wDo߼^|G%@Ӭ60_ų^2wJGaC[[%?m5i|ϢѽGO#ô | K b<&X&|[a9^h/(t76fCU a|"S€+;>o͊[ waڼm;h,ly(eAF/G׃ldZ^5h.h4 (( F7j8SЇtquGO.ڮ^W?YgZ fhH-7}SSSZI2;UpVt&ni=~j곿6_}TBgr M[~.]~z%j\ծDè蕩GUL CͲ&/ZK/GVe/5Fm^2`m{}uÖT[)0?W~3 ]`1UejjVZZw*"./Q땩UjA[?]2<]ʼn'-MJ1L?Ʉ 4uF34AIRev|,ę穰(Lj@2Iyσg Y04 ,u^,Ms锒N$ӜpnLۡկ[-z>.ߝr>ft"3t]js_7dbzWВaqVb;#1{ +1vzE/h^E] yGS&ɈֹOT9Rl{㘱,&R=ttqX:AJ ddɓ\+7F:>ygh9͒$\CkQ67,,!O>7 p|ISpOPzTzR~SrOղfr&FS XF}K,wΤ!i*4'_71qPӐ9r_NAo48-.XPa?|ݓZV[z/A ~UOGY[uҖMm/)E`\}ApkK`. W'vl\E?0Ժp'X'QG9puqitLh9ΝFtF]G۫-DtN03x)-6/K"y9';tW*&)MrvOuqblW[@Klvn|OkZꪠP Fx^ Zr</'<,W.>;fNS9NN&`cq:Ӊcsv40w'oAܥu8N#.xNL(so?h؂ ݈`(/;DZጊkYƩMYLr MYk 櫁yJkgo{_h ["d+x/spr ۾HЎ<}^u)SalYx \sD2&&e(!B:eƐ܁J%$q))\ZǍɡqIy FMZiYs1<9Ѳ:d;]O7;+X]]}/`/A5?ТlYOP/ZP!c iK8Xg~nZ Pm-Z^KzDϫ%åiИ4Ԉ!}x NG00bٲ?Ҭgy U\a9akuxV. 
fUo9@ۥ.wdW"F{ǥˬLDe<yj2yyNy!TInNxE00؝ɪ|seYo.}^ h0 h^3ng _aK]}%ÏpM<,zƭ@֋=h!Cl[/>!򦖕r.a/~;V7sqGTWxw)8WFR-r"G &R5ǝefԧj>;p*7IJsfŚI$˔{) d)Vxt%c9}}4OrV.(ܹp|!ܥ0sUI ciair,Xi^ɜiSMY J%5 e92%yj F*IY׹5e.Ge)LJLN'ǣ0P}?=0ZQ1Z>oQ(~ jκ?+ږ߳9nQKnL/=dݱf,0:Ͻ鹡oHRC曏>{xNW?~Ybsfi*EK ݩ9Pv?kcrl[7ֺ߹,򗮒OWqftr,Mro=a"3L)冥L:ʈiӄngRp8.x%9qu>r :ԭ˲J0taÅI@'2.Nr |[xD K94)NCj$#3aOf<"DBIHZ{(a\q#)ʄ693kO$tБdVPH“@@26Sp|bRa(3u:NBIii}h5e&9i1O3!r^0Ctrhur!S=ɥ:MuLքPSE6WQ؆9Omq5,Beȼ5ѻ_EGهFE8(zQe2ٳעQT G4>O“wE<}TדguɯOBK<\>󟼋:\GkiV/v^cu]uug{V."|+% blD VcOFEy3̮̇I,Og̷Eg@E$WaV脓" meڲm2v|\.z&T?L *~doPJyE`z@̇aV{зi.q|[Y 1 1/6h*W`$qBI@,-٢Ws?sf2iq^yJfZ].i /4" jwKk}h) %u176r#⏻/E8@>6a7"=xlDz==bK/4ږǍEvVzY|31^T-hpqq^]hoxn+/Y?=&˞{(+>\_ q>/h`v|$a՟f.Gi~]M񻻗BKe<$[sH쳭٤s>MB2hW,T K]7܆XtJ/\Lk"C& Q.rH0Vh0"©:!Ilс;վ꧿b#gA;jibRұhJ Byiӱ=zt푏캝d׿\m`[7Jf/#M /)bfw [" ]6$WC `&XfJ|fu̳隹:IU4kE8ZaBH([N,Lg}g)YDIAf8(t0F-PEe$&OsYn>Fmj6왧\w /?c&y2>OD6;z!vd$mS)x"I_^ܖ)gWÍ O&mW՗$ AkeEJ'JBh]l??GoCDCFYeV9*Omdv.deϠTciS,(F >@K uE92 Xe)ԓx;]0W>ݸFkq¼pf=n ;׶>A j54@1e*CJkV@\⻿fw$ٻkx80mعuڛfmq>KtX0 5e5j3i@V}.aTvS>h6@a@W뀨JD[w1;qX_yFۈ9n5s$>WnPrQ^Ank)7EiqVJ[|lێFi; 7'^*fZEr Zڮ(Qvv~޲׵e6종 t($wQGo]Ů[ y r[9ѯYFW5I&F.ۤAUy2[w,9>-f.?/y^.Zn2.J-?G,7fN^WIW_bW'!=1t6@|{圕lzuH.sKԱ"^ߢWG{du )Ҧ-k"SӍR `P"٠Œ:L)hrbV֩m.^ugKG!{펥HȼpH񮠧Ġ*)S`bs+],W8` l|RJ#rU&o,K B4*x@T7|(W>qI4$A<gBPw7y^VÅr2X}_f';X7e篜J3 XS?r({!6y9/lrcҘ[3[L߿ĮLbp^+]LJ%`^T UF#4kT=\%,BZ%Gm,pp*B -vrTod֝؏tn+bg, |˫ƌ$@οl.p<}Fy? C:99xrόؙtչZ@Т(oTq,TMg TKAUmVЂFٜ5 6m8ȡJl0V@(UI(ug3bS^ 1&uڱ3j 'haJUHIʕF1tr2^Xv _T(1/xX3%HPD 5HM.o$/Q0Jձ" KH3vf<c=)5!℈Z<+8A'ޱ7ՓhUdǘ"W,.9ق5>qQI5|ظ8:V6ɜ uBEH"iAJI5?/l,Osݬ՛pbT&EZhJL\gMr 1؀ZdYQ+dZb YǮxh:ᎺOaPq6U,|yŘEܔBُ&xُHǖhH1ާASƞS68Np|?ʘ*Ozdeʮ}YljMNN6QX-dhl$/9226+r2WcjٛiܝW~*ʚ,Kylfڴڠx6aeC͕ S"V796f DPmKNsrq?0$N 鵅O!Q8w\)Lڨ} cJ!CK~TBldCPXbUuΖ~6?7[j[>b3VCm4 X{.'1\CF1ګ26P=4|k\ \--pkUee0*+/Wbm (s2^)Z}$vJFdMq}FfrLtHEE-xq 9{2REeM8JMl0of[UWP:xdr#xD%\A&TǜI-Uiߌ Mc_^Qf":|1VhrR%i9r1/u 5>꫕{s2WYȮ$լ N')xM}D\6e$-e0%P Ұ Pl!9)_Tʙџک/o"xjCxU' . 
-7腦ʉ|`S53YƅS)< D0aŸd α֝-%ntX]>;:\Znd<s͔!J$•XWڠPL 0@Lłu1o9+.:5;öt..>i[d68[ j@ *Fp1H&Q8B57gۋ}?Deiu_Y_<5*1+ѪVgJ{8_in*KJ`X3`1G+I}Y}")Dњa쪮̬Hk$ⶸ,P5Xp[*?;歵?TjUe]e t5%CUVdJlme֑uU;Cl t]6GN쒈:Uj[H6]QJm9?ḙgL?dlGi ~9jގP8=Zt2/)Իɴjf|t[;+3"?rC9(fgma6ϓS +ݜKNחk6-3i^&yRvh2蠐Z#Ǥ;q6ヴ3#~/Vp$XV1 )ri\Fj)Zr Lnn"ΛRwk0.8o |::] Yj }m.={vv1: &/l'.4zizy6 r^+ar{:!ntO˻G~~Y\y]dypvUkaɴO-tÙnMԦ?̻^Z<}NB-5qXxz(hieJѳ6'uU%ouɾV{j8묽+i Ϗǹ=dUGm&ܑśW6&Q}W/o뿽]?}{d?As턾~3mvLcAF?|~Z~rsZ馣,Y8[m~:N˨@iSyhﭘ_nC+Mnݴ1~ުi榭4Cz[k]iZԺ|P!1_UY4^ˑ.Ie+s$gת,fӊe(R(`@#Q1'])mlmww@އTHu˓za\'IG:LuuQbJـf(J5ʥ'kC^ c :j`6l}~ΥN ;`vٺcßwq:>k7ˉ~Q(Z"0r Ѷ$ʘ%#w1ێ=qlX:_ koX,'J+ ZzY8(rR1Ր,f`Y-w*|l,iPgBW ~.hRJx?鋰Y" "8EDOݧ?чI[;xQQʨtÕQ2z0 QfAjc1Zj`#E>:tTͭWQdA|+Qdd&*H.IC\V5Y ăsQregnqSI9BoajpqC;ms fh~ OY:z;޲_:#u<₩x+WC@ki qc+r lyzrz'zketwi 3nolQXZL[lYf % 7 |RR] ?>. wc\dg|n>nI gM xYvRk"Jv|wR [)K-2~CPs%1U)+=5y g28VH?;2<ʷ"j) m/z0Zvt!u2'\)KW+s*wt<jΎqnЅ2'=Z.Be.U*VIdu*9YЮ˱ X y6zpT,)mQFp` jbW9L:©jC׃L=x#p^6误akvvݡ9~^e_Y+ge:{ H{g쵡N=Jr ΍ `F8(,p2V-{ܲG#bQGmjԶE#RZlVmbVBh9g=Y2H[:D.\uI[I6JddRr_9넮C> &z|!B?&yFꁞݥqEyhy??a֋Ec 1Srky2/W9ⴤOq:Jx~gJLcYpW2{V{Cz=X UW7穏ID5 }{ycA+A&aW`J5=ߍf&GGm痢K 0!vVMouyhz啟>9wUVWG ;lśףگ^m;~7gdN jLZqp{ Ex7^<=ǟJP^kb^Զ W1FHȤC,uZ"rGni҆#۾Fvs'm" /K\Rǝ /<~dqLE9L0OO@ ǧ93U;~JV/+E'>Y3ϗ ݅9P !Y?W7 b.w C|c 20b8uqv\*^wp/ %ݪ@,}n?//=|.^gƛu>q{ 0ߍ~#ffՕE[Aw=7ry ;mr\*88=tl~x|~Y%4n][ߢXTG1}[Wh7 _/ER®YZ%ieOwŶntndZJ\V(AV[Anr"J_זMFFa4I YL\-;KrAHz֢$T(cZ"۔QŔEᖋD1ߵg[9>eKHY_RR(OvHisd*/|20N%c0$jnE)Ukt &Y:Mkd^X걥Z%)m]5BJᨨj"KT;(Y|a-l&5pRVš1 )ZK#I`(+ζ! 
S Z_,Dr,I !S}:ڄФ4IXPQɔ* FoDp 2:Z9TYٚ,0Jx<2N=A v7*gtǠ}?h?feᝏYlkǁX^ V%C(d ).lm}A6 N*1G]69H%9ʂU/r5;sn:|(p㛤 F`vR?I+y1뜵'Q\2CDAioeUVX,䒳"a6 +]A5+ )D!; X{d ^]addE0LR/Ѹ htސ H9rFbkQ@Q$[ 8gsh[Q7B{dHa=(EJ\ PldKS%2苆ʓc0mV:؜#vuhTK6ec]ZTRĖ[k8 ։{ƅvҐZdRjJ\Ç Pʰ7TdAQt\ЫVPQ',R =!nAVor5E 0WKH!`6&0\zI2 ), RH,dU' Mrʴ]2MNVɤWX="z͡ vm$|x&YP sN1|k8bu0 #|`Kdbr9`@g<"XQr աJ  m#+\ܛϠPg2BQ"!\͑u0  Y;J@+hl?FR!uJ]لTh FE]%A/9F4*``n6pL"'~aAN k K҈H)d"BU^42]h8{GD䢃^F}x3 nEf6u‚չA,#++UL!llLe F"`;/ 7Um+~'3hOYssMF"G!]JS0}^ s@ nDjGW EUR"vw-qINS7}0ݜ^vp@G3sÑY_9/h$Q)v%tp]=d v31Z@sdwٔ ٨c>a6Q{F,9YJCᴐI)PqsII^qYF<@@~1PK-Ȉ"A;CAa3H,(V3!DtIUNQ[CX3,tgFLYYk,z޼֐ڀ+U)k\zK, Jֽt ߠnHtd'|cP c0SNC V0Ҙgr)Q?u]SLƽI+F 7U6:!S5{CQmbfaunQZ:zH=U 0vsFq=(/z~?ځ W{8%$/!fgjR'BhP\!p`9]z!F Q.eHP T"QO|Os"xmzjz k&j 'd" R޹TsyrM^U-O0 0V+B`Z!p T!+ y`?U|@!o@#dBhÂI8:  Y$Z5Jt1C"1Jc1?Wg燝uuctJ.׉ABs5de(*7Č[V2Aj} ^=^kO`JpxޛC u߮j3CBRPD00eCXnVDZ # Cw]bɡt\D( Y5 WW$hw*YSGXØa`j 7B^0.J:Tx`,C\i֡0|矷> OFXF\ZiXH DKr{7+gPLױG^W7dϪP*@ 0O rT@(gkd!9@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL .رoo_nw95ݾ"N'ۓ@ZʷTv{BKRu[N/v/vv>YOy#2M|>NJ1ד MUACFd _'lK|6`!R(iï[rd|:i>O"(xyѼf2V}ܛ' hQT=/3Wu"s%LW#8*;V4w}z8Sp9,(3T>2a8S/vᬘhok׃Ry F5,Ҫ )X<@'yQ~JaEc:4^4X'p+W /r.!%JUG쩦ՔmcǬ7|10 >jt&{ əcA(~.jPPOt:[wBW(k g_^Stֆwcv 4^j M?CL)%unŷUkrիo[clq(iר{fY|6Q|ͣfL2T'X"[#rvS5r*J?ͧ!LUu/y3?A3Mmhx3}.z4ͥ4%߫ ǣr^"mn!Ѧ˲{ե8k/mB-8:r^0fDokQ']Y%kjVC˔t֫%Bz-ͳn(aAqYƹP3kM *ub t`l{1gw=Lm.,?_kwm; ?\sȢ$XBpfzbu+p+KOxk]|9 wY[*k#1l֩K/$m 7 A@Ҏp92KEr9$ ɔSͰCq53X 7}p&}ѢD /htUįd5%jmw~z6<ۈY7[uwzmѶ"گS3o޳_z[֡GcL V)6ܙ`ew#R?A̟Ao$y leM9g4;4fz5n@x68{S|nliD]MC%{,)`:RηzwxnJ(XLgk5(lk4kjJY6?S> J*:S k!,QCr-N'#=ٙ/ DWg^Vi_@m[~´ 0|H&ATZP 3Jv,ږsU£!zxt =d74s؝`qus{}zg2!5۷GD| zqVzjG1Wԓ٨rӔ[Z/[su$@GKŲ܆^vg4/]А6g,KG 68Aj/uܟ@v=+?<{C@19J$$R,K*eZ2ntVNP\bᲺje:py\ߜ-Fqj|E{P1xyR,9 !&οOC'>{yңo軱y~28Ajp316Ϣ ﵾ'o3Ԟv5(I>s?g']qGL_^Wvil[se\º#0p?dף~>jocÓy3t:sÜL=[tt3zŌdEkLI{X6NFa4y;UXGDr$7mf~*ci#FvfV%糓O)(sb`eeY V6"z[;Iq{&~)B &n@rB@9wbWAƷ->!c݇y_~H VU5Fy^¾M I599!Egg9L.Z ||. 
M}!-fv]Ol:Y/A+|˗}~-ʋ^{4 ^Oc᫛x=޲.][ȝ5SdS<^V[/n.w79iitvfjR[ovW~;0Nw{Jfr %,X*,OYp&bf2ǥG.Y0hf٠.yYdxr 8S 2Z$}N~?8{5NB"nbVRTJ[Ziۢv훦)^t7E5gRW~XqE8YI~֮~]o]jٷgo$ϞώW8>o3Gq r.rzGݳg~?5lpgֆ~ӗAzG뙏(?mOqH ; +e4~/qûU'B'?I@HFdmeCki;b1i$f\m"Nq~:G3%׏~_4^ nNwI{FhAzxM|4NThAy!v2.$c @,>Ma|0]W1\lk ^/WVn4"}5wm59yb 0kw#-7B ~>,.1vr۲vFŋdnx&@ؕd_t ucSYѱ%y71р_MьN z^4VkP݋Xw_eYq*оU>cA(~g6 gj9_ѷ/*za^b?M :%` 0Ttõwn&{hGe[%y.d<>)d3:P S k`RPLklzNH l+LJ48K߸X"Z(1scsYΠʋe?GʐfJ9uWZJ ]ۚܟX#_Im|]sopLp+X.ڑov햔֫]-9skJD!BNH``(.9K_"V%XPF؎W*}W--hkAw6R<Pj/4x +4R B;sc C6L6>x4iO|TQl1;!0 ,8/eNP\kry e/Mݨ@0 0ѐ8'~ pb&0d4_9*PM3m;D=ɲU|Qy>9/`|Zf~w.=:z&t h(ш 3MHP!b|rvR$٨%4'ޠ`Zb,G'{:gs)b{{ry+t%lx44ֳnxm iy8q)!3A*ӟmogu cb4.~Ruo0meAICİN*XN>5R̜s'0iO@ TKbAkBh!H)TLH8 }QG8 7dPyu7jۼ7{$/8^~/l }ݳ1-XJ~$=F} 8F"[.-꠲Z8P&ExPa h\T,zmMJP5:zfՅ 1 bD29+yU,WZpI) 1 xv..PVM-v%}wWz9%$aǭ:Qlq7z W`N.gtٔ0>=-T\VRi e͙V^r8ŬYZ)3>!B:􆧎OΏcoz֚61qג#dMNJRe ,%- Ƥ-OS 'e]M ñ̖k&{s>Fӿ:Uz7*0,Lxٕ]RVYW %Խ@QmnpίR #^#胮kd &}I \B&zoB(UQd&Z%@< \[oo`U==;%N'T|9H2!OdǙmɍ^8MWh~ÿVd!+%FY.NhaBO a]H $#eVmӁ+$7 A^`0sc drFlсnw^kw@O#;a n_ ?p>|Q$ i.n˒vF^uOoSO5a8IKg\<{UO/Ja˗[Z:Ҋ-Ed ή{^K>A EAG23sa$iqbbBÍF8&vBip}˲!7Ene˻k3yYc7]puIBY7s#Q~Mk&nzx|=Iw(Uw<^YT.6so)GsVz͵zwRt8 /;kp"ukhz[h\<)oĂPi",eپ#=wfBE>XO7rn^xjnE(@V%MJ3z(*`eJyŘPA<1*1V!Y ǧxcAĤ'Je,U@aQ`ك Rmq^2f/ξu67Vl꡵'N#4' wVRrf߾~XWL27 BD\& ]#A#d*K}BY)oc4Ir|ԔRhVZXr K6SydS6;n#ta2&Ys!>C}`F9tJRT"W&VgÒ]ޥM7-guhͻ'dӺP*ilϗGl7*eg«VHURX4?.f&`8*'"zk]ҊF ߊ|r[)+%j3x؜Z<Q*Qٴt"UN8rQ Q2<&[Q2uPFkDH)+¹"YY5Oh id` TQZKeTMU)F0kl}Eݶ* wgb^(ޫ#[i DRJlsRH2=)&i;BƂⱔ* jk}4׳})!٨6,KDCF,J01O1WVMU̱+ˀ&2;j_*Lp%m6, 2MJ(`ΊzVM-ի?^ͳ;7LrGJ< d,eb2{;n>kK GA1]F.fžFkU \-3]!< O9ӌSkWO)YN GV5OviOz^(i:8/- Y54!rs&RHF 36*c͉$+&_= x 21UpR,*c@'X%e=t)5ՋlԔSf;|COTV9e6|\L.2#l#~[Yɩ˂Gd96A2իG=X&ϾyZ3ځbbZN;r[ǮM.]{q2OHM'6Xj߈貖F>|?,?H7•bYk^*h̒PEsVKe#X΍"̻,^c! ZK 8gQ+ q !Ki!Q[ѲQi-qHbov#{|浬<]ТshU( @-99<\eT^ fb.nM)tLwU>_t 'SVB0qUP1phRpAgmfcQk49aɑ"$%dnNVFE"2;XSZKC@ Pޑ&tNɗ9ӽtbOXrH3+K%f)d.. 
[binary data: gzip-compressed log file `var/home/core/zuul-output/logs/kubelet.log.gz` from a tar archive — contents are not human-readable and have been omitted]
F&[LyibtU/_j= ο-+Pw=4f9 M/> .\h嬇 ,B3ES9)"Bn2U+~1)؎jqֺf aX{Hķ% d9q¶R?R2;W/.B<%$XC]|@V[3pAsi b Uj<O62(T8VXťS2jNQ^mѕ)zz0LV#P=*1m*1Xs"Gu\z3$@İ)4:!u%JPL8qXc<YqLLHH5OB+h8X" aeELҘm tX"\V,U#AW YU)E@ `6H9DVk\)IA۶qH+ڭC2(9 AKedbF3Vi"Ӓ[" )jAKLH"ѲF粻 Ce X5_JZѠ}!X,`nS#M0pCA%}'%JKzNS)K,(6%Jb. H%44Lrq)$K^2H;RǪ-mhV)R*ܧeC>PʥafEd4ͪT%EbY0N!!0$m;psl ;8X{˓є e!)0>aWw `uYit3>ODZKN"q uFAؖ-`oQ-s{9DGsK<B=@(eq:{+mPX#0 eԍ|=*Q"qgX8𠄋 p0)p|  l*AU P'b`p`i~$wHKDX1fwW9uj;`h i@>:s~oGG#cO|HXcCS)&!q 0."#o> ߂5އxeNծo|393"aA,sNSZtȸOe@U 4g`"AgsZ&x[ 3;}vdvetQfv#sf. mpf 'Z M;|YVa%Ȝ %hL+Xl[30"%77 nl;^_5OH`n@YT4fE\;YǚÍIC.I[ ynfoVXaNִ9W`Ҋ94ѼR|4%>?c(QG8; _T. H#gz=>*!=uE"5 l#r F^lS[HzqYby Q*hii3s"^&E;Za;qͻw|'C>;B~PWXƕ E)A0Oþ C5bH 6t$X$Aa3<^b+t6KV Z13ˀm@Z2) R%Ko%+9ުCG-pYvzek70շMKn#bHÏȕ.hdV~Iߊ@yrgЅ# r፻6pi2aqcoxFH&wsZQnM 7T+ăNO;<ӑQbaNaJ.4?4&1,4v0S+zI뢰r(*m'Zz0sr)Y $2K%3@5>H)xMXR:9H%LzAq̪c x{ IĖ#LJgFcwKcd#eDܰDw :P{F 40hHNݢLKdX -x#jY,lgMW#l`;+tqrx53Y}v.3jޘkemp]tyG9L]AW~7&(̣l2RjKf.ME|GK 7bD'+ tI\(>]RU4lH6t@J2LV,j9jcZ z: HsDHuݓ*)gh볯v׾Y}IoYq՚"RCW'793Nqh7M Aޮ93,FYu#uݸ:k ./ߵolGⅾ]]<GB޹럢%9!\[=-{{ZK\Q=M# YfTcvpvd9z4@_Se59ZVYҮwG&?:m\ -7  g$yXLI:~q=oQ!_m겝$eJ"uX,^woFrʞk~{wвTH4M:\z\ߍ(:Cfj;!v6qɷȖfZ܁,6Oj(1`u"KcK J`XHő(I3#4F *BaSĒ%V/Q7nYݶ+ ү•}!gxeH MR#H4fsB5K(ÐVEBEn!,XEjK V`#dbeS[#(bnb"*Dk O"1R7m&E !5B"UwUAȢ׎EYſYk(uK(!|kƓvXNBf Zl[}s Rl_p骂կdBWFbBuv08NEeroSFaHL"a!s%$"%Q1"2$l̕5j[E lyop+6qMߏshiު<*ZP{D[rӥ)Mp&Nx 8>39K( 4( #A#h]9..*WE*В}žmDE~{>wHr=˭@g5pʭL] wAi &pт$D10#B ^ё #?D(D#HBAd gVR//B+*(+ %J@:&c1ճqD'Z2iw"aqB bˆ TIb$ȧ¼ 4\V |UK%\ F(S+(D-#ZHCaVPxaKU ,[gAx+ `* 2Rc9X#&U7sLyH(r@EŢ"wvn(3FQA+mƁ;g-Xx^;s5 "C t2fy}2vff?jT*Oeb푌YfXV+;; uHx5>Xղ8IB|1 et@s@Hݾ2֭Hsu)@? 
!uE[eca,NjlL!9@ eq˒X2˒ڲ$,YqYvڝ&tv+!C@$ FSE+fDCuM[KEeʣ28.P6;+:wo[`4G;DKlšPmLDeHаOgnҍñm096Z#֝ve;ZͺB|@s3;`% vRC=Ggw?&7==Y9;JveWz^I % OD["re)Ȑ&C LQHO?W+#C J2TQ)OT[2)уî j2*P!>P2W ;Ÿdd1lh#& ΤZ)Cb{ɢ5 Id̸H ?u$#<7]m0-V*nܽl,ƥw[U#azL ^]Tg7LG,e5IU Ƒ E]kyra7EB&#EDYDLi""xP E0b1Œq | gU!BY#H )P¸_ؕj")]BX(B+2J!I,4$Ė$$8bHJ&"iB I FX3(: R@X,-d9z-˱2_*bSAnj+tuL7Ky=c^1 0\clF ;?-~ i8)=R%7|_sZzS1@@ۿ{zF}2ͱ6 KGY"1uѐiw1# WT %\s0r_H&mXGkބpuVkRiqrz$ q֮-.gVo)ULbCMD@%$XipUU}r~Y{yZPgztYY85p<'ANo,flYE؏S0vxp hlJ 3M7qjIff}MoO{nȼvVҕ1;6nٕ! .FtwO̝f^b >M))2qugQ8 n]Cƣa+ϟaµї/?wtGe;S:JƎCNm)1w}3K I:u[{[?OK:y#0yw9}=/{ +luؿC{e;v 7u\ 7iV*#=q5}RzT$&Sg0>u;{6w-:g!Jw\RG;NE] uZ^<ol|7ޝ5q*T[Wg;;7 ̀{6i׃^7y{?Nߺtǟ^dj!^If_|'}}N`¬V-#2ZnQu=}B3swNڥNNm˭n<@Fh '<$>И Q24 @aeT!,1=tbqC^@MGb5kX˫s ^LG-fqɉw}·|N(rfT:IW'W}&]xYDjɚh`m\q( F8#>8u.}CPiʌ}ܳ6VhtT">twO|`0Φq\)9N>ACP&\xl3kW&vrwJm}v}7!<d\fަJ&8gs?VhHCZUr i-[ExFF|1.yOat5/.WlTwOyʮϠf0w~J1E]rl H<^u1^OwO{ VmSҤ Ji0 n\M+noo/ F^m}7\^Sޯ= 5b)~CJ]Օ?|·mw9<ᵦoĮVa_Y5'hVY]Ԭ&GhVj:=vYs%jW <[$j(s[-( ︔ҹM- c(a5jvţi Gh&۳ΖV7 o|7>4; ʋ實Gd6uRT3sSTA%ku SAt⫣yv̽$j4=dH5dhnC7[wc2R* ]/]AjwǏ|̠UeʾqȐ3v9_C:6u|p} g\?[ px>xFwd#M욦6@╫^3"'  W?[οzݻF|rʞk~{wвr5PZiͤ~=H"WI?/s՞<@m1bBv4 +.<]X#.Be D _s;u)Q2RwƊ{(3Ƈ9hՁQk9h+ wN셄C"vEmvnr+Vٖê9 C7jh( WP|^#m߲FE;aB1c3t-ulgۄJK|UU !oFAnAFAл >qw;9S z] BǍS҈7E;wcuGhET\hs-)e㮙`rO%XuB۝'qK[:֝vl#V˽4F"1SB\H<6D3*ٲ17@CFq`qkcp^;8;M5lLA͜on3GӸneOv(&q uњiut[vG@xײGhSB@2Wņן8]8Y;-{3tč_O:~w ;傓pe mmLe"Y%\dC& $K1i56Bp%ZP>R`*rRb*Wr)żl[ۉLG800c"xP E0b1aƸEUBJcGW-6V)sE2 2KPj^%&f(Gc. QXA/)W`h.r_/./W4(?+ 霘.`&ޛ__~P1]moƲ+?\'-s$).XK,) ,)KL٤HJ6H$"g晝Y4@3L6AcP%j:d'g5q|UhݺRA-ԛ.یJsn(OPMT׮hӲ3J$cMKyQ2~,NL/ Xkk]$6XQKO0T by1*VΡh6քy͂Ԅn;х%Vc@go*+b篈NZ/S"p8{*Џ̼ŭ}<.w~cdѿ*!܃݄ Kn6v!BEߢM $#l .ŐM݅۫qehmWm9y s f:6a!pK2~^kOoI}q{d.帝'hl 3uffn ~2M)d}ioB:^f&F8Ϥ|ʼn#[F:(|;x4@s G>JL100~0 [)Bl\^(&L,>6) Yb,)xñ~"xNj$ptLݽOIl'Sқ_y6ETۓt-U\ mzDh 1Aɥ7>\XǙ ka>ʨ;L۔S8 D8v{4c'߻<|1>%CdUNxFM y8>?p`ɸwv8=bq~;|5ـ>&__'tq5/O&t;3"u4_S͠ۋMbk\զWAzhvv< W7#;^'ɵ]D\FLsYO=,ZgՅ擛j( iMW/k88MKC8a?/Aen2Ltqw2ϕo$1ifz?zjF&)J/"_?^~x;N(dpufd\] ^L~ ~! 
F wSƝܡយN~?$4L?eڕF 0&;I&`l_F/ƗqطS@d70̾O`~ }:a p]> _'oiqVjm1_5-]3Iepgrv],z%,sAdBGҸLL AϘ6&!ELsS\{Vgk 01,l:1 3]eYMx,2 /#L5IlG*7y0EA;[,R}1^/,Eȅܧ͛GL6^GPkM]G ŞM`JR⭷=R| x{p5 HJ H՛oƛW@ 3MZUN*`ͷE$˪VG[fWob2jXQ;nekpb+jBj-[+.zl*vzeVE"Xl@TQ8pfh$d3EйJ($b.P)Ž<*+^@r͙ˆ+4 TJ.uU!*$3LB2GʾpwN%U^IO*+ٶ;4{U <_fmKt(V!#ɤPQWpbxn2% .>Sjh3r֐f|S>)ޚg<$5dJXis.V ۩ hr$zTL"%]aZ~0R3hܱt1dt:>뼇"侔 ]I(]hQ ,q즴Qg[uiM+jkt=d%[.:ME< (0̒8Q (Dʼnܫ (rU@*S/]JWe}{5PP! ȧ1)j#0kD$*T%\vPOS2RpHw\Mjz/}q$tdKvdjz4Ɨ@YGA3l(E2Z]\c%sRYeeedVRi&ǻ9Mg@x2nS؏m?Bk,$jn7HkR3'W"[d9LqC0r\~IjMm^(_ _,j8@(7 ҝ&P,KwH"(j+3Fײ;#[JLj:3/łnc_gV֊QxɺziP9'v؇U$^q$ѫ٬n&%\ͫӾ5 j ǣfvcb\ ̓t἟dzGpTٳ+0jy p2 y[ h鳽FAs)s&!Б^\Ž2-C 0ֵA#B〇-1a(F2H{*A!J+])CUD%9 0]*@ G&RT.`m*@A/dM&7IoY,}4A>qJØѧ PeV.`}{7]B]j@!$ˉy'yx˘<Γ<*$\\ЌyınR _E693jyKuh.I+WB]j@#UW%y'ˋQIvdIV>>k\IA>&G0VR_H1ꙹ ;WdD<96Xb˦{/ѕȐ1k%ǰ ub'P;WP rcph&Q!޵X6ך<-bc _H08ـ]* :L~q ynIFৃW}1³M?nG4Kc1W;YMMзkOU`0, ! 3s8mƑ&<|7 4 )칌%6dq7YCz yW嗺y`W!Az5vIǩ3|6"'Qoʜiٵv=sEA\Q3W,_4$B0*:$*q`tiPxD#"BRIL#" QAWC4KiW@pl@mٯOWj>zvfjL0T`(Xl21GFZ.#Y  $( B'Ց~&fD2YZ "łlѡP`m 7&斆D9mI,˵L2rF MSW C¹Q҉ԪH} X*)W\ZZrU\\9#KrU_k|a89hKnYcY'$ؙXkk1 0lPKٰH: OHI) R0VY&(a""04 M" L$'P(XkG'"pgUSDl)*jN d2k|.]\22N jP ^ !*7st@AcpjX|2mՓfisI+ɖVB㝏Mh=xLҒVJ|Aa2[󜕽wqm6 (kknQECCN( o"PU'泾Ax{?kK3zNA}v]ee*pyj9߿~sw%]W˻ &-NA0qADtioN8O36{qrRP/ML?'drQ7Ù] p#6i1Bl]uD T`q)}Ϲ SNC(B< 1x\B _Q^k#\ʙ*PXVUߓQF!uTpܪ}tQHxJ/0Z*wAJE1X(RL~H\]XpO ƨRq⤷&z.R!RG8* V!`&RL-ETe!1\8&{単^CEvrƥ` IH0]#/_O0b!]F.0.pd)2n+dL Et=rY/djo{oA&|% \ Ƿyu)iS%-B?6珻 %lN"@5i.,BಅeKw GHXk]~K 3g1isTy^ U.%{(]JrkJkʵ*D?zK<`ˠdkC$q#f, L%}L.u0X}vGy2ȎKzhž^qʞ iMmZ.:ZKO<`M>k+z(s4*8M+>cY<q"e9G 0ţXq`w ;kn䳛hp]>&-g|*űY1-70WB|x1velVW#oÃY}WKbg[ʁj%V xm!t15HI-u\L=sS: 4N nFNڲ$P`M$@14@ݼ4LJ+86#n7γh >%-?T6Y?n/$U$LY.|gbd<ˍN E{LO!>AD.)o}Aspӛ+ u֋:N4L~; *pS [2 t4caV)X W;"B!jjMr/)}/U$\TdCNv`Qyb$?{ȍ/ R2ë1Y``7 Ag6 ׶7$$U]*I6KU.=vI` ~}w֭a?f晠zg1¨6,xa|L*,%+{wPK˽;|`\K%SiC ;WV0n 9Eͤ88W!|\[QrpB>ٝ-R5z3ӭǷ.yEaF]s_.՝]^C0kw C-3meIݶ۶~S7~F[5#ZJ:*.kZKijfVY[l*Wy;`Mt npmJ36Iٙ ΆP-Ih\7#".:dvHcTw8tsNj<4+F5ñGрiakLBPrwOm@u;<\gN *~̛>hq?oYAznaw[o!Ąj7yClγ:{TirRq0)lZ\~Ga`&Gۛ= Z 
>a`5OsEm3.y_32+jֶXZ&TYMB>ɵRDH(ZO"dRpg k-?[cg S78A^9兙bf\Tk4fPqabq;_T^R{]r^qmFT$:-A>D죠A79.H;FiXU˩9Ay'>ILw6O>;KD:_ɨ>ȻN[/"/ыX.NO)S3_L+O/Z^gȩ륿;|1قyL, ɭnSn@ަwՏ5y{o3|n N SKxlA%D *ֹVè#&S\_-v2XZnq@auz8UIUJF*a5']Gjp4R W@]FDO,41:p8 :K9r.Ətuugc` \΁\!PM7cxn5ao>BZ,dY#e* (Ct4A0j*śt`TMJ"7@=zz;9ːsmb{ D5FAo.l-y1KQG#V"!{É ;z ʥ 4>vՎ^(2K4I~*Cecdi9a`J1hJq~g`*9A0 Ca:W7g;#_1z*T5E0G6,'Bd"ԏ5s,'uTR9edan~ ~P-4 Y  +r4/*8JL`HIѢG)4!j ޠ^g 7wIJC0D!`" ?_ -2!ȇyؔDU&2}^%Rc4XKLJ2!:LrbRE$ɮ #ɹdTt!Ag$ }]G$љEG.5$᭧/RT 8J3d+`p‹ZGD7[3rCZg HWiB1tL4Fb0*d<5|Tܲͧ!J\~Ӝcjn+,k\i@2G\6h',I6ZT>=4dZ%Hd#P/>{`Ұ͗D=o=+F }yj@dCiM)p8ym@” =nRu~櫅9 u~‹?$ـB?O`wN^N^ψ &0C0 G]\D)B//Oo_ .7Ke>:}&ΥE[10PQ0+УR 7QMh'!!X JH^<Mw%qy( p= %H-q)2b0MĆ6 mMB iۏ=`T;y;Nk" L7ދ,Gr!Ð"7e< u*L5c YK)ѧ['sh373Are@F2s)1|'Gw*"ݓ5 >PG7ݓ]9B*fX~8x,r`2VTg{@;]s7't5~Z[BMc77XK| VMtJDKBEI.|ET0F()f-*5MEԩSвrcMBbeǭذB ٌ9L:V}8M:YYVR44в%j̱:S w[*~htq? 5|1.&镏LG!*zґq8G2&t{" a*]?U{Zv߲*D嗿rS,j;9ϒ1Zyϲ;({B /qUtUCj956bf:xdF0+:B=TϽ}SXJ-9eox4_g򋃛)|N~fǏ_{E`x0sgbg|{v~+6jQ㽍mQ-o$Î}~_sqJtт==[%!5༯TQ!+I  NEsrv_cd3V.2(]"rD<I6sԥR rA2Z}S_q"R䮀Ŕ;Wխ}M9+ cWөr3jmW'՘#IAGeHxi^ј@)t,dYbD\(IR`@nΣL }H4Qpbx*$lGm>aǑ0:6fK!hb4)˜SVH upS,GR E, LFM`]$VrS)x>GRcZxލ;w3<¼yeuJ6O|Ԝb*斁|9EגGrB2Ju8h4Ly-L"L",C%Tl%YcNN2^-5>1ƫg*h2dJa!4 `\  fRl>%G!N6?-y6~`Ơ3l>=兽Xe Grd/:X>06_%¸rvY[k'.L F l qCDc8rj!YS (8Yk`Fu1MXnF;re6!)69 )Dמ@NB$ԩQH:4 Sv'8R^'$|$BǭOMM(YmjNnuBwh(($wK\iz3UsD5USf l%a;G*e;ygߥ}pMO'ΪG@~/=Si _PlhMci6zD(&Zopy}kq_= "G&a(=*u)DI|JRc"QmP;=e(Fyp׳d'>NU)"pz7`F B5,Yp JΕ d&5{(m"J$^ &Pi[ ʚPI>)PK£vl D;\OOe:JGigG!n6xdJl"$}vP.~U{=(ŧ KXa-͎ VaB,q ' sJ cPIp4{%B# )-5 p{g!=&}?/5!PcpK+) &]U ٜ {u_"*8(Y@prHld=d SH1R,6-,BEYab D{>X4m#sG=5Զ&{׸"{pe >ݬZ%G6o e3 mv8_h0-Wƀ|Bk.mM!WWt=!=,jVZuƼa6=߿<\M'c{,9{f/#~aCHMiL֢0OH.8TL#y mq?jof6ψ7+ `2dS!̃KfRޫ0̢5n8r1Pr?J'Xm_+|vvD~.aQd 0 ^śg)ݒ#I|4<0VG @9b}vogG]Ξ*`mx(gvkt G߀4h,kF+h[s ("\Byf@8M^ 6ѶE#nT P/'Tf;` D1^^z^LQ8*rQXJ)-S+~,#'a/ɚ}а: 2u=A"oxՍ|,cV7rAc7np߇\N_.|-L|x가#\Cڒb澶dSR9b{κBZ۴ɻ1R{S3G }&iKnoY?+_6ZMLmu>Do,ZB4W5S :N=M`u8S喝Zg1Oq^D%H`7Q+tָcB윁wk}@z6toNzC^&!I-|c#6CZ1ގZ '5*2Gd %ggQYUTѳ3άJ 
;άA3vUnY6xGtkb䳲1i֬ݳ؆%$0hb0^ڔyi:PL̓">1ۙir6Z4yugi) PݶY$f94’m,MR] i-ρ?SAF5P)=[Sul x^SOR.kjT<Ԛʐjj)eu9|]r<'6v xԅk*lɧω3r( <=2Ӌ2j5ټ* .ߌhɏq wuN3p8S/Ga i2kGcmAfY;nX~Ns֒4T>w<,g }0 8 v@R T}7l(\ ‚ o+Zѧ@c5/ Uq8=H1^KK$!>*!qvvQ=>rLhΰslcgXKku=]ھK'+Qkɏ?߾gt:)MWGY+h<խMR݀9 <#,~q]G%VGeB:-7 0m-:]2厄[Rw%ђkhUqɴө\2B+Dh{..!>\2O7-־ʪfE }3bh[ze>r={ӿ[&̏=}^+aY7ZKֱ6VIt[ ,u}$B' ~f&I ߿>|>aUH,]_N|+Svٰ(eLO/fsf_-ODQ Ah7[b1#9Gr_Rtİhn(F 85?s5δNa}8j:ew C0h6_2j:$dyrEa]=" s$Fn$Czffc; |x,_(̇ 2a\P{t2͔g@EΞ XcR{hzXj@Pr/@Iap <:D4 TqE| ,}pvJ5>Gw&\Jp:o 2F9: ecYЅyu:i'ϩ[L&2qӨij%.*DR]](aJз> K.[4bҢy-1lAE#&eJiEs[4B<۔E3AuJ۪}oWB9ozvltTPF 0jN\ p;o1nvQ׵RU6B-PKZujK^L1݋Wcru,ޫ,}gai,w:/;.v~ $By=(/{Ͽl]ohX}aut6.C o>& x畔5-[~z9vy_uJSj.b,ޔW$6lhZc1vZe &pZd^xZ= XVk%&hX}auK^$,>j2-SVaDXa5qVj-W8&x pz/ڭ{#UbI,il)V؁)$njDZrzC;C5 r7] u ER;vϞ7g`|h{Vx6%gb͞;On ;hryv ;7-~s3ۻK&Fk(O\`11x1'bX 95eC=L<>Ƿzf} >gag8Uۖv4nun~Tkj뇣]᪯#_o~OwgRbs8_i?珯9.>9.iq˪? cW<+IeRsKWBn3[@gRqjLRFJ,ǝG[XV%]/GkD͓mo.zHCM_%Y[P͡Ju Rx-b)CdsB^ܒQKSjS;6(d؛c3؎dJy0s=S5&Q-J/Nєm3j wL>YÛ6>IqE.F5v %tjk5Dm)Y[uX"Sr$+Ň^zjD JKPG fi&]ÙVBhL-L{%5æY bXS'Uc{pKEA>U*֊"T[ar\J"1ٴb'~i+Kņiih&Ƿ ]$Iŭ{H NS"py H{pcb 9ƜSIw_'Ir`J.x @kV6hzJoBjlL"\#n73z*Y9T|93&PJok%B7&Za!`sߩ"٪9:Q rpa_SAZ2ւrE:!UZ.93֥wk+"(5TVC29y%ۛssq(IĄRÉ-ϝc@u Qp yDjH bhB*H5hL=Peht5kʧQeK$`kFiǃKI5Bm qMfɩl ڤbW<)WM`/ib;BX#*,H^CLl(c<~l<),\YRu:abFnٚnu"6`HM%Cn@B(`V͒nYǦPK.AkrD=M+ krSŠnfnNa9pFE{oII8c㰒LDQFQfNqxɸ8>g.ڼtB m7ebV;K+-õ[ݠ2%8GpPYP-#Գ<\l{>K+UXQ2Ֆ:`ьA4z)F<(:B1g\l UsԆU !8lS†1 CO[pW[aXY. 
jXT>m2 &"Y!)&(j(Sw*o FkB HdRXHܵkB P/F̃_79ZUTwKzv ЂD{D d҂ sЁ)E@8:Ɠm 8P2RQJ 5R 8 P=)!dZXTDs ,*Jtt[",Cs̾ZS6nK~/6~>Y D@;$%qhp*-I8Eg6e $@mhj~Ecd LH(!8Mƅc~N j<HV -02&.o9#EUNMHD$Q/b9 ~oª"ctu)YcnyE#wrDB&]" Ƈ5h@A^ >[QL)݉%!Z** !Pu><`aPⴰ>‚E d0??(7jZc{e`cѱ˪pQҴǁc;`F&(1G  WQx+CGeQISBշF{*V|q獲p[9oKNO kf#NP'죔JsVlA 0B,XX;u$?rbVNBebM4+"a{qwR@ @ aҬUpb΢-l n: C<| I50%,&ϊ}FXZpZ9j$l%x5Qjc:& LEr 5e@ixPZC"Al- "geV6v x;%vC'jjZe:Ҵ~FXY5 u4Dpv@tkA*ָQ佒"Z&F{͸DIGfѨ O_}Gi>2-l\&C},|~鉮v@L`};Ig d3"Ʒ8C+D3GձjPkvȒdѠ |'ofmj$%D: -P*pyHPms~A5(dzج**PeP`* J{$=|BbH@a>kd؏l/c*>x9J+E$7Rj4L9F|Gɟg.js|c$({AFROWU5pӺ6)6dzWW19]j] UDn1<ZYPAR2 *duxfUD go}l4ߤ+B5rFp_zR .$t'%iI _dLLJ{R 9?ˤ}́tDIcvH_DWGu,9ƪ0)jXVK` H/1؍%d-ч32Dؔ\xbqQ?3|ٿ6|v3 X URlz48j,>Z.cOM)-gRK0W>y5mŮ{z{f:LWy5(^?=_!;ϫ/{/39gnn }L'y3c>ݓh^py8A ?(Znѫݓ_.]||W}`{?޵0q鿂b]]..'UDeٹ]yZA@ڌJ==Ab4z53=pgCHhޝ8q4< BTN67 M'GpR' bB1v {zXGp Z*Ȉmt<0¡)&(K)K$܌Nk}ml@5᪜C5O5$bKm2mXQ"$Q>L>PF i8 R xD1ʃ9X0˕4 !ـjN5ٳ{TK ߏ : J44b4Bpi3.ЧЀjҝvljI%&J9UNE,b߮2eoWBoYrœM]XRY.z,L9мszA.R˗^7Ai=ׯaJ(sm$SBt>w0"%[CT2 ۨ廝,0x[ӎVZ\D9Oв ڭ)N6hu`~<߭qGj6$3F2EB>nm>=vkʃ*Ӵ:t[)>vkѪڭ EtKtEF8ɞh y][jA.Ǽ8qdb-vd-vbW Ta.r0t{ !{ b|ȁˇ ˇ|HH,);dxV ۲aaTlOl4FJ˼aa(:qOS4 is畽ժT='NHn ljqCNA2' 5R~9l!q5;jtE98> 1 SI0&Fz՟h'EVfbd?Y˺JU3nwnUx[`&PyCpcaͷ#asч哌u0 s>A?J3ztTyFitTh,&ӻt}kZA׬kYOIZͷ;#A G=}'6>WyW^Iu $ kl^1kGY$Ppcg;uc )'P*}7MlM{נsLٞdT*KH%KGm2H^knG$P!]# atҸwM:J?= Iyb 7+P•@9kJA9E> V'sNՃLhRcE ҢU`먎I]>w:;'w6DQ씳PclvTRv60%:;TY.K s4p/ȗRe붌aESt( LE<mMfpa(N@%h#pE'BHhԁrB` ZP*Ƒ *.jB|a5Mf6A\IS^?S岾< xu!bJxe?+iHϴ,{2O1u{#IbV+J TE:Ĵf8%%;;1HDk9c'1@Lk2f"!wgjA&D^ՌYu Ԣt,wk#Q?aEťqGWEǁ7|Nxކtӽ;*ijtr*x=Agl^/~s:^f?D8WG &nNE&KrI%obagk<`:dcДRy-pop4d4z hht>D:L'&̯ǀ c@ WݫBe/n[h.)K>]_-FA$az -|wX$.Ur57w-Ba}|1X/nj axh 3Q:Y432c+!w))X)~xIpafue6.uI'Nwx3gK{h⻫%5SH±-,hE,(832|i#O)挢;j4ڸ ~4Cux9 QԴ;':(Qv*}}Ru2`S$ j)S6:6HEAp꣆6bF2o:t =x$SٳVTUk# D` %kAGZyruќ?yNdDk]:yX p(Wb e.Rߚz"Ê]o+Y̏L'5?MxXpjƮhE8sR}.0Sk+sj7(sdSU8Yh̩fT-Nĩ z4W66#K^SN l]=b.'o#cGLZי/iYXnkr ɍ_.)1#4s5f:j7Cܔ))T 9ץr>GKqUx7~t2dY1[/ft&LחkE?}󷗽%uY&l]g2 ޝaeo^ѰƄ}csx~]۳mjQh鵌ҩي!ۄ/BkQZOWr bﮌ/\"ڛC]{V!rE r_o~<0җX% Ytc]L 
Bx9,~zݷ/{}ia{ d&%oB p‡?1ۆat؛\ op0\\X_"(x?O;߬?3/ @ _-Q~.{ZZ譥>و(F%Uj[SuϣV_Һa|r:Z`DY/G[Q/C :;c62S$vK.8TWǼ\ ̮@;Τ a6 ! ARk߄dˍl8uqdO utչ`H]r'-Is76M28wX}N~?[Kp /'(qyߋL6:\Nfof0JW>?hSy[->] hCJ! +8!% 6$Gcg  ?Rc#PNS^"+Tqla:D HxB[4lSFd&D!e!x X1ca=6MT%b7F֔9˓RZ9h: 11Li)@큆 }@0uZ-I kxi\uJ!а6z]@5ilb`纛\^^_4!!D\cMISo TDw4 !Fh(;WeXKld3!J1O;!EB[CP|7ŀ'X6ӘG}EZf l3(3*h2:&WM`) FGR*`Ӳ7tjXof6e"!8` o:xR:O\3 v6U3lޠt>S#/Xn)wDVD`nE, "hU$?X`4%#?N ))Ch+)sĺL".A+ Z8 bSbU.=~nj zE L4dB/)  )週QVD#qkV>;ZcnQjyE r\s\ \1k;뜹 VV0 .T#ѢIbQ<ȈOg=kU,`~tYLІEiUFp `Niu)<7\;+!Zzz@p$A I d:TS Hce yy (M.~V:g%n?hj4/m0}݃hAڪvUc[j7~EsJn2a?~ju4rvمv< YZ[K{YZ/IyseiYvG^s=bT~IAD ԝssK<8l^ߍ'͑=)y: _ RJ54/G.r30.?@v,Cqr]hu.=.ڐv.=;xٝT{+y.=;\zvҳ; ~.={]TŹ`gwR\zgwQRKϞhٝi=.ҳG(={桞ogiMYySR'YQW2[\ j, [S{ GuЇ1D&y%L|A#Q'6/e([c_޼5CNڷ4܎qtL^uF?7Kmau5eΑN/o'c$V~aIYEZ 2D uZɝ,.?NRBZmkOy0.*956y023g:vH՘'Mc:?op߆ލڻ"G)]C* 6Gw?KyƟZg+@wu3 Z]Zi,\<\\kF]{2/r17 caE:xzcK2{i9eߔ#>Fi݆#:0mPF3Y,nRǗ:E831Uo'P`F^M%eߡ}wS5동14cx;VvSVy5TAk^MeJiHh-X~i)j]Ijr1BFT:V9ݲ '% Fm.x"U0EАj 0?K#9@4Ń8VCAU U TKai ]4@wg1{g[]Kܵ71@KWӒ^zjN9}ٌ W9H%_}֗T78[Q:T)IƤ4HQ8>W; >-tѬ: lRȝ*r+E**G%KE*cYR|Z'kgs@yo(x y/qN|NL "bA2ԯ"w+!g;3ɳx]x-bҀe!u,:uV S P/t,JK;aʳx@\&^wiDauE?y ?*MےfJ״ pUW*1_E}U Mŀ|Wg1M='Pnmj♍[{hjQ(]6B_׮7[_rf ]=`*$:@TN&Ԇp?-%_Q]kVI)i/7P$՚64XKGN/w/ً6 m)RuZ^ZBHK"<^K>[S"*%YժJΉ*jMt̪4])B˨nv3; ۦW*lMם3T} }(w3I1[f})2U&b ^Bjˊzz=J~ib~>?dU3˚[jBIa⛗a-!i`^ExU w?wSȫ)zZkGDUY^kھZULL|`7 夽y ޾2fj WxVu:iflӦNY@`yIao%5m/m srG!ǝȞ^?}E5x>~/.8=c'F?7敵ltNN7w\.O;}M?c;(OA{u 1밋t ~QV>bWwcFRYVǥw껫:I٭ִUmh{ߝ^Ӯ3q^Z^?<{tpe;ԩ*{֥Nc=tN||KN( BuJh7v'm_NT-ػ(z NA{vx#0S{wAh?)Iivłڿ0تIKuZ; Q{C눫JoZG]gT8 aJ'uoH9[950[pРziwkmx[ԑ */͆S.> ;QDT;uKIǎXY:]pRR$5˥8wm~:_Cs 2lluQMRRS&M)bXɪr &cL[kQbժ7aQwyR?k94Z]fVlĽ fn%pYq{a. 
ZSc+?NiT_n?z,זo`0 K C%"4Nz tHHF@= F#!A±DJSdCΡL1rTknN5\\>vDwG~a=ɜ( cx}P$7cwQE4H8߭VL4R?GBS6J qRz N B) &]]Nd)ղ9/_e)ҷݘp{T&P<(Lo3JWd$w־)cZ!m(eYp~:O3%δ*?w%Z@1{ FƀPeEc ƫ8.MB5룜@zZCZ0%ARrtä[u %0G*qxbi^Sοxи"27 NΉޠ#ӑ@yQRhĆ!O(ڟ1hNHBHcmr=+\'廕n'y'9MHN]P4eo{ &/8&G ++ KzBpt'h-v$SqYuaA]C( Bg\a~ǑϘE 1V~f#KKi-;B>ܛ *GU,\ݛ摒"1eTVO,%wV1/3E1|bJvxSwK`j9Q00n4۳i ΆXpg+Δ~!fWt?mn< 5 sEZe|U.DBv#X4 d2z٥3 !@T2L-IvgM gB[Xz4_fԗ?2& gxwl)aL0 EyQxPR5A!FB%D*OrrX` @;G,R* #ex:RMn*cJXZ/qph}G\WGW%L0҅k/'HOߌT3:[[A)Os@񒬢8cneH;E,fl~DŽjSBs8PΡ éSj)~6-va #k@9N]E%+OkJ{دK f E(>?UYpxdsksdi ! ʡ_)iI|>4Eas0`¦&5Yr.lxѓ(ҢȠ]CEҞej̚@ U^2ʥNxI6Iˆ~d;eQϫIs.eknvWLIna{xeZ2 AFh8z6Jgt|/.+'hF"n0KypL+!Nꍝq瓣Z~%)9łqOkeG8IVTGA;/ā of~r H5K"QB?fʻRAwLt̋q6SVЦ0 u ܧk c$":3sU,?RHAfgv-nZrN[= MnfJW[,QxS/و2B.@˽rw34/!C:3*=P7S*JeB%~|J^ L15Q3LYpPCdZ߰肵w 8W a߮c3Ĝbః4dKԠAX'⥟d+ \ Jx&9&e/zD2y=HrLzTiI]r,)%~m΅ bVEůt({W؃򩃝,´Ӳ}N ĹK$ D-)AmejI;x-m:sHHǗGt׊0t]C_8#>:* {!׾]C0I&@"):Aʖ's>5xaiy4&hN9< zRm Y@ۓ-"thH|{m8N14Oʑ^"!5]GA4u_hFu8H_h5 d<ٟJ,;v 2\p׍mmK{> nwPL(cv]eÂ#}~kЍ>*Yx:GkRay=ƓR= gpw?™_~MY ̒|Nqaƴ 2`CSf8d6U -|VC=w0:rݞL9nލ*Ԝ% e\,ZAolTyZ`""#[E>1 S`F$!l{PtrGb##No/`FyȳY)R9cAVYzƒDvpվ tVۃMKe9`CtR]=`(:j'~J]v8G Ȩ*oF,CD # g<eG0`DH"? ¬ Xx+y oWBE+:' f@tE0 ]I%/*ЁjRΡ\E6a+Uzc lιxs[v!ݳ6Hg;gaoW[{T&A\ ೳ@lTϗ&* TdL_oK:UO ܜU$FHșs):3p~g-q90|? F͙"nѮRHP sMuqAλxjb):Mȉ+ o-)q2% ti)5=8t_Ӽ=XE2&C4uGZåa ]2 wg)d>JRH"Xg=*qPƺX(\N,:f!}MWTMछݓ; =e,:5;ES/rJ-0s%;QH,d8{)\W;6"'Qf] iҷh\RB)?1ێF~dF8m֡`J?>Οǀ҅t8=gj^9e96[%P) TtJ{mb@ṕ:bpx(d> 'n;Q~vVEv2yn좉8E8:L`cͣE0\Xk(@xhͶ~u8a^ ),Z.ڎ؜` ҫGFc.>Gt>%*AK[?$ѣd90?VϬtq7G}Q/Mu5?f㯏}W|-o$o ! @Q@ %! 2,,8n\VY?ؼW᢮j κ6ߍ̿5 v`=GK3zx 4SVv=`Bvrږs? 
2|j"E-}b^(u_XYjDA X2DH(HP[{!nlQwVnΒXQ;^xy y $;Uʵ[/~כjJ۳>=7 1gx/"ao5Eo]Nd(*Ewpak 9#7c^ڣ}2bU%DKP-cD2 dY\ YD4V/~6* xHBD<}RpNqNUDxfGK [ijޮDU ͬVRX~[HN⬋deo1 }, / y |Vэ^ a3|M2mq۷c8-x[FE'Pޮq~b\́XꭊfkT]oO&OwFxI|EL."&Fw,OԐ1;]]JV,I_A`bK>P%3蒴4h磪Q5 i}N;[q;Y?XqCZ,K) ga~|׃_@<Y<ϒ9X@#@q{\E5K֖Tޏ7JޯcGjryĴOn\'0C@t3 &?A)ij"C$ 0Ø&"%VFz7jJMǽ SC"~߁\-X 7j0,~\}Tsv!ݺ3W cK7b6bkgd^`6ޓ7kSQ.6f]VbkrܩE<>TIwH〉+iq8vZ*ע7)[+KL%lHe"}W_.RYVƩ8#D'ZժI<s,4JQf)Lm[YYN,}'o.†>)+Ȑv'l[7oAPҞ"@9 t( `Yؘ2i^Hscn0O-^z!o%C<8dJan"Cd-?y@,ewJs;%2i2="Vtx`OJ)٧mdEpqwlr ␴ώ яڮE`w^TpI0fk(6] 3 =`JXvQ{Tjd}b -^.mfhJq0MR.i)S % $p!\t_vxkծtm.|Ӌ׽Q6>v8(66żmMҀ`+vmNMA"J!&CJBQ@ KT)+pJYwŏ[X^BR-=ux*2FN5,!*MR(C!%TPb)v kӝ~a|DB FPߣ!? "zfqAop=>uXU"1)y=uϫ*/ǰQ1Z]Ui$$VVyä@|dY$ v %izEzkQ}Mn^ׄ `å[t{|+ xa{`d,ub __l0phg8p)gfm/xl;cJY7_;]C*f^qsw.J)@dO(!@@_#IX]\u=b%wIA0g#5}o擡Ҧ` \ mT]j%=lEHQwuNn!cdu}Sb) Eק"v.N{Vk[0U1{Q@)j_vLgr{Ub w,.W3cTflf:'[4ba.V:M)sފLPx rpp_ɸLʦXxt?]S`Y xN7b W~XL5v6fg߮>> P@%d Pt0IQP[;{[o+=wA|uyw  fӔ9d0EaXp Z3mhҌ3 I+~{MS3HWq!-ϭ'8}8iB=v7SR[U2b!d)h9 ZO< !ykMz`E j5K!i )2R5/**."DO{å^&ʮn?AM쫋ϾYY[eYs(䐒Bؿ "C\.̧=Se4{`@0ŷw.#_بXT̽W`a#g9@ ì_d&doa?đg@b ìX.u)dwKO 5Jmǿ=a`NKХ Tf 8@V#(f}}׺al˵n"]\Z]*ϳ9LrX ۱Sݜ Pĩٸp4H, Bvt:F%Dtt׷}c?cm\ͻC0@q<=?c Kmx*(͑ B֚vDGWpKF;PP=Ҷs9F֟ :$:z8 Q cA }x\L`/Bc֠: `ⱔH.:Q'5AWHES}'xRjI]߹:Vrh%ŏ-3@3&-S= vM^.?^:ΦJ%zaM0řp~8Հ0 .Em%lwK5@<®rkق(x͋nwi0aebEޠi˽k0ʛm58Ϩk+/nwl+jPKVVJ[f˚2yðkmuݛNt-5ϿwbaZ^xvo qZ/wrЕ{yzoM.l֗(usVSWL0dyZ0mײo^mQѿ V]sڒhcrJEy7+J9qz14I$Nv{10h \L[x*'Ŝ>HCV=8rk rS{m{@Z@p˥nn=ݻһR-1r/쌸ccF|:)sšFIJ7OV9c<4F,fF=~BX="^/{"H]-V]Pᡆ*hCھ9%js9/ˏQWDI8@4Mv^j+'R[onG>(b4nRLdFEVrׯfȕ9,270%{fFy`B`S,1N8{̽СOWўz !nx7x HɄMLUNlv]ɧɇɩgcI/S HpLJɨŀDXf f)lyB9 |vY\S,Wc֋ewl[JƶbHMqn_9\Nqģ(bӳm̶)$,,%ד#ϫMP&fϳaj7[ͳ .A$)U<%\1|&$~BSڽ7k;ڢaӢ_;0QF# l8fkH'o,V)Euc4`̾lL]Kj%v}Q";n` GL%Ĉa b3 .Q=Wٝ@K .5憦 $B(&H2e.% $ B\Lۿm 1+ĂB'7jcqĞI 1HPƉewLra;4IVZ0Ɖ[: RɐYͧ@ KT)[=ST{ )_IG7%Z R", 8PZ"@L0V\(eP:d/iooeM=H)KB|y+NۓsS\yG\V䪦w_Wݭȕ@EG !c{JƼI\䁅훣ic4T]QVgRM;p~x˽*_-m(VU6u FtƑG/_$j/wi+ $* mW|Y}OqvNZ((hHUuEQIK=gi]Ҟ2ԙ<5&gLr],QmbQ{C7S+T፺"PWMo9g!4\NWGxݯdֲ^}9"ȹAES<^>;& j 
VU|APrnܪ:-)޶8\l{`paxIpWY]"@_3}95wX%P((xA6;W8WTiW[2GoWh1sj?OQ+44G-ƠΙ11x0Z˰ڧ<֝|J 1 tut`ȿ%O!$rҁօëġ kV$ @J[*V- JS!Tѝ;bY5=B.-UkV[*]5Tam2A>RsXb%@,(E uws-kmؽ Ʃ8#D'"eT" 'XcF)L5)`'79BskJY63xߜh5/FA_e.!5 ΀J6BXIK`aOa #>ي&#ͤ vYyd$f0S+F830"iv${,%oUlo! ?t*R-sm;j*kitȴ}c'OS<\{9)9=Tt +UGmL@08=ui`ޮ"s0hg!cIygPzv[`*Ϟukl@v8 R|?hB.]$O߃y ][o#7+ A˼_ qݜ 2ٓ M:#[$Of%ۭV[= Dw׍dUU8qw^yVKd=~kL2);u`>cr?Naz(bT%lQ[x{ةq7-c%\Uq焿 *'4#53:L@z&k`ƥ]%n8E1Gw1p#'vz_oVa 1@% sö`Y+ p\@UR A)FZLDHŴ33і . Z6+J Zw +4l6(Šäp|ءѴk W@^ ,_8[~xq6_U7s/ظ{5 çx)sdbK?>\|OUppt;p-9,Gg"Wl)~ Tҋ!+tʀYK/ϣ;I{cʖW} ??_27ˍb࿅y7[Ϳ_Bd)ş@6ڂI"b*o1C<~Wss./bL\!QQDЩ`r&ˉt&XќPTX&ܸVoc^108q ܼ&5AQ:~;kR#x1ޢT_n)J"[U$'ż'fxW?'%FgtZگ(KO֫*1xX2ֽ]!"5,n<9-*!$)~YcQ^T!f+'*!"E<*N AV΢28G>ǩW۝_',:Xu0$W` DD8a:$VBa&A / fnޗ9gWө7L7/V.ffM_t/O2o;?NC-7V^h~/_i|9WcǬ170@mjcq%k"'[ʥB q]OƜ~2njMd,ti|6Էvj+YmPdQ Uِ ;1nyD "}i`bRum)S3$hFr c VƓk'te¬,ZJSFT_<^;!aH߉"K^Kd& X/f8}< 4"$/+:0{dʐ }-SNpen k7,tHJ9`@iI?~v`2 i9a[B0+18i4bHDsvy\{ffcH0`k+Q7d 6h +A)O2Z8j<$Hmu Ԓ _6b.ZMQhXai0X։ "dpDb7*$`d !WZ4W,J8>HSa5*@kK ʯ{PhD6+-|OPB>txG wB9hVΠ@"Φ?g~J[#}[Kc6(!>lkiLՑȇ(e`V%yc8)Dg$vX r[<A7# $`s!Q}ޖqԖo>BV A.,BCEu2MU&n$hyp\Bɏrcp5$ӾV.ѝ}[ʄIn~Fg5%O;c#`W 3Ƶ ,y`Uwi|Tp=u~UƢʢ߷[ZW--/QNh#lRȱL\¯4_WvD@0{V-$j UBUeo$+A8z{wVm,咚 8͒_*K56éWH奄ڠL 6YK8[\yes.z,nE?e ΌLŢ3J ;8_΃^~/=]Ëi6%%WSɋ֚/]kVXF~I,W;z;r:Id!_nSE.[JWF4.ðeЌ z)T:9BOEΡhώ(h}ʺ}'݀/#qك%D`/ϯNRO[ұ'Z4q~zZJbbG-ݕW[*:SR|'GG'-Fwd$OQF$9MX=̕y:Zd 0۔4Qk`k q;b/ba`tû73T L{˵_(vب7gA3t%=n5|Ume5|U-%Q@z%d6v BeY,~-y]%ko'),}$.mtx1s-n'#U8^1XivʹW=9F ~PܪHiڪp$n,#mD&䃺]4@ %'N)fyIdWZ Ӗ5%UR z\Hr?~SE7/)ܑߍꁞ8wח d!&sqrHCׇ\99m1Xk&J67='՝'R_M7D%z04,>Rw|}EJրvzc(V'ۂ jt~KʹNI$mðU0CID"5 d|x sâfvÊG2hYbgLvYTt2':Bҟ9L$Nwߢ? 
_ )_+'JMGt> /.%DŽj[Ej1/tiXzi ȥ*P*7SDPwDZU'(ϧ«B:|W[٩)1YK;ez:T@׉بrǽ7Z.ݍIrxWɽը 5y(%Rf'o+$8eœv#$ISPA V|Q)LkCSBz*³GxhuQ}rS^mv+ZCyܴeE1WzSs<ᦆ|$E&ΚZi* \ܴJb,yͤЌRȖs 4<5㷈'+9K`#-&j"bǙԎh -HL7 UƯ ՚ǷkFb}a/VxLJl2}-\̊Q25}TӳH-_}.F=[/7oum H*\-Vw r?nƿ2K4OB&;FGNS~.??a=YkXp8}SG=;O/:\×f:eQ@?WhdQj:k7 F.WR=*l M_t- k`wY¾o:Tۨhxup]D 1~r_WyCz((PZ ((QDS-wBzYY+ TLYmM_ȼ`]կ8潆 n q//Q `eCb/_$ԄBHAlr!sl2>z؈rɕU6fr%lrL Z_δ +s3 ܨ:P* FLECJ*g[H)F*N~r!Ax#~Y7uX֜L`l)SD_qt`-C|=Phy!I,Gg"ul)===}{z|rѝ=\ ~sg! } ??_O}dq<> oy;c5?ngO~| p mAA~UaŷX!gr,\w|#7$CE`Q ]IֵvaLm@8o^Obf_C&%Nqp-Gr-!@{\Cג1Oؑ|'"y"B.~a=Vd:ѭQjB~gs'(W|^yUU[>$}0tMOMʹ4ACFI/i%%$ $d7f 4PFIUp_#S6r֬Xa|dHHJg7֣NF#Tp 02"R69j A' u2#ݕٕFAEi}7;7Hd2Q\(N|R(5Kм^aq"͌qlY(fliK_ %s FK~?nnr?wy*'ܴ4uBWv@Q_. ># *ޫg;)RM;+uN۴ QdV-[ 3S9=WFHvRMgj_>Wo11kN?xxb O^dJqvsWOgoHHoz(|=W%i{Lㆵeimv{W^jG,5A:X|vQBK_o! qw?}lgsB3k8e1mu߹b4K2yX*9pulfRz~u\\($!_/S\fQwhXׂ{ǂ>,Y>nNG(tLo%axݳ\ wkt=3nU>/ gݼ+dbEP4Z_CA}Z. )" GM;Ƈi@0‡Fk_;s'Zk7a4`\wwi nV|zTW{=9o'23S<(>2:(?9 1.aGL A(up \{$m0tH_ӋC+ qө?ˢ0JB-qT' Uwx k娾gW+DJ[u>JS"<Z1>SL8e|0 Dlk׫G}ÄQprt@j PU) wt9ȵbd׊ ZݐyA@z\d%tPb))">kp تAcL_1'\fc/IMCkt1 #gAf;1xわuen/s =k;^]sM}Kwdlrxƾ" *_)1RI“R6XUG'0@|`( HARu2pZDRJ[r'2q .>qR#KT0P8>=8ILnƮ-92SޖrcU%)~0&Dz"btJPJu ɈY\@ҍ2s*Řsљd @T0)P Dl;_`k|U8?k؀Z>:P(s!j9O22$͢pUgP߮CW5I]kz,竻R{1$㿿}|GW+5'#B6חysqxO]G𪝵gR~u~nfCS9^Uo;;_|%銋 n[ P >gPI ˲?/(`6fY*/ւJ#EE}XR*$!iU޳9ԭf 88tfbQ Ba,Z Db2wMu T͚V4LXXG ֵ|A:I+F?LUVKo{g G ϾRh\xB#Ye[8mtVf/~{ofsHa~i~)|8o*ndaVMEcx x{ 7< pi R j?8q:b:}b9^7~x >@sn]t q3Fɔ >DNG1*df "Zɦ C咞 P:{F9w H`OIOv!p`=ړP"Jg˟=Ќ2U}0NBk>Ժv285LP .~ AeRuA53S?q%`'n_OϑٜH^* ĜL` &"<3.Hu"G,<ߑ?7!hO%<(62!轶LuyKR!:ۂ'QݺasCfwm'W  }*7aAH*HCwJ_pi35"6ee6#`zE"n;B@u-ݦD~ v]uj]_7 _5C ٲTR+UE%lJ *Vw ݸ|1Jl N#R N3A$Fc6ikHs$Xf|dol%:d'MI }<v:BE; W7qn^ysS0٥MHG$t_C_o_MCHXh?~-HחAүoޜw/ sJMJ1/MtIJO1w>蘸#i((\`z>GMeol_=+?ku֟6g--G'F!Ի=|h =[:K]N|ql|vZyx7ܝ8>LtQR1痛Qؿ|:@|wk.뇇i7j/H5fvyi5+ ָ,fuWu~\0͢_,g)Jt-(;f$ eJSm$ҔkJn{wd3V~H>{a8&1"jksM+6΃ B*%/[xCbe۴CB,z ?Zl}0dN-v Z=BwBjW.gTא{cf]ưAzB=;͟QǞfntpCƦcWnל>m̚\1CAuVk*rN 'Tؖ;VGX[b1T栻ga=e싂ј킝n}AIe}~X, 
c|qJ)܁9m|svt`BQԌթ$Jq}JN+]\daf@,סRwDCd o^RUo!= F6p%!? |m3vFJ/fͧX732E*R*ቩtJL2s:BH>N4u[3qj\ml|__V?M&M͝$Rq]%Q!UcSLlI[`s`o2U7~aPtܜ_&(%Ρ|GVOP2+a¾l=c;J JJ%‹j1`9c ؎NRi, j@.Z}2=HN9z`tNꫨϱpTb\0-ջT\ zM۸! -T}dc2.ybo.HBgbp0Y<H(JW>!ZjF?oLw-y<{,F"7»32uAjqy:1Lk#: C2:4Jų,*%9k&S&b>=57;/TzUWzUWmu?VNqebH%$z1F"øWT}WSK˳OAz վ]J"62q8M=ޢ^畲缗{ PKl׋䨋D4 E}`Rz0j驍{IBJa2%OIEdL/*wb"(h1VH"PmDG#hr+)5G6j>OI f|O&Z( #3T~(0?TNRNѳҒxrz1k.߳r =M,'| > &R'Ò."&Hěz@T5Z5p q.:2_H@{uOCh(_߅2#^.59a Ʉ@iA$$QByk 6Ŭ fWqWAP @0x'D _5%x2hD($\8x=ڜhpkDcke!Im)P?0Me .q`k)Z*y4A Iu$rڠ U?J1Yj_gS?\F w,1*>,!P){F># \&~T? x 96PaShɦѠV:PQ{2kqs)vתAYz',.r#;uZg*7/F3bH1*In2]ӺVkݐ߾0C% KaN4*`l|LT:!`uiHtqq?s| rr:WO~3iN7E[ OY2.ZRhkF+evhke}Ǟ9+ЖF8B|J&&Ge .:1hɱ`|Pd_օhHLw7Z,uPG4. @?&}}7d Q`keAhS ܆IONxXi ST)˅ݝYd'nqDhۢJ\W,v {5}7K`v:#¨?}S?M7YS08x}|w}*M\i3uXyJ`9KS ߇̏ː@KY@Z #h"jU m (J ɝr>/Ac"G֑k^uwѸ:$0Y"0Y%վJ nyfSMbK5zͅh'f~uh-~9uK\4첒aP+SC@|NV,Ua#amDH(2 Rm"vĚHsׄj4{jH%&;7.w{Do ʔf~XߵFl{ꬕ4S,MS_Ǐ8qzw_s*]|P]rտ`vM4G'Z}1Fއ?7&gLZ'/'N\> )3eZ΢E|U l,0n4Ou""c:I=BY'˃6R0V qKiKw-iEssf} 䟷2~9HɁFC#阷h8I&2_ujg 74)jiP80,5A @]L#ъFCADkJsp5-ڃy F;ky;ErKêN]vR)@oZs!rR ]I4XXd>;1'#p<! UG'wmGtԨrQ9`# 6K vY8O8wg: dR_-4Hi0Xt}9]/;]_1ffoNOo 7 薁ϻ{7t7:>]r5tԃ<+wM`iKXҴXҏӲ]KrBA3X<5QͅM0xV E%#yh%y=VFE޸Q=™Mv`pHZidkCdEvT(ʅP(N }1M` "N4vIv>L]E~3Wgwʩwy}Xg0yx >wo\ Wԛ%"p#yW98!wv|xΖ&H&ɻ?pNƃIEdɋ%AQ2ec q*u-@%Pptr59]B0\HAB֮ˊ75E$T XVHRqA+DkU]RР &Ixs(զ-H ڴ%U]$tBWI2hdYzm^s•6DO[S~K%W딠1f@E# Ffoc"xx4us5;.?ŌnoN6+ԏV71uavLYSWeOs_kTIBLbʼsS,bI!t7S2g.gŜ>B )P ղe%DqYiYvs}đN`8`(}:ԣb#JTGGQEc ?_noW*)Z]x D3[jH E1%P@3D`®vf9l,xî|5Hܹ! 
q?q n)zJ#Uar(w!XNR4"xNdػhqaRs P 03,=VW/E!pVi%IFE9!H!DC Id43LͲICiAE*{1lV ${OhSc6~f+~T?8dҩ Nqj"q: x9f'+n\ qpD KX'j!Ԏ5 u(?D[ΥO,}IjDN{ڴr;/R_)FLj/F_دQANMexaxa5^WiB#0L-):Gk$fPC=K*Yxc%TH.5ˇA N:/QIJq&uaR\UUH!h&T X/: S3,hfIf0 Hz 2a2Fl%t",]17K}O_w1 nzӸ_>JӳKrE f|e{ٷ*ȴG%~g,7e$Wh$b^oԘXi$0.6ya{&FVp A ;|REG12ARWF*4%Z(#6,=_"6 G i,H@#5i` x#e&̦aP!a"7pE3]H#Q]N&qRblJ8ܐRo"bd[<0Iv|| +$A&  jn{?U}Uqs[aNR@ ]/{FB_P~q:A;OYԒT4H3R^a_Uףڣ,qv dCn8P#N`ɫK;\ZG2]X\눗 a\ƿ.ƿoi/Y :*>owAwE@-iO򺺛qF~9 ~DuNJ B+N[h;P-FiY¢WTqPq <Ȅh|@(:ߊXt8<`vG#\ L< jKͣ pd I^:,?A }Gl#FFתHۈQ#&ZFiI&%CXy"eT;TB$:ɸhAOM)*9)j f;)%BD5mP(q "Tj`LRE  4SdDl J0*n֢TWwb8JC>?Y$wny#תA[G[ݿ-|ڛ|{WbITG7ɺ}{U 7W_w1ۻJ4_ܼ3Jِ\ v:65QxTZJv5an2WY~(ujH](n3}+hwSv NuiEA)NIȶSW&&IpSs%LyG&! z&D%50!"fIۘ 䥦sTY GԜ \q`ˆ! 4ئ!04XKl&65sZ3\lNq $̦hwRi:^0?od%ۅ%w&rK8bi"vLt0t4 RPNEEh3Fַ_r48?h-ZN,4VBK<@[B1b  ɞ2 3k9 ہ+h0CݫK"@5X՛c~a @̠P2Ax0rEpJ ]:~hb w]C0BJTbLGRjD9G_ ɧhNjO;4}qÒ ϧ"pC@Oy~HUf:nůc`JOou5.*nӻ-W7[QZVF_ x'$P]yJLǧ@65]]~9<]^Go!;!;ĚۯT°-8^ h*_S)0Go^}ߍk\-^އ|mzGzܶOGA(^3K90oֳ kyȑ7hb0ֳ!:]K[ 7v ╪;jÀuj73(`PBNl3\)"(l`l^ 8A ;AI>-BNT\# Ѿ]NW/W- - 3!\ݭ ) MSInz J?G˭U;8C8hdžcO&*_%bRJ|qIAQ5X-DXT %"̃JQxut[qM%E`))piX#-~9ӷlO CʀA7' qZ&$arS8mlqskXq4T2 Ыf騴!GO 4\##ir|%aU-T%RzJ4h9.X5GDIyT/`d ?u վ獽-z '|c1_2}RDu;qѐ?wz*d1w/`FM{x\͗ʞE{Xd z>ZV~tOJG+)1\%y>ttVW2bDMa.{5xlξ~.!-l5ѯU3kl ei8d{㻏=`>نa2+˻OdQ9R=ڵퟫS8[SY/8;qv7z~R1uL|œCsޙ5!=5t23Od }aǽ<-#!)f:Tq,9k{GA3#}sJ{W?>wpUY?O QyR(Gy(οxRT{rޓ¢ykXz>=)S)8>m+ )הeÄ.b)0[0:ErrT4Z.'Y6ԏeshs|뜫vT`JzS6$\}~jy!|(ʧ-o?|dV'ߌ]On<䘣agަ~{uή#w:cQuco(j'Ee:}q2/F~~};U Ǹ~ mgB˒MR;mx-m 2}=Q]*QVeVK3~bo&"7J겚3Ŕ(jgJ=:j>8iZ3e\6 G%RF ܎NcE$\zvHphr_!|J?:rs; vuIxCee4g?kU1UӖ]YlZ86$*B jx7F Q_iþjhe#Doڃ$ iZ 8_O$b(x+v:'%7$BpJe-Ho j-RR22$ Q* (6)$ m7 2ܐ2#Att+hf-*wÜ܏\Ws8L-j3-{@("A0Q];N6tͣmov>}ڡTOg#iJO|Uh1 s7K7{%W׵߬[/UlъhI^ϦEJ ǿq ջ޶{E0B;UBiR21W,SIހ3}Qsh JJy%U|SsO*8IB/Thy+o B\B 'A5g76 6mϣ_cE%G ْ_?fh:#;~j8nviWg[dB2j.;,qBUcѶe ZِbC -QS%,! %}& X)%X>ws)QqR~ j;8ku'tԨn}6w `@~So1 Њ_ęQS'lj(KTAqWhgdPTRZ^}{ѭE%"!}xocF-Tu#O^yso@! 
Cd;Y]kBj(1*ɢ #P@% _ǍuH A$QˍM/đ`:9*Bc^ |LBT38K"T)!r"q/\FSIP /Sh A!0̛| Z f̖67GC.Q q w^wqWå]+28jQ4ht~bipܕ'- CF5X=f jyTj8kͨ'@徢wA5g3㟄MFdo] @5wD V%T&%T*k e^0z {p,[^OOCMAD~8QĘ2^0Q#Bq{ ϸEdQIՆD J~FJq}$zWH˞?D4Cd4`hM 9m.JS"1-A=6nTkjeTj( sgjm̯?TZh) ms[0Tu0C`gpCU3PIm#D $\1rYV>צffBb ?S}7 )>ܐRf$`}XpQ $ߘy64~T#4ZONH}tK%*(_ ֆ^E8=kh1 i/z#mH'\'C@׮finFg\eJX.*`IU6$ +q6=cܧ'TrĒj%}-+,4ȏFh s+$+A +=c~=_'-o@ q̥5h衼tJ9*TNRIj1" =c>VoS0fJm,"U56dWܩuZv*VJٻ7ndWy]l4ŀY$K6sq俟ѵ[nY8#+bp.3E(RxJ#=t6P9og ^_bIWbn^ m+*ʞlWe&j/dwO+/1.dC(TE(he_)JIzӏCAfۿϏ(7ތz3*7d|7Yp* B2S(1tv*`lp 4s&^?_ M&w 2dnGpC $ `e6s0Vƾv ЄHԃ" Kbl{~sG%;wQ-wXLH) ֑h1sTZvTKv~H+ĵUaIyUf+6rH:F,%s*iRYB!>*b(1DB\ CpRXI\,d]Id*6n^,.TkXXyPMԽJ Op҆`NBc'U \,UNG) )fΨ):["7Ob?l/Ω}ͷ&RP")Ȼ_&H,n`W?o兛aÛ Xy'R{?;\e"O=yw)ƀD_ t2\~GsRa9O^WS0J`C%?BSiXKT"hLSڜ/4ыm6nht?.Wrѥ!t@iy4(N-3 "I2ZbESX#̃X/'5CJ$T3ƻ`Ad1X("@[KJR rnЁAYC p|nVBGR!i-hve`\ ACϡ2Jwi`%%- d WuL=!!̖Jp) sl7 vP`զOB%S59ۄNS_0mXX~aqR!t](UL8DJӂA@4Z * #dZ(p/W:Mt;|>7)jW7*dm,I,A Mt& -xH1kWO}D(D „ZɃ;؊˩E6yX/Acy܈Zo5U VjIHz +"d TYl&CIWoPwG ^¢OpHa? %\[FVr X r nhFU& AP{Kx! 
$JVt-:ͰpĽQ>~]mx-[2RE$6+ڀTGƸwAH.6 !_ 'wwfn!5G.Z#6XZ:X= prR~Y𧆷5i-x e[!ati+&gQetskH{9FM3ope#5QcnYA#K,_0N*'zŭE\Qx+"^*${vP-oܛ4a _wRA"6G(ʀ{.,EPg-QWH5-EJkrW)U)dt_T"+ˍ*Cph[,J+ b4]t6L3ņ .$sxhǷ ^\0s!hL`I̖/>z ηWfП s}X(,Mi}Qh |5pYkee/I~r}Y Yx4W˝a:t[:54Z^*yp-y/r-~vix>fP/Z3 emx?ܯf/r9fPdx=>ow_-hR*r&sgy&Y©mMlm?\Fza79SӽX35j>YCw 翠y3 :wLJԥX.,N]mũKٮzOpW]U"UQZue ;0=MeH73bw72_#wivn"?\˼lmsy\}nӤ}| _q~ff@$}"AL:u/Ly&,2|o0|0t&κr 'T]3 XB\ndT| b(CңZTZ(6DbɱwyQ Ѕ_H|3CcS8:bbh3fZw!״|Zw![u!ϊ+kR.)RH%!Ōµ)jÆ{៻a8 zhѕc~m@ Ʀkpp!7 nl[d 6q7#knՄSwB6n0={ R sh)}euaH H/l@ukb/-y@k&5h#9홵QZ_mi$oL0oPl!WBNą f^ SléVg*Kȹ]!JT%HPAp*ENȈܗIFVC>8}1s0my4Цe(-.P2"mgg_ R56d;FV4F-:˫]'[r䬕1:JFTzRlW9J&jKԕcK#I!I+d%;sHRl8xl$*ERBGɇnqVm9珑Lq6pE-EGWjJe_Jke_); _zhȶlBW}nPZ݇QnJev(,ܡƤ SD#Q$@ZȵeD i1GJ7a} OoݴQϚVi-1?v]ۤG_m{~Yѓfr8{!7)'ׇ0's%^j?QpF?PA-Ȉ*0*u4TYCtQD"Axp5NawZcM:{+-ֺK;]ob۵.yV%75/f] &!b(0¬qNLs"LH1X 8K9kΛbiS98bS'k3FN.fL]\~jv1db&B]Y!nK7׳hPL/#Ph#hp:{axyo_w5ˆp\h3ʼ6$IFQВM欖B#އ0a)0Ǵ<:pԂk Kb8oS?k!t`0`X.(KyTa>2&8DXy&.8MmʿV%c)cLN_d1'-|52.Rґʟ/ާO#b>s4Ǹ_1eûa4S]벇'^ڛOru,zhKk6%$:8[rOºjP#ffv:msL|O^rKfЪlRaAߌµv5fͯ0Ѣ(\;59Bk!ԵzCjA];( #p :X宛jXjIҺfis'1A "ԚNZ(>R|trLoau!O%Q!3VOukBB\DkTa^FGVeD3XOqDm[h%[r"ZK0oGNuRեAѩ*m<]bz֭~iIք֒)ڑb0*wIR cgTvW<˻&/5 !G.dJWɺ1 VeD3Xt]R;nnMHȑh-Ҵk֍ ՊB]Ϩbߡ tg֭֭ 9r%Sghݺq`Ҡ}F>0;n|nMHȑhMkފ~4(LT/)"]߭ 9r%So]u {ѠX|븛x[ݹ }r"ZKm^u;?ck}PkAIM`@Cwe~*2HeסeNz+*2D{+נdqBׇw}x>u9A]V׳ը&pn~[v=]O8@Z`^IM3]@7Ш&pnv՘՘&]17 1óUWcj̍jB$1Cdjjt>k¬՘՘֪1w5&5AƬ&]17 ë1kn|5f9>31w5F5qux5f-]aM(jZs՘ssB1Ի՘eJC1Jw5ܬ&h!ƌ&1w5F5E PssWcnTĦ;Wcƈ ʺsWcnRj5%1w5F5 uh5fZp՘s ct5#1G*L;3Fr.jWckQ=3@M)j]QM^4u5\_…' 7Qgw_dr5۔1'RKg1AJVz7 a|;J"oa-[4C&% ,a 6@f JC3:#/*Cx]LI)8\sN]pf`nΦ)]zՀnQ~F@$33v[&֌/~_Gyd?'_X@wL}y(P%/\ߌ J}}0^@7~>YNZӷy-Ub@6L +W b13؏~jÇhE Ck!^d4m ̇D~at2|7G,ݯ]HӅx<'9rj3srK?+by& @BdQRiKxΐAmALg<55=ˡyu(xk4%L꯱ G#yǣ Zpq˅u4% <0n0u o eTڲ~dGfvݢǓ}+Xr}L?ޑ룚]| ݋\ {F}pb^[2\(!5hu+BS]o w&).GYgeXnWV{󯚯|k?^H씢R6U<_:*." 
I*YprQle'*嬂OlZ~˺8dUzaF2 eÙ[wCJi [O$&e4i\Rǀ6!g͓ *or ;qku/?+f4û쑺|?۶ twBbD깗IWfwŗ`% |NY IC: +A<*UY ˖)UV4^?JxEu7weB"$F|,',Psg;1pf0 r3%^ČQɋ˼SӞ4K^<_>S)"=yV ԧL9R곳V<[r|\2X`G7UՉr^ L%/g D𿸙GpSPXԿiRT~]=]d[?OTrxIylʠB7vN4:bqx[] j:F rwsS@X?a"* c]MfRveWW"TJq.R_(Ȱ1H\j(aM9s/yڀK%JK64)!5oK1kvXeM^Pr#AL՜7),-Qxk$BeDl T XjM*K K5+@J8Vt}E#^P ,;*W-2< HjQ+ba„t*2UEqBaU?(#VEgc[K0$q$qEDFEMDMmXTD 4"E*b5X7(?Ax0kߧúbۉ7oW?e4q$ѷfy̯&j3UULW珢hO'xו0ו>No] ɞ<頴ը )YWg [LI~r>{Q@׌1dș`}sfM">hHdy1 uCt/mvc5Lً׳2 Kٕ(h Sسaz˟1>kwK?_5;Veg6͸Ư.GN݌z9z?,f{  :1 XȈ(|[w՞횪1ʇN:Rgz8_iV+CJ``$CSj/l}#I]*ECn:"CZ"@>".qYLjEb,Hw% ZSC SDb.R$( .rX1SZI(DEha8 RFLxsL L瀰}p_O.5ѻ"L)Ǥ.1&C,W8(*S^GPa`^1K}c`(r5q^X6e')E8ߜP ?;#Nh) 3\U?1p79jR ˌbOf~ ^ZD9'#tӗ2/TyW_ʜ'lK?V6O5`j>} fsNkg"\q"ײӚkZ3נA(©7~+fBRƷfkW@3 E{±&S %PRIsRcq0Vdz^p)[qV>)NZM3.6EcMǝlu5Q&VQΨ SR{J49g !.ZeaXBePsA(s>:eEG|9^`iփ71*y﹧0%w!0(BRo(S\9`m]/%I%8a-!( 25+u\DrIa:"A4hEZ0C$BWŪULôީiZ@"רgjZn.D)*z5T/>wUUwv^|Ǹ5rj_6976y7m2E|}<>v Ɵ\.z̰b-zn>1_z̠߮C;#;73ܠz{'|*w/ 3ʆkmkY6UXQyP.C[]\-/peKs+W.+?,/0#afhyA>WOn fՂ@)BűYtK$8͉݅ v՛c$Rtmxn)?' 
3ݑKYkZ zѓN"ԕY&KwyoE xyta WR#.N }u3˖ҋEj*XGGc'D`1TL3ZlehJmhu$`)w頬<0$8sY0Y:%u)4@F$lAE!)K| BР6$HküұY!874qgEJe?2XS.:ΣZ:)< 4KXvO↓TsfR{F;b&M;nJ;=Ma6}rRwcW@#SrVL=!CE4  a#Q>%l0DG9Ţ7ЀF<}}jNt>u3"FI׿W_yY _NYwo\i*GW_:K,%lI- "*Dዿ#Uh4̅7 iA/n_SL0@qܘʶy2)A㠂<0~cq#L.]~Lrh\tx)f 鰝t$7^(j_qa9?cgzq3`QQ؃WFՃz0;i-62-A+=gIP.űqz*y1ѻ8Р,caPzjjRxo_hv@KQJ{'/x Nkbvr)L¤]n81 cA,&o[@b˳\ q7SHsB+Tp" M-s>3.]wl[ǥo^sK 2F0rn_Ҩ(:Svo͝0 b-HM1Ea>Y-l2v}w=)ZњO% Ohj?Ʌ?݆ɰtoVuP;!z#uLEL Jw(JqWDciJu'LG}TἻT׊zw*Mky?mwMIM75ʴ[!>'O 9F7p7۠I͇|IU-!t }2UHጧT.$k'9#OQB Ǩޞ,l9]h<$GZ.d2f{|G+4* K,=9B)Bu RߔlQknq*6Rcyf$Ri@!DG0cO Gk*9Q0t+y#MeGNe'q;S-ևF-Ϧgk-+RrlxoЌČѵ~>wbZa839:z2A)JqL8/{a\uWJkÆǓTǷc8ʭuW[QbQlFǔ)T`K #Hzdџ5*{kc 1Uͯ}01iTO4^>[;[ -Wm>67r_S,IqňF9hbD2P\qj|Ŵ7fWIyԂ35-Z9 SF&/[ x819F;G䛋YbʐKE,nlsldfbYg8~-,.K o5eZhXOxŤUƒU`0<(VbVs<(Z.lpZ ތUJ{ x:CR*^I»Zl9u1PQȴ D3)1Ȃ{bw*=3GsWcL p6Z7s@bVi_R6G+Dg`u}Zu#֤޻p8[KZ'|cF Zeߑ];T$Vݚ*2LuOWqfN2sLrr:*'aF 8X8KCRFH)# 9H,ט Hf=M8&iLi ;Z/LJqk/0@["'8.^O/lN66ہA+˜$>q"k GkJȳ .:^Lk!Pk7̮;0:ICX/vxQ qՠ3%Z6kЂUڟs=`qn#*^ d%I)EIAk)u`w i-u{]mo#7+-k4Op{ ~A3Ȳ#ɓY_%ۭwERF`2ZSbUX0C 1 6Z o ' MjC)"@<[p`qOnY|>k?([͵W^;n@cF:q/QJ'舐&|}Le/ FEɧxz?5ćK8~/oS+I~㫷;'>*OaU!{wd:[2 ?Yg@mξ_ldy\ xG7'DPp`oQ_N^Q۾%(0SҀjɨ髪z 撵5o"X~(@Jzcan" J^-QkARoC6@x%g oA^{M47uJҸfH#~TBvG7B+/ylv4_JjƏJ##FJ-q.~)hִ7]TC8>cMLi*b (gK٪ (]T^0e/T>=gO LVQQC\ᒴW5o q,*+ʝqR,ZaJWsax|ByMcp!jE!냧xJ]9bJRy:_^cF^Kj?'1S?Q œͼr1Vڜw=%n/1a΀蕲v\ڑo\DT%#TwUc[S RDuxr=kxڭ E`v[79EsVs!b4sJ,W* NnT.qcېktt$oq0zo^>fi]iGJC5@ ~I/<=?[8Vh>{xu wgA悡Ok$DguߓTvC\]u&&vbϬ@: n}TvUj>,&u kpu2D0l4 QFU.me#f ^MPS&5iӷ1H2go f #k 3uv`TU*!9\$*e tkɖtZb9E^Rjk‚RQ9=ÿ"[jFxNj?F9KV4\ql C*&o39wWdOg<(QxTEs*A PԳhsaPB8 mSxst\)<) IKḅ yao{DᕢԍYpsH%S5~y֨OrLd&J0.֊k(W9F 4JF3-R{7 6Qx2xp ּedgn>hnC5bǭ'v/oS|s'qJSl4*ji/ϭ4>[əNA"pM"`Ύlݢ 0R:j)\nRx1hTgQ\^p¨BCQVr 6^y{z Vh0yL|@CrЂ2b!TR/=!P0[b-js+q1#AAxňJl_ ZP*:2w㺐BHL \R\s'W&'֒ꂐ"$RYͤ=^#SDZML~>G2)Ce5(̢kS^{RSY,b cE˜pj9m5Qʨqݕgzg]uV h(ڜZﺪ[x4&oU~?^2/?OoCVS>fa憒kT.[)y>9ms{7x~ ovX 4s#~hϑ"`kO7ق=rF+Ai)j;Z\[H~ˢ^=#Tcp=:EA#;$t) ?& LHToy߱sKN:Fk( ,Z[zLPFHCƢ cztFңI[ypQ4]9^i<&F3)QңL"AbdK^Kv-$+R{-Ty8wʈ<Z,n}ܹP8Qvu%4K褎 cL[Ň 
f_d AEߕۧY,&dJS_,ⅳx8T btB(;'VUQTfBYF8H+$"+UR^&@kJHŘV|O=ا46L<ZusǫPOJ%1d&8e-EWĜZ?=z5DK^\ބ4cc/*;lEU1.&N#lK!RJҨ58ޢa( eR)*P.8>/>ޒA)Ht4̅,ȏO2EhGѺu_d9VZ na#pK7\BVX+l!'+-q>?pߒocߌ+1͠@E]?ONa{0o<+5P%;^rݾ)\ޔ&FqLPpM^y)Q/)2X]ޣŻ˻ K[Z-. nn@Ɓ73:גDQpO4WXN펓FHK8{6ͷ` bj@ՂƂNj4(B K(>Gn:ׄ&;5vtJA0^ahRϢ&W˯4^,@ޞg,IKmc'fbG˽A6%0Ųƥ&duD.%$S$gg$i+(Ɠ(J{JUyAi4ݘQ Qplk qJ)•u{&9lX }ypS7=(U>Fe^=R (ڋ)A'aְ6D%ޮ4AWv5E{`_A Qz1kBQm+mjA 9+ݞV$Ǎ&¹:#W3$;͠X[a+؞L@`9-x 2\HA: Z,^u/buJJ '[6wC6$f@FAvHɖ0^FMȂcwÁ~`5X/(0)I& *VZiKDTg[xۆb^a0S:Xןl6WoIYyOܴ1}H+# QfᵫL2x{ktMHlh˄W-[c2I].ue%m R΢W3ğ=kXeO/>n([ {I//\V>.pWVu&6kpHA;[EWqA9 sω̄ ]2(u4TXC0Sm*)?z2.vuG'~sF4VɧŒqKP'c[DnhpQD#^}~'L*Y]8HyB;){Y57aCႛݯU IXvV9 \s_(c32AD:S)(Xb*{HWʐ{Δ2N*ՓSq899s# ;S1Q5z$-cq"̜-!mwmMnXe7;6nHfmVMʩqyI݉Iݶ5}AIM] InQrp.e 8o"ƽ0 !=|r`0 ?ew.?]9'@a#qvrE@*RqKUa5K]U4_ & R9 NI 숯jF"iʼn{P|Q3CmLEKs/2:sqd[(,5 JV4/])Q$;~[mg $wi,Yn voք*}u\?~sRvN }:yG:_'߮OEU򋕍2ISOrU/^29ʫ,&6ϋ--rDH_\Hn —޽@g!3曟/,#AA' ΢}APz+zJ%8,Q1nE1Ed:<+tXieZID0F$Ư疧RVZ Jrս~ɮ28K x4i-mOEUVTi<2|? 4+Qε?`l~7MD tdQm*(Px&0 DEV;e_5漫qmg)9sqeJJ<2!ƑSD^"/=5F] 4С&.uS-)R0DpEL8.ɾ?wLm)^]:˖46x3RyH9/|Ah3( ϝo{.e5I5kwu}ءZ0*aTE[|IY3=!w]Ӕ{ՀMRzyBQh*ZG+ZבםYޒT1AyCZL QX3 ],9r#0O(QRG UUN(l9J箢l_-g"R2 L mP8iIEAHf;FJ4Ɉ֡ \hHxT3QO8:*ʴ?jXIFu2}#?&Q'GsޑkxΊXٵ@vV.yGu*MhUo;ng `Q]򶳙pM|#TW(0PF,X5/4|΂%x#h/V-F. Ԫra5ߧc]Mo;-AUƠmBhMZn_-u xiсK-g6ehXw6tKi.%>#F@2p)5P5 R@FM Ҧgd_Zҩ';qvn]*.)W1?\GfHe"@zd=ʹ&Hkʢ!HL9Dt\gA hcАWy6|Ta<~֯a0щɰWEמbW2ـgfo?쿄hxjܻ;gW,5A<7nv7V:ŋNV_'3O6GsΉznbu>r;c$3ː..PA?98F*2* [+嬏jWc;.?Yu.݃ys&up {=2tL!%,|΁zx=?ü{nsRU! c^۵[e~frs-[/&ÕuߛGaR]^NngQ.w@9->z:HGн_{oor-G=Ƶ-îmD:HE:%o4 ˉ0=NμŃ m y^CREp Qz AnzP; ZXw|^"S4sK?bA0'dq xO!B+FvOZ!BP wb3{B4 tZxlx2١|sp7\ keDW2d(2c_<ދ(%[Oe:7Ly"H*myH(7yb씯x|+sTZ Ah&%ZE2ar\{K`hrӊP檔Q>V%O"-j$aaIZ~Yg0HV"@nyz$Z?^ǠcyÅl3 n ) |`D>O{` 7E_ubtP~jّ| ()3$ ӏ ηb2q\#9e}9S=Ya]ꈻ2f)UDS%;l`Gt*"7tGqT-+8N@3и>G Mcv<F=o?^ӻ׈xDi%MEwpi*$/u؜yÜ:/JNdUdc4A,n\*tg!_Fwˊ!c=fhXmy/013"\ `2) N&C"qd-t6Jk Y1z>l"#~~xn:H/ëI~5/SPr(x Cҷ'/B9ӗiBL|u=cw ΠTim5*:}v,*m]R&G.K8 ~#^ #"A6qy565^LP. 
9NR+h^26i/xyz4haژ"Czq2_Skt3?d2S "ʠyFdHEzOiON8r+e9滷?'_$?1ʪN.oZN{mQxr,QԿLT>nmj[^oX%%/Fv9zM$Z=ß$gͥ6 r|/{鼕L~]&ӼH=^2D&s@ ̭wQL91_E-ʷNV%5l3iCN-3MŽ5+jteGs5nl fvzSz'y~ 8ocםovD,_ZtVϴkٝ٘!gaYҷB[ER4A\RBze}?^};絛eٹdȒ+r d,p2Ɇdt).@F}VgJj̊91էW_vITNd%eC^NLo?4ctd>RO=D`sbRЌ:U_=(5\Тm\!m7.͸;$!Zur;Rf3aK1k$-ioܑ^*I5וhzeP+\O d\c70t!-Y$owQ=ޔq)Ũ~(Χή9_& pX|lK+ a2H)u*{&@xnVWzmqU'vZ.lrubӋߖW,96qLeOخ>ZԤVf5ߠF\*DI= UqySL2t:qA+E 0tz`G|W;$$j d`fsM~r ʻM0M0޴;@YۓJ 6=B}%26C4h-XhKD$7r%WRGՓY$PZDp8&XAsNVv?Ċ*Xˇ\F\h``}M_q E,S;E)EJ0n\AhIOA͙ИViC1;_lFP#C/56pN8|"S2аx0PCUAP&/LNx0>Di#߹wK 4x8F kDP=ڞj4wqlP|2yYi'T8FQ5\YRyо) .̭B@P;Qs "fIrULf :D'hU 垷?Ej9m9dg"i+K+2z^?bl\bBmjr Y>5g'I2kHtp,!Gm Ef0SH GQw[蟣T0bQݎv+dr5£淣往7$n&8fwfb륟Ƴq~r VC(BHz &3Y+C  R&j` .W ?Dk5&{NH!_8aS9r']S8>)=ȴEmBunMJ#N2$/J4O`U?r<DzS a젋z4a00[HԸLicVd0 .F$C}O4˒EFQ* @@OCctSA0~CJ‰q\ZA2%+V9I$KN5 k۫PI໴`L<2ܺ !JcsCPF[-(k =XA|MZL+hfN(j*9e ͦEL A?)}:cXLK.+4IR"d֥ۢvO{!PoU/Ő}S#ɸ!lC>W^]_f6SK}qѣpHw!8xۻ'9Un5`:'㋈o>9bYAxs?Y8&>d:]ݜ/cBmy5s;}sJsu78>~"B=|{tqg1q>\Ŵus?5j8Rmșnr:ծ=թ ԛԤ%q^6hh6K΄:Kɣ|5."݀_{ʮwQ[Gc!)`֖=jYV; r]~>4qYֆdFUswKW8^Z[^+o–y aojBS]aa(h ہIw>F>=83דGQ镣[zհt =BH1iZTB)O F:\xJbĚrMr͈E^fj |Flb{D'5Ċ;0, ^ 6ĢAw*")jd.ͱ}*,dTQ}DW?_/i.ήfrÚ.ǧci,JЊP+QDz11|;qPPl\nuġֳم J9<p|p _ - %%O+B{;rx.w+ vb~*]:[9Pbpa8_\דW_Q#.AM@GqW\Z:=J>wR_rho6:F)$Aɝ(08FkjLMIׅcJfgbŗn.=~WK]᫁|'D9aG]K]yAc`)΃ U@a6Ms]K5|Ɨ-yW%>w/:VD7tmn5m,){"ż ·. 3Fb\\OF_yPW5F^:8#0%1XܗܓԼ{bͻPKg2s)hC*p}8;Ť%vU< =)Z4ϱ3p8%I?}ΣS*ό"XI JI΃-cM AĀ-5WM$QxiV'FuM.UZj^ae†Btɀ ! 
攳R|>Q$Z')8,`ܰ@*akuzD`m 1>^\}T/WyIi$H"@X RJg]p#`lsc} kjW9iFUT߀rOH<$hBCE{ 2aQDH1 PU2 (n^a>"w\~Y@֗#Yئr#'PsHRٮAhE}k !A+m*RWVL8xsL5a_څo-Giru1])GbN&'Gkq5>8F'];WX83(o8׏6=.TTmCxNKS> [G:"r cY8& HAJ(#uFTP?]M~$ӬGS&:6ovш-.M'QsTew]^QWN</ rImY[6_WyE.^˫>w TَXtdQtGb[z.wXhu!pwm1qռ$ e|4iw2{}H\ HztW=3۠dA([-old4nt7X|ZMG=(d*8bɯe-+伃Qkl"]˨%W_%+o"lT4\,\R@^6LXv4VՊ6c%*cE̪aȜCTȌڱ"JY0`)jpQɋvp_XT!RrDPh'&LU2רcȎk"NKv)6[H"%W@-@ E93#З\",ekI^},: Bg5ЃF4y(#/uzh½xHɟ$&A$4Ce7PBd bhհwjh)>E|g?N㧴h~s;I_eQO &&s>(G)>x% 9Vi>}=޿s>[_^\p40s1w G6"`%?$3")yprLn$;dä6ku8#N'dxTS)lXR|(UPLLl Nn)䅰?4%wPM]Y)VP!qXMUș(4QYl*-?;fn#ѦYV"沔,ڒȢ7QUb5"j:O #3rͿ.*~xD0,}z_>ӆ&tהW.E}E=|W법bI@IϿ+WY_j:h3Mi-}C?0q) |lzmz\_o,7|σA< Z\`uQ zM98<,Y,***x9E;E娛"zuעYZˮ Vq+HbirIBc>"P3ږLӘl.Z+mg'ٱkI F:ۚVU:_^Jٷ$M/ &)qL <$h\h"<|PPMXkvEq-ɗa J I۰~V#Bp,!Yov^[PEk VP@Dp u^MC qWRy_ ES@n?K k'V.\>oثr8J: 9PXX9+"{•2űr{xQCmk񘵭IaCj[9׶Ƥ$|-z\!sעdz{;kCX4^"ʴ^s:n B0 R2ȏWN+">/g 2a<,!mϊ,P*ǼKhVru1[ tAyn r&N! O,4 R32S'3.3-Z#  ]d!TGXcpwxԓXģ&-1hDŹ3">cLo]'O0?ЌH6-GkU /dRY) 1f/ LĖY58 Jt=SI(]iCi]`+.$7]M*ljWMz&UFk/rCJL)'7B.FsuZ+=iƞI4}Q3/_&u9v1|?\&lAyӍ 4l51ۑF;)i>ж5ICNemEC%J/.ivn#(mkK TRo yąEږ۬W!}K?:I 118]"ƣ?)&\!PМ&m@>2}zss4gqQGzSWPR&Ol&~weޯ5]Y8V"oul`4J`)&z%Kܪ .EV ؖ!1f,1 ]^d`3K$!ykHj >g+:ɍ+I =?_gR}fPy7 C~MBǐZr^+ s6}2q+\O ) 2MIQ#S&egjd G@  W&nrUd^ОE\9v*w55_kqE.m2RKl iHs3Vj^?1cv/B61P+v22C:Do^˞QK؁ZGD^ W޿H b|Ȁ1/=ov'L[OلRG" )Ź 5n=5a`dk)Ѧ%wB:r !^U֢3u/elGRz|zW,Z]@#37D°e$7F+C&vӃQ[YX xd<о2M h'Ù/ֻrLvc{*Dw; lA!VN!n-F eT ģ@ED^K'G] 1 tC 6/f<-8>/F1)<#A56'Fgj^84?3S7QK`;$daEN>&]ľA7i6^{I 8tœEia}ܡƪ%("V4NDCJc46?5EqZzGCloyԦ'}Z*gB-qٛWnŁ1/`BX^=m. y!H:.G<{ iP>[g.Fg@?)TvDlz Ph5*(]1\4T4͕Os# oyVE ٷЛ:N *Ъ7iϷqvBKJqBߖq{zWwKαՋ)t^L()D2ѹpAeh&- RBi ު ZFR:*8/tUGO1 s6caG w Rz&/]RgR'FhLqHR$M (݀+?T]mxEMD<`-TK࿽Gj>ֲ~d/ɤFgyv3&ZwmmKz Ρ2'6q H!Qo5Pf83$0l]5_UץO-SgZ^iJ֝Uo3Xs1%mpikAܗQJ]Z2زư_ Xer-40V qg7e7q/ k*ل80$:%cmHf<88"G  ~a\-9~ X\jgr]1\˱%pՒYϥHJأUlI:3]ioO'D/UoGN{Ҥ -]WƘ)_/J^73;r;f4_wA/ 0\>D+܈g7^A;6)(1i[ch3S/^╀N;x80b"Ӻ#nţ~ϙ+R-eϔ;jطtZ%mu? 
9dd/mH8;ԁB)9 zsX-7 %]kfWg;?K2G=eZSi6ŭT]aGҪB晛buw_'pd.$Uw&K/&~W5,W٫2i!tS.)Ϳ7'Þsؓ9=O[Cp4Q*( qqRqH`As)$Jp?gyK_?_O]\P<*-.'&sE(Nr6X,,>ʤ&{XiX}kɸ^V{onueߏӷ7K~X7pR2!(Q9Qy@tFZƀHE$BVGn3\:|6I@#DS+I'm3%R2%1A3eЁmt1Lc:#J91+c|R>s`t/)}oRlvȔsIJ<$eAT.ۥOdS]6M:Xf4*.8))^se`ZAe6ёYN\2Y00Z̢+e*,XNz@ &jN:bm4ALpRJD24S:9 |c@gB{\Bt>ydqAH\;L߯24~zVeW@O/7C,R<%7? uMAAӓ7aie0VmñfP|( [߂FI髣7GbsT䇼7GWOɝ.>8 AkH+;۠Zi .#:VQpqWAh eWJE&3w:_]j!^V}~;?۽9דw FNމ NVɉ8!,yR\ Yぃ!z!KY.cC0K yMq.N yNϫȴ ^ji*˨Xl>9Kg&$Z}˯K;wek9 ]*Q7>_ȗmMp;E!$"F<2(dEe(jF |DGްp\8ۛyt^Yկ ⟽C{ݔꇞbH~X0h5g0G=hI5PrD&s[߾_7\>j83ISZi'm]Hj¦jcĨb$bL\5:isJٗ[*@R)|D w7%!N)ٮhSj7ho eZW+'W''|-QLlCߐwr M3YFOTĦ>ݴhQø)]̦_꽅zw|#Khws]m;#ؓr{RbOQI(vy&hyƔ[SVԺͣ >RdLGL}'uw|HЍN&FC. Z180eOgzup"ɥ]/׌著k<<ƟgO&~UeH6&+ jg L6si ʸ --DSe;g*3Gʸ{ OvOklm>b0&9w/Vl4'O}Bo+_}^&į5L݁4.!99?|NaŤ;]l]x|~ǓW~GFfvWd/nQLgܺuuֹu7kkQ7:n&~ICFJH„43yʠE%~kO!U Bd~>Em2 [|¸Aoo%0Aة<8V{e:6OnX? a-[O3Ƅ0:BC"5N&8ɂ.f0xј)-x2*xp: =iXi.pDdW(-"1cҜkF_ 𠢌 xf yQ$`==Rjs6eC H\ǰM-l$k% J?R(=GDjc9`C8-@'/ *gYi퉡V/þǾ}0(Q݈f*.kQzm Ҡ"E#HBij^IeH qKWW U/`k *t[dYse_K O֩ bkL!y@|ږNNhh'r֒dDHT1dqdMdepy@P;CtWU%Atv%R}/ULt$m坠Pғsd0cȔb@D+Z vQU+e tgD,mͱS%Td:JjAz]u(J|PfI s-Pϊc\C*o~5Ge .@gm= uZ`AӀ*P @)u¡R^d;W,48W^su-2s2;Vv_;7Jڶz ^GD;4lYx6DZr`z͇_7'KF?Y&3PMizw&%|:TI`9EZA]^Uުgbrʝ;Sh=9KgX8ȯ!-&rq~F1NȊj ծXqӋ .НTRwł.Wt\)@fRäJB0i~YXztCpIN BeJpaKPPqi8ݼUѻ$s)0hYƶ(vPZ̲ (QVR߶AqNX9]1M]2*\Y]UjJYj p^9ܳ6~PZH ̷͆üE_n ښEgЁ%͕{-JEYdW;lfWaf[fEfH`u [mƼRAHCNBs;_R|IJC!"G١\ @=0ciehwK P3w-k*U=UH&$IߠBJՒTuL1`xJ%X(m{ʐOLSc!P ZiA1]nRBoҺz0 `gA^(Z)h#Zm!:H0`܆%@mSA룑3РwI J@LL>_oh5s%x㵲Fq9nH~LzD:P1&->:Ys<FP[X+FIdb=SJj}Ca!Z %@sp51NJp>%6cВԔhCY(Y<PR.l=k]ٞr&yf J&9#J fF3Jo^Fΐ34L)EJ_&lD"g9%DB|mmg9nPmT1LH ˄(RMKբFEh T+mq@QAdJhYnЭjVZ7A+ qK9*ۉU'˥~}\iG~~ (>ŧ7GW3zWM[A{sX4۹Ҝ6漣fn*Sx>56e:IAkFϤ"Ɍ!-D\e@Qpfa0g`jzY?[`EJ`thLN{CL{G7qpi= -vA=[@^ 5oi?{WJ_vgg@}0BݱCgF M6>d{h6' (%GBVVV旵Z@j9^ژY @hsmk01(=z0`BG`䤐3870(; sӾs OCK)|Vh; K-f-F(/Ddv`%HX~` c#-b0o[2«i31M@%#ݎ]7iUj*~ϕ= ft@ݝ#Gĺb+S$iMgQ2Gm4fǚ [-OWQ#`; *ą/"ϑ&i^NZThrk5HHұDCՑY 9E):ҐJp rinSbNa!k09FX%K8Ԋm5X2Q-Jɸ bKJ"R Lmx$dܧy= t` y@)^7!jzCȠ+@0|J 
6MPoVuC=g'HMM6lDh/GE߀*KD&~:ͥT?I! _ 6@`fւMX& y*aiKh|HϽGzM7,Xڢ(L2xf w0G8hZ`hӮh>_LSK̇PO61٣K.ZME8(|;%E5 xy9=%lNIՁj"q`DAT2ͭ EB1ީ+Ԓ`t "vm ً¡6{/E7MS+idćc{Efuxt833o/7|}Rlq3/S8n1_.>;GK+%%{P ru<U܌$\).ůGW3^~3t @?cNDmT T,Dg}үfep/>h;y^g2jVrǛ(cȐSz1.dJѭu\̳a>i˦cb tѫN.qQU'cҕbՐ.^@s޵Oc\Ѯ.%QɼfGZ(9ٌP! .;H0AjI l*4 nܒM'!FӦ{{P][Dy\+oى7;b`:I1gVAqffia+:Ķ^Q5GCTR`m9f'ieo@O"(ckH;K.c-1@!NᠴU8z0ֹ`<[.\yq6+2C#G*$1"<&;a_`</-/FK2tHZinoGcZ,hYfX*|litvǜ@i7[[ JQE:sxύkE0:}t[:䩼[F:ʖ,Fgٔ=>B|V gQm hNY,˝W$Ǥ0jdm V"ZpB>+'5|0{j$0|vთ/K1 &k'wTjME;k@x UU`'J˝@`UKx]f'8/0!Z>5P" 0'r`N [͕SanLHM%礋h3 E9rS+ e+GKp׀"R, px`pe03cK Y[J).<<ϡTS7^ds5cX w?N.$xc`Mۃen۷ n[0+o_\ߚXzO|]3`n_s+WdݭӞBƼ/D}c(wћoaI=?NnTD6U҈/\yt{|Q<Uj?%~5^l6|&Ua O~S%pFaߘjkq*à MA(3-[0a3{h..q"V n6]j\YW[mVZŬQͿͿ ƍBb >/HB] &bA=J];x5Gl>(Zkޘ_ink04м-L<^.ӷS`?yEqK# T5  83!3`\3R>-^?0 ' {Bu7̽TALCjWK|_>\T5F\ !vpO_>7Wʧk Z|];66~.!]D|q2vpBo1Ls}!p>}rcC܌V lXn N\!w ip"_懿w"#Tk{|7!y-gsWqI92F^xڸ|[Bb fu9)BAE¡'ʅ4{a5ED+#X.!zs*9"?eZ?*ۑ uz6e*w/BmK :X`ZX0J#I v-NJ(׾E\W^LJdFkC a 8_76 U(~o..!fJLl'MdJK e1f`NuP0qesS ?{Hn1/fZ`6 8g s4Dzs6 R,Yfwbk[fbXDo .T!ϖdPzZte.V='KOn 4hPK}'Qѫ,,襎^%oѫN-Eϓ/"zi%oESTr9JbuH *a|U@qM`ey`E[WMeTJ1RA6f/>eRĭ`OCTMoCUMlFf$XةjLjb92 k,fPrFӠ0q:]Κ/8p0p"IdjF QNF7 *3^-H@~U"g6Q(Hsz|Sw&C]iX0&˜T0T$ iÎy8I3Ykiӗ[l.XjgW79vN48Tȩ3݉rjG%*T&j r-R[$J&|frA΅BR)=R/`ULo%ſ%S։RYՒ۱SpڷlAW_(\Μt~%O{vQY"_?me I68jn5t0Wq)ғ/h=BYh[ML>0Rڊ9`XQ+@p$򹴞a^(͓uíCutA+ j ?`A{^Fliw5! 
x$rQ^gɨU0ٜ?ڙAvs~ǟ-/$RIҏ,2sF"-cf_ M2᥶D$>F?)@G #W> A -FGry̝٤ f$K2GZlA&E" t%^H`stβF/ Ӆ}8 E18 _-ј=Ν .Qq+OE4CZpf8\OTLݶR?j@hPsVWѡSu5hUE5^__4{+, =-F`ɪ)ЪߪbxZhYwD>xέ4ܦ5BP9&}@ c/_2gy>]-r6uZo7jt>߽?[ޓja5 7Az/mI6~N{zHyf`f7VJ6/XqTVF/1ob/){Շwx0鼧I7C_w 2pO7ۻbmqGU3;Ch[e?mB2"wOd !}7ܽ;GGƻ_>dS9nbCgj|TНj_ݑЂTgoD/ Pڂ2$=RܺhYBEn=V<ڭFi177fAɠ;+pt1Ј yԙF[L5eӚOHQm`(M_BVĢC][t7⟮;J+[9m`ƍ1iML-l:_v;wOM,iF6u^"MBYY{ǎM R{8v$ˡy=l7췱 'u<d.&ѕ7 YI7c0O\44ђBFF7]įVتCģeԊ$B :׷hp{Ƙ2`z΅jb您yO{R&y Ozŵu,-<a\;K KM=T:N5(FՂ(TӄR;'nP^LŒ~7Uvs<̖^7b~e/C_døX]3y ɗ/M L^)gF{q"׬q&%U;&;'Č!QQK?k3q;Nq*S\!0ԧZӝ ( Iݥ|WcC_v[wm?aaj4~_>)zƵ{/]4+itNy,{|j%=)5Xr{$!߸VȔ T5p}A- p-JDv6Zq0oyYo\D˔&C.Z->^N/Ky@t A g&)IU04D=3Ɠ,%nʒO0HL ZwӂB*`}0ors:GKRI!w3Z"L(RϬ(*6'ʩU"x;HZ31r1i@J9G-JtB GOx]pX52^Ze7jSGJHkp+EYH<DQT03% `>e,KY. . 68+<\X.45Zb FH gVDC0k r&רɎxV&gO_&!J"u$YwiHj9(B3 "3sIQ).wB<<8kӶJB (ft%`,m##elEF&9I0**E+:E#" Ak+qJX**(iWdyo &-hqg3ӎ&ںNND|Bch5 TQ$`J $7|!íJ,..>X̼-kf"%tUCY!CI|4o.sɵmBfmb,W?_?}W=}\E?_@F>~||aW_1jܤG3f͂f#音$򷋻~>OwlߙK;_:= E[Fдp|ˇ'a0c՗.VO+ܔXR+2/Ղ:F*Nv%)N4z/NԆ[4j L[=TwT+gaRUX-+D0{QAE-gEeiQIM~d`AxHeҬW⥶Dr.ѡ سcy|)ߪpZp1*H^pj!GJB/轌:*O" X WlV%sdGA֛C_))Pep~ݡ $P2#&dhzHwNH?`T|c$t1E֣0;V0cS,,9LJلJ=x.2{.izRrLP=Ap C|ʴ C ``^^KYH*X@u%*/?mˇl0rҸ ׇ:2F" 1Iyz,gT8]0ކ <] @^$MXuw.BgW59o3 ũ-_hS*:B}aYJi}ܦNaCl^`="zbf'5c촦 Lzzݦ864fbg?TF"PRLHoGer9NF$Ӹ'-뻜(ՊQ9P̠!mY82ꙥ6ΤipL3`zT /dƁ 2$GķܟYę=cstjH `j^D3؁v )# gG&d)н~h@l]H^ݍPN Z9m̶DJ @ZX1%Fu. Tse80A NA̓^DM]7`?\`[@$4[ d܃ 6$(eAf7Һƿ{vN>!|_OWD.-* WJ3IɟsDvt/\"kE#v-ڗ}~u`r@7?gyY?xZT1w޵5q#뿢IU¥qsUT\&md&H)l忟IICCa8m5_;Ui"P=_ݠ;oDh+X Ҭ>Wa/:%)>13]]2Nwb:ǻ3\#x>y.Nț[!_ps?{I7ł8Jl,KBIxɀ@0% D#Vp+\ߤ ~~?;Lj.ޢ=.]~;hBN"^_M rJOb4T*$3"ňJPL I͛RwXsx)&:A0^vѳi'}IwՐ&rSNSzC$L Z2' J$q*0 q:CƖJζج'n4 [&wu uޫ&h W=^5+-)ڑ GӠaY^W8ny2JnJjF$nD^#$3eHKN'"`ӰE&q:q%/RV_<˛j]?~~:)E F# ChPS.6meL{nOXrz,B㼴Q͸t4`x) ђ8,(SŌ$nd Fj#婙,SZ6цVmLr$g*hiVdJ+&jDnۖl&(rB i5ROac[ ]~|3\"\*D51mV|Q7sJ&OV+%Ӣp鉔{\RM|<|%o?! 
gw;]0Z}|){bgR(P Z,z*[!_/N3^[$>k~13b-8F#{XkCd)k5a3SHAA^5bpdo¾@f t"9vk}cfx92C`${k|X4SḾb noY cq7Wg&l8)1 0U^Ѭ7ꚡe蟽ʃnazc2on%jקi`0['ڀ96 D1[r$ѰbzIn..NH=+/;V b3&]gO~f]>=.FGlM4*(FY6Zn{xpen R&k8EvK:%mB͍R^DC*["ڒ (W($po z-^q`e[Da 06-L/)6O ϙ*v%GNy7wRz7hۀ*GG_W,֯5z#6qN@x %]ƖZrʩӚk DHU*2yTQeB C#KBjD3f~!FR{˭NP9gP R+o/5.=ʀ7jԊ;肂i~߽N+P?F_F`@k~:V0:fmH-{I|+{3O ˰NGKϭ1xLrrLymmzyŏEGcP\ދhILNKާ?nZJz}]EWTL&h|C՘rkۦuӒ gmu;:z0ֺ֭ǔ` t\4ds.kx=Kݲ\EERJE;G" rGͮJ 6+j>^RW f=3p?K%_-';>6,5CZ Xc#o\ApD 5$JEV~fgc-uA<2*O?F YIYjs:8+c~,Դq*2Tm$/$\QKIuߚ<1ݗ>NjCU)x WC5ۢ[5M뷺N:2h];-T ?.Zu:x9/țLPsQ/6$)Sc9(ʗ[*j=ĉK0%ArWN=@b9Xu\DIU: U&<ڠ ƍ|4K28Lj\74Dpy$JPLOY/̔UL *4cG+$\^Įs9o?פJexBLLX2qDXpg˨ XD{ -2##c%uk_(atZz#D1{9߬8Nyo2V5–S=>:L1"!`tl^=g~(wM\?`8ҫA&Z>bki%\ҫ]^5(5F Ez ڃ+i8Eŕ@@drϕ WҐJ+T+iթ!K3p 9'>h(B8ZaՔNW\C@VHhDgZIE%ޔ^K4 0 KNX8n,_ArRh# ZbF4XBq$)Ͻ #x-CFSjNQtOI5G~_j6^.D9*%)M>,Q .F8SR4:Z(R4vNhZG3چ_jq*rCݵַN" &&?%/ |%IyYݚ-f2L7*/5h 슀2^JM6`ViAmE61* cViL2t0B!bL;v:%Ԝs`LG4nCjM Jԍ=|Sơ4 Q?r*&tT( ="flJ>FJ>X6| y+.Ey+GBgboGx5H1rwS_|  %IoOY`x,Ap":k 5m<ԀXI#iekryVF#)M45qã/旳"YcՊBыrv4r:# 4dǤxRor4LMQɅ2÷bj^Kvtj*Ӿ܄\0ޓ N>~ tl4,_*0j(vS[ZCNvi/EJW78Y*nul[NaԈ9[N.PRQN$CxϒdJ%UZo3rhg\0`RAkov4p"d%ZhqZ/wPN9OAp BuTgWv-Ti.ȲvwfUQQ%,gMsx/JvX$Pb Ir^8V c]R*A4I!J4ink؅bihd,5Rez`%=>?eb<2]q:˗j;tǗ*dKfiW0l֧99`&LVa}8͎zU?M窯TSF>5r 둗0p _Ճֽ-+*>>ЀPp0lpt[@+„jF*8n{)p ld$q'*[b*x|᜕;^tpc)5kOdT+hbH $mU9S+F3;5n {5Ncnci!]F@1J_m:|)DJ ^Mo6:f[&RPE>rY".LK>C{vkbɵ1 !/}\-A6-~ ҭFI{J>3ꪍRزjK5.m\w(mnCUȸPJ B\r \]ݥKTjܮ]?h8+8gW@v!JohD`͔Ý6DKnfٻ7n%e7cއX8N9>&mk3H; ouҌ Ǩ,`XyJqL!Mm0&T Ri3S !F egTWœ-#ogPJ42 l|ԃ@9$:r0XRψN5 &ɅÙ)XDdxH8 .$!fn3@Rz2nR.@+ZϠ _glAі_oO])-/ }>v!$ӽej:=MnSʥl_ ,^FKδKwpmfJ3Ƌɯ`p](攢2 UUw3-vwՌգͮ)͖bLtR$MzgҲSO'M.J/e.P*fB10B֯9ў Ri~$)Rg҄(/4NnPiLJNWP-6Z.b?T_Fo|n =<۱mb\v߁~݁jQJV%A%?-TKBR(SۺYJ+> S`|Xw}.EuE֗-լ8`şӉ8jcx_ޢMh6am4Uj βW oqbD\l_c'W*On2οgr _p5X9i}STIna !I8 ҩLa'Uo_چ#X!o_ ײ;[ +- ֜u-AbS$:t*3lq`f;&,wwB;\bU { !. 
r<"P)[((_aB0C%VH)TSP6Ռ`F/|_c9{I#ߍ]UJq?W@2^y?Ɵ(N-?fEW.[|tbǘS'ʿ|31)R̷w\Qvh֩2b{ykD+L3VPK,mE:4'~κOa %քl-!vi khN ׺il+Tz o} 1T֩lV|FnJAl6nmTXL1yHjAFh'UJ @* @Czk: NAxW_5*?oU D;O&_Evx^0w: iAdE?Эj+A{{& {6#H3(4.NnEI*'ɴHL̋y] "q^:\x_J=*ynJi|-Lhg9.]hz`:t<>`ݭ;wY.DO#grDk$nCڡŗ[Nn/yG<&'䩅^uy!]h)Mz/G/AoB,(g<=++u1kHrcU #T7kNZql"h3UEj ͭKLJCo?Ń#߼f _U|1ò91&&& E /ڝ$ Fy~fo񮯅K1=Ʒ] (+I^1iR\[ ','(Q9j1Xӫe[lK^dh 3KQ493 e̠Թ"ͨpLEqcҜ)E%j>d21wˊZL/5X_FX|Ng}rwo_+4*l{jOvl꿉qәa$/o|2OFɈ7dAvbQ)aI%TTNJIMg?wүПFC%wScz5Z\gm]'>P<ඕkE~ ].Wc[eRߐF~U2f<8:A_]x>sHJ~҆wWza@cX|;APX^_z pge#}&@R5ݜ3]}u5*~}Up@?v:3gSSݽEѭHy4h5ץ%LI4u2KY#Nw! Ԩ`O؞/[|,7< ۾BD֐"6HLhӞ"CęH rJhkwx9HhL3ao4l=c,!yG c!(͂lsK}N5{ϭX ] zx_۹Їέfw(-Wi[M{9Wt\&GW4Q.׿wu>vǺ>DDZZ?9Ca-PbX@'.5NuXR-x / z8tǂ$CRe} \~޹-ZSbPfs'صW;Ovϵ:%sCS6j6쫅b&Gb"%ÂVg/YoNO뷊G|h N'6AH4.![ǃI5ҚKLQJlJ +IY*LF6psPMf:sgvAiآ.%=%OU0԰B!5N+T"mkE*`7ϝ1-G&cf4m*3rMf5S4B $F Nh]$Z֙#))G(c0.ʌIY2%\l✱-T\0dgUlFV<+~b[g7r3{ g|Kuӣq9|TL߿[>40Y}zUi7A#\d:[cq|WwnE}wV͙H=/QB#c~w/]XRNAwYg3tN5FJ(QMdCpWj`;_W<ޔ.1V''wbĥ7 %W6ћ|N>=YP/eP*xfMAǠ&1sTTqO3<䗤Div(O>-71tS$IjQ \?jPas:NϫM# qrv<$SdFPmbEB`;3%TO?rM|`n`q$y"FS$I/Q43\B^WI-9gP>zh ^6%RChS qN&̬Ԯ? |d{9 Ofbž1%mۭ644TC4}T*4:VN% Œ=G:73js6$\];1FM}(홏sv<"t@͘) +[CU"D݂U2M@=Q/^r$P<ȡ ;IE IJ?K-w lh3FC;y&aB_jf2ڮ;VK0CĬEbjLUz6+5,բ#P*a.AJe|3Nkh5C@Cްh $VCĒ$Y\#$ïAЂ}~r'2u8Yx_j-.o\SnrהܔהrTan2b8V.~04G)#5L:'3)D*)4K)L Pcj;Rͱ@]x!Orn@4k_8KN+8j!UbpQԔ ݰhCcYC2OplF3rIĵؤ $4VGY%S $pmj2(Y#cyjE 5N1m vJd8$:/HFͳYl%D倂L Z !)߈aKk/lY Z1Z-*ո[hŬ1>0)Z)R"xvR`=EOjFUo \5ȓ_?N'RXHsc[0c{Z], Bp4c6-LUox7t7=üYov"֍`quU_*3*z0Ybpc,*;J*s ܿzR`hLJ ɝœ‚ X-HWe ɱJ}ܾ/ӑTS4zB3q.xyOx&ͅ/pmw*O)ʰnKk P)r ~Tf;3OhmǃON]1hOU3EOmϕC"sY3`V]W]LW $W)%DspK}%?ÅaiSsJiZNh8BB0|Gk^f#˭ k1"Gu۳14b&ɝ9X2$l|!_.CQ @1dFw! 
څ8%S?Bm@=\w%o_8$4dm/큼} 1\%QS{l/ pwՎv *=d>g f39au=6ȸ}~|~۾g?dG伒98,τjfY$-ʝJ]sFi;w%Pg˱&shisX*VjbR.MisgB<3!cPӈHVܐ$he]bJ}o?Xw/xpL~g5#` aXhNP"RDx8Q֤8r:iUXVԬ )NNZGs<\\g)Ffk5I3̉k nL rh̏6i;}}E*i\>bȊiP$L0͝q+;sAQj oKaf!l2~ܦNt4s̥8GBW4[$*KM(5mozL{MiM@ L'#b;X3,ώ)6XrM1ȹ"ۨ(+.ŝ^-s&ѿ櫟ܶZRz~?˭6T Ө?1o=WU~n@W,˯_j0|(\y+$w'<ȣ 7cL?ǔd>ɉ?Z[O%BhV*',䙛hMy)FC y7N<[,!:ޭ Sf-z-|0Yr)&N, `SL j)&p\?i$g~'0UjYw"8JP *eHS7 ZyHL=xx='ױ]u. KRA ʹ>8B. c@9hK5-t<ϔBnVs"r # AK`$5[yT>cl:*X=8 -%s$hɘ}mFQ$4@SB,я avɦxk.+({[Y=V ij[ #OX /h /cT]M%`H{ ]tJ"0?oVnÀ;hh[q@!B.A{ vЊ5{4|[r%Ujg{kbѐ00&mEwM]AeMvJ؎qtqUvя%lG|.1 FmkApWw?@SkbvWtݣ{Obfuʥf0εxZ$K4 !C]վO> ِw=|tiw@ӭ#-sVA"It~&"&ݟͶ@TafvͲ/\_e7.3V|IXjlٔ%6.Eͤޭ4X#wRsH %"1ڏ|F-bvUNvG,J,B_=zX BL'u[1թGO׶wOn),䙛hM1'{7jIyA\3jQ:Z<\w!xX BL'u[g$]_]zMMqȉHDdvXko&6l/Fo7FV7_O NPIєX{  X7&c1!w⍩3x"AE2ɏo߶w2Q?FO<H6An#ys0bkW63|2U|$7kZ.o?]}|!{ߣӁGҡ ΃(F5p#Б'TI*a__7$QjxC;TC3Okޗ*RRӡrR X];b4% YS€WxiG1_|)9uF1>*a%ZD$H32I0MP"ҵH=(`]h1 0bf\3 wzw|s|m}gNPsqx^8 ;4Nl9v;u햃Pu=_-b/1؂-ɖ}> xW,FyVZ[^BֻgkeoL#+O=4/׻l /:T%BQ}'%Fe{q'9 Sf_„#@v/'>h߳ %˩F=f%@Q%r:AKSd0cBR14G21W\.5:(˸@[le Z*ӱ1"G9zh5ƎP@=&벢qKv~>o+˟~y.p&<*h1BE6&c"X*QFC_~ZW=߻&ݽ =Y^z%rEqrW,tԖ+7kz\7UmooiMr!q5-obQh{DUZ>"b ]"1L~-zpZ雍˙KsYXꄽ\Φl?3)[Udl??,Nb;q_l7w") y^PH T#I3a9 a,~xK;.Pr}hAzY=+XݘCifK/N Y'V)nL D9@h#*X0 0HI2qZhQ(Z"H2TRg+W($e rQN9dhe[0-F~AS)L>ד`Cc1>WU_:3?sk؉֦Dŝtn!*h0xZT@khA\+ݩ.yڲ*bVΟ]ꮝΟ*MtgQ8 :z{`lHkb1aB॥?Wy9DVPe*#(W\s *(9t3PKN$*rAŽ>;ݡtb9ޑaPt;1B1DЪlgm3foFJ}̴j/X^ + p& "F B*'RI\(te b=H^c,>5`KK; P9)1 j  r\p5sdRhJ6- t%Pd@%ʼn-/&Ss;:j\])ᘘ+.qS|ZmUavb . Db0MVP BrDƵ+ #qM/\Ob+d+wa[2+2m"EX=.)iޭڷ*:+E# {kbȄɡ2B)*MJT>@qn-0@sHrm#i71 <g~0 ? [H <*-|J uOdvh\J 10uВУIicu?vrnFŀHtn\Jc4&Cl&*޳ YK4hͭ AY3~!^օ|Ec%B% B~{v:غ%B\d0H)bavy$ְ{exy[P&;KORޅD BC8%OJ"P@0j؝"J{BDG*lxj@y!EFRŻwSkj k5VS?maNܪVj\ ]kc?4戓i09wJ{3VQ(xЅL^u;'̭bKb= nONOmŦIZBk( ʐWKU3J|bt%_c=DL\| SE-.ӂ+314DAH! %["Ta28ˡFB amz|ۥ{+u!5'QIsuRϓ2RuH۷(0yRh!0@d$^+TŠ*4.{N ,tDKPZCf׈ ; /nK' hn~S3Z fx}kQlz꧲r97Km?,_4;ѝ[#G{PAϷkgJY_Yw\ gETt̼;oaQBYOQ:qm=ŖYy\0&gn6j[F/ͥ r]jdH`n_2K˻[!F>V]C4,|jatYs 'Vu"@j+4h߾mqRzH[vc'l!2Kf-€xہyMK'i׾aD7kZ.o? 
Ic9V:3%PR`# NcZ,QxzMP:@A5{.*^ŧ|kSrB;eLDT:-A=  x@*Hˍ.bk 3787}>~^^oׇw̽'xu`)~a6uy  ەYܪxY,oz<=nBΘbQm1^9UǠ/ASX37("|'Pލwb11gn=*r@s-+лgn6%KzZHdz G,PkN$:WlB[G8J,ZdDș^BOWfsƋʏuo:!EEA(6cAr1&-1AڨxPsPTY]Ll H4 Ʋϰӂ7%<>-PŞ92a6Ngգ/jo6Es$+0*+P'r6fq|~~?&LgsBIZ0 ƈk͑RFccpƋBR$5f*s]}]޹l2WtQu샮FkbY>䧮naN4~xcETb;L!BvJ.4ŎV\w9Isaٛeo▽[&u'B  v#F !!s=AQs='#:Zvܼz33ʍn4_ݔ{W7/w{vX|0H4@/Tma궭Z(Jl /mTzX,UYvs%V_ߞ-\]o6~cmƩͳ^G~yC/r7~_&|kFJǖyv4_qy:} A/{',vu2Il9K=$V*vh'xRT\ u{l8!!2'~ek{H Bt6nOtWNOB$F^NۓR Sw'v1(ǠD{)29 %%AtS*mdKmqB %mFJktrݵ0 ("xJ*TޅoOG<8zz{.ZHgsp H6`+&&ӟbqҽ ~re@ 2h5J&0ӪV  %MI*IDHɂ1Zh 4ˊB^J(h7I'SIa 06Z}k~,@(dIp%uQ $ЀpEuY@H& esHQI)ց@с ,5AL ulY^- t;dcstw>|XohSOZǟ?q}}w&.?63 ,yF}wnjLJ˛/oOaQ>Cw;W/@;x~z1[[dceNJ2reO8zf񰹚 8y+J2H.X7-T0"6ZEe| 2Af&#ŭCJRs bv6+<<̡i>c|4Z'nPtl>r )^{gfƢM0'$,.X }$5S<[;Ԟ}Plpǎ/gV>fٶPv;wrL$g B#7Bfd$?8go$r^" f"(LzQr9s^qrNEK { rr9 45DQMr垫y3VE*a`jݝ}JOvt][BSD}2'COٯe )ZGkNE7mM("dY)iRBJT.*04W&R*+ƐrIhnҡ &Fkr8ЗW7)"xV-rYȅ`,癠KYsU))$dyBѬ! @bZV(˜Ie3Z*&{dhDh{3{FZf8OgomayMh @Tb . }r&@KCG5 Hn,Y D [BzsKC/%3P`Xv䝴{0P@?A0C ✓bfKb?f%3¤g!:C\UsRNۯ\b*#kԦPhg,:=[$Ix(f.j/>y|!5x\I OO67<{|} Pa;mNݵ^<[8%*R*Ė}M֖.6\s͇OSɾ-U,ݑ&dr~0{{svc ~9_0eyV6we Tu9Lw_{l|Kpf8݌ BtC,qCgҾ˭jߥkڲ߳Zs}Q@.Q:тgL6A#x2"T$,͵:{m}|l珵j#JKGA5'Wdӟ0ȉҔ/u)Oxi>TXbCsp Sej326&Z/}jSM#Ly( #Ф5S^gb5S*j<&)Xx9fc:Ũљl]y,F80Vw@ 2=GB3£`Aǹ1#yoר'5wiT 25CF5N橮rbBaq*5Ujl=VeMzfϰ5W@!2[ h;yVś{mNQ0{œ~%{ fg壪Ѱ<(dD[l(NcC Au4ZJ WJe^An?%H`>"t&Z VBcPaC}al s{=+9Ԓ1aͤĹ 3TXh+"krRY. lK jLk6\jsd~aa~uM~ÏV6OE'FJcîԬ|yulpSMryV-o6~g0{$]aqA ޞ/59V \sf]OC_1;~,`A3پ7/o} Wsӆ?d2φiZ3o͑hfzRֳ6%;|ѫ*#Y+7 "vϻqhB1p1gx#(S)'n/ưWnmjxxѥ :bs%G ? &Yuv}Y>S@OӃqW6j?"2;a R,ē3)}嗛k}^%H:w[<-HΡᯰHlA%a$!aJ%P20.F|JߛZ!hY)K'0[Ph ~7ķǭ_[mJ)$ofnVW˿F1{?Ż:nһ!Y7r٩A 5DOu>hDJHWh)?񖴔xu4 \=+[kz5D`U"+*Q%5X*o_,1\d$g0zr!j^+@aJ(D8~eHM HKٔ#"ts"Q\>V`G6C⛺> .r ~蔵&M//|:i1f7HD%+EWxA [(2 Lv܋]\Pwm])0N)55N8*56v+yu}vH3H% ^3[tb>%rJr+T5 2X(Px U8{ՔfBifPUNZ0*8Pi#1ҟLwZ8LdH%jioD-DAlu+N>OKӄqKAFDkWI聈8~]19xƱM豆ur'HǦ SW}X߸>Fp\/##XkkVг5՛g!<8tcs'5:=jt99Zcǔ|.ţL =Is )%])),^p6Ol,F(ñ>ZtXrQ&J$P~-9u6a&/;3@I,#_fR=1{ WRkK"J؋S. 
JȸuS%\NC .AD%*.P/zȋ)IPryH-i^'7M!G ٻF#W|z&>BjB#/v0" g$;oV$'@ÑbѕQUYqJлߴuVP m0#`]yCKťқeӪl}AFG0D/\컷gYg?~7~?'oagνd*(%(2.PtT9&h)演T+&`P,̝}iԹd+ \yaǀ T"j!FS.xmSA25a 8 9WT;a(E+Xi$8q8q3#^3C2V:d =BѨ!恂eQr Ʊ*OBbP4Rj׀˼ˎsF{OhzggVRhƋ[qF$2ywr"K^W ]J5[Fn񧩷 l0 `` Ha)XZ)V_퀮 1bDЁ1 1'2Gz8$Pۊigyg[C5guSg2W_z q,!/[8u66ݪ43g&1B2W8 Tw"ޅ;]$z|PK=xuskCriWvꊪu{u`sy>W/W/i>vwt'|x6M;ѧN _cu)ʄW? a*TOL%ʘMϝcִ;Ӫ=ʼn0n:d$F41V>/5%ҏu̢$4휞g3{ʟeZÙs75"Z.]KM%;2v7ŧ_m#µ"]x\&6Kv- Y'Vkc rrȭv(86jĬxXda or, )67OsFQѿ ?{wfdnBm 1\D_.&qiptS1MeI* haV9} A?gCx-ye&/ ~ q@VTǟS8փ W}vVg+dL2\~F Hrqp8DʹDiNe$P`D#^--thO-qXhWg`@K*%\-nu2s7U&UUy}N;#lb5>y?^勧a$xD?~~Vy`Z a,%ADՍ?!57g`i'قno|]( tY_έξYǹ/a4kTBc՟0 рW'7KPU$\ 1JIQ5^j#-(%RlSh":LQF-/ip ";'Z1FBn!wZc~?j?7ݷGw'qSg B/7=Sq 󟂙mPD@凇Ypܜaٺ@<4{FU:q>x_~~f*g>sZZBW|/HYƣQv_A~~m?bi摼''@Fj䶌Ƴ3җ0}vI+I5\%)Ϝ ?Jï^< w߿kFU맛_nZ u̷㻿̙P]LHC`5o`# RrްLJgN>e Q{QU͏|5ZZ)Y/bqlMjZ"svneiaR}ĸ0a/ӭ$!@a #-Q>e q*;W՜&b|yqa0Xy%7[aG B4PS=E9ƠޗS+a(ܤI m% UJsE Pmvo-XKPd/yCA7tQ&P4c))OUKw2\i+U '7pK㇙M,gPsin W6Ӫ|[5 uXV߼Ck:ތ@ΎK5\|'WmrG0xE'D;J_/WLfTlLXhoh!D_sp&*e;[~1-iIxɚfz_2,l㤙ow,:6IY);(z"x[!}2 bH?ttX6l6lc$Xa}V 9c/d%nHٳ>_}"JKꔰ齆a']XS865Sï3楴~{q6VoU'eb*Z^:{\0yO<_r-cu3x4n'wu& R*+jү.#"QsS@dA?PJgBa|a"H4+T4A`.HGsFbq"@8ꐩ1Um]ə@Lܬ-)լ- [V˘#Lm:sPA`avfeD@ARHM5 &Dn |*`\/{`J}r~V~ӟ\Iɓ 0]A sX9ހOY26?5AL07?wr\yrW't 1Jz F@&a0~?̮@g|//.0×|_| vgv0s'Je7{ VLJiҊ(ƱcUA՛Bҿ/jG1#kE@5fMF穷<9B .R D`Zo^c>'9;kVxA3yA1ƌc7;!ɟrRZNXX17ΐhMPBv\C܆;At/UocNxG;<ʭ@ӵ;<8Rn㦗j8+C2IJ)5 mn4+^B5t{\U;ߒsU 3s 1ʚ.94Gl&|Zb=K_Z7*򏃳2F!#ձqߙG%aKmFF=7#RTdZKPOeOqVVdwI*\~ҁ:Z|R sz-rZjB::^;Y:B쥤~i'!&5bHiu ^B< ,2OT07{OqLQ 4DGU+1PцN搋Ɉn^xMz(QoAJ^INY~1 }2,@Ğ~T^SJ9ά RTE%\߅2?xa60ft a^_P&K񠒏HVTrSw=nHöErp>x%Ar~n=#Y/^-_%Z Mj6XOX$:t U:\jVvIKLrϩQ⡬.E/F:dJ+[\r>S:%jl|k`%7"O9S*tBVVC!l)6BA:]Y&5(x"DcTx%s-!O ZoJuX Pu>¼&~8|"{z@xOHfk>LA܌:ۿnpUB֨秗(SeRNڿ΃zHKGQ1%jP$Pyu+Eh]3k (1W/ 8/eG e?RH&xx*UjaMG8r% Dd)2Xå!#(n(" E^~Jlmz yC $PlFnsp۽܉_NҩO˻d|$hS mdU7';=|nU?dhu3;#9{`.LpYl,%(@ a Lqe56y3K~ |/" Ԝ}%f.,CO3)u.lbp1]"‡9~eo`#Sq>&pu^RF:1CQR k{ 8U_p^#NNYעW(v 5ZF)%gwN!t0!%ŎEb̕^+ДwA (hWJ,s$UqNt LKC 䎵uFe j5D\@9HҜ $e1u8vPIҹE̸&ڥFgs *V imN3Hj0 
gMdLkEb_s#^_P޾eOb[M }jCͶ^jV|DIȧsr&[aP>p]^ #KZCR{ l5_7Cl0__\t6Ñmٱg}2>aW?5Gm,af~;q1lGԲ٣~+ -Vyh΂pJje͝]+絺 Y3𶅒s4;TJ=M.剔9 .OtnM-Χ[Ā+5cIa BIAS? f"RZ(^X`2]U8IwICZ:h$G9Ik ڤ9"NIX.9TJCFmFVQ/h[baDdUMC7cDU,OQ ˅cR3ja XpEZv@9f24uJNI q7)o=y #wP| :P,Zզʬ!mQ8>Ȋ(JP[x} ڗt:Q"Ԏf/BA\Nm rxؚ WKhO]=nxH ܏gi)܈)XԨIj۸Ij=Hj^þlM\%Pen>{vpWc`RH-_4TPlfcmq*{S2n{҆Ws%CJ`%_?WUW Q+Yb_>`VTl5mGƀHSbG?O&g*揣~tOBT 7>[gyqy[)JiÇ:sgnsYDf9qzf:ssv,3Tlj]Tp旚[&&-Z|pR#Z7F^|--]EOFzUNrsѫd7 (!=cڲu(1K'>>a{Nwr%6XyѰ㩳 hhY[`Ѧp =P$Y[=Y'k'r~V\cze]K7F_h}HV#f&B2%'J[yƸYrG<-_D|D=l9g+1FV+rd& 5f+/|s"%v]z@O2]WHCr/>-.v]ުьzC1u]֢PS@çO#0J z$:`fJOhP7u܃ dTL0r:_/W>GDWKY`l7d,a,ܲ{w C%/»\(CS_x~-%[,fE;&עhh$@nFFV7G;_h(ZM\76mD2+ieDnS1X?uߩeTԾ֮Duh]p)!w аIʏM;g& ɡ?XLMk(.Gqs}g7t5z3 k\cr iV|CT7_լȾKG'FqԶ(SFO?WW8Je|e=KtY鳣X-F'e/^N?\'s hoݕ,)gh;9ۏ;[}7V$^7AM3Z*ER=U߲+b{T0{yFr`Ɍ\%x\A )Y[os".z]CX!:q*y_~@5֭VKcr%@ cSfKCFeZhn˰W D͸2LRkUbsݱԦ|kAOy26(?ҐhYbbm?BV*_fl _CH 2ZIѹ,gJAo&n/a \ ƟLyEXR7{. 1cWfgc={~Ursڑ|&ƦD^ Jh;)=k#F^%3ή6`0r[%Abhͽ5Hz$0\-yĚCG Qz85efu 늭5(!My|&M:ф%}ʑFHsNR<=F$2MPL+|*#"y;.]ORHyVk0_Ss(|rZ/ `5cE׋N+Mۭ\Squm C3<ׅ` :ncp]/B90F<Ȥ 5";ʨ,M$Js>eʥL$d.?}Y#O1ZW  Фg׍$]~4[7?Kw25& v EnbU6>QT7cp ^b%7ms%kJm S\ oKP޾)ReAFT9RHDd':&QZ !̼}T歰zڔ#2oM,MLYoF @-kRHXz?7u!I֒l.Z ^@wEMTvѨx}qVR2wZ9Ҡssg$ObRƩDSci=eRMSTkS%=V9qgr X?Ía.nSg3&Ǿ^Dn4-ro5XN3Z{ /Y ,ɅԸIZA&RktfRs4w6 2prJVMo2QhSА7ZҐLjK"'()t"=4NJ4tsTpX=Osc*:,Tbr/]fI\$z0#pA[MiQ=EAyy{wOsIfbJhqڽŭb6[V'fHz`K[.!~J-GoEcLHr}Cu5wMD`6ٻ8W]|#C\lȀ 1 ҈,;IiM6UݤdŪ/@D'uaOkqsk+ms[i'1A`.y$0U{ڧiA 9>^?)2xuκzFC^\1V3+?]{[|S|G?B3i  }1bFjKxAwWO_ϖ3׳e|c4J~!>DeY㬞b"mJ[D֩NI ΋7]$^3ϡݪ eZ2 `>;lW>"Gk&"L%QDP<B;3nZ99EAmtsT-NRRidPpGͦa)Z(%D͠ --snj0蓄*M P'CM8#bZHx9 !iAyք$5%S[N~Cܙz2C>C>szMnӴ]P;u%y I /Iz[DeN{'5tmx^34O~)X|F//fT\H5Z_R\6 nxKΌmUUF!*M7N6|YF9$Fg$ƹ$;LHVٌgt<,H`|mJoOt60hZ!rdCHNUt:zl@ϝ:`;%8%0C͒n5A VJ 9pZW\J:fT1ٹG0D83h[{l8)V<9wGpct][p&w\=}0f;J,bwt y<,&9Ukw,nq;AR̲ub `9`~cI&Lei%l 0;;+;~[sz^g>SKM`1coug H9G|kEf^S4zlFƋxx4? 
OqT$_ŜXҟKgpbXv],zxXrcb<ZV~T ^4n5 :xJ[J'N.?>}j9tOWreySK7x&\\~Yμ<.CTi3oW9i4pֽe&GlQYL2Wq ,uL~>~uWsdfM)ҨZ}w3>o\@m+}eHK,=%Nn5YA3<'#}Z =ub @F-&I>WlUϾu/ely=yأt^7ew|l-u%%c`KAN~q l֬}XՀ5Ň%xw)"Ovӝ\^rD0” w̍sC3֝q(6ZGy $=8$;Ix/Cu!$`?(``>Igxb Hb}6GuD3 ꑥi-L/x^xG3aeQ='n yG.l𴫓d0{%qX<]`nvz8(1 z})G-`clh6\iWl=Z8 o8|L &s&`ZՁ-8! ؼ9K8둷8C y}%Da^ ѾK:I%DK6 cÆ}^-Ͽjx_D9Hmk/gsI:}7w+r׈{݃#zo_>?nZa L-J,}SKfV i< 1ɇH,w8K89Eo}XW' jC4[(pԜ5qbYVԄ’ mBlgElR7VǼˏ  fMՒ+Jnj8CHBR xں]@Tnv "ש}kMwCD[X<YqIpذ Ԫ?%:Xpc:S-ht99&'FSiUZB)#o3eq& ld$6)Ҭ-=p/˙Q]XoTvۋml~r΀o_q";7f.n0-%Ub"g{G>w$zߝ\7t#a+r > @[kqCVѹC# .mIYP1zDuE2,tJ$<XK?,j&GްA#`֎N"zDy)4pMB Ŋ{7A4+UT (0M91RY'~ kY}9\֜LMR1hL7A<_4E_R k '~ w/׮4X^dtᦍYt)Cِ d*f-U+j .fR`RT_k),Ͻz~ZV\=WVƴʦZ?އ.j\ >gz)_]oq%9P X`=7={v=vX,uUP$iʝ(:Cp {ܳA{1clt3ӇB9Ax(--ph[ƥ2QTHFްGun4"PA }XglBс]g>b+~ߵ"s : hE%ً57!uۭquZc\0"f` ?>ԯwQ>~`N /5w.o`;"0}J3.)'uߛx>w2z=qB58)5I"-J7#3AY68kZLm6%Yh`cK!cK>]CF^Z%f F  '؄[RQX̎.Av7[kd)5P焽ME5/t!4ղ}єs.%9CyH]_I~G25ժ+hC5[}ZZB*JB=~p2'V_BN]I &&,ԋ54F%4ȑѨvZpx`CV2TVT&JF]1ٜ#9:{= Ps־a܅UZ?mU냼T1bQ:b[Zkq+,&fsLŷMS]U= ~(\rWd1m 0jPPW0,P55)< *+1 }M=C7L\Mta_!QW:Y]j/4
ԯwQ#}.x}/*
!r&n$rqI	wM|`ڣߠ'4>wpm	@ot.s0ڧEcʚȑ_Qeb782q9b9;8ӒLR7QQDŪ(eP_Hd/thpUn&Y!\H~za"KxG^j7/̫k+FxLxn#ͨ7*mcy^?E޻v+f5m}jYG#BhZdMO
Yb5t3.Gw=Xng?(6a&#
G#{tT,EuDh8xkªlyMׁ]vxfw Pu^m*vf/۔MКMahzb5a5+),oI7mZrN)폍OUʤissssv}=BX]4`Rq SF7RKșԑ#Io*Fz^]A߲=YxxޞO!բ}V;OVMb8{;5:HxKɟS^$rբp'JT$9s_nS/W3g9S=`K#RVhaRZkZ{DE?PG S	l~=")}^@t>{gYp((X'-rޑMY	RĬs#(n>n+^}/r:krUm3zIQ'fR6V}0V0xbFPXޞvrvON[~
+	Ufz\3t=΁_V@A{>vJ3dw6nRr4딜bBYV>]sizu]:pʕn)
xOó$J$*-27+CZuO8_}8;{huW ($%^ϕۏG4YJ2ZZh|^S*{rTrԌiV^J4%%o-NcQM(oӖz99kCay,7npV#^NEL>CH^=``nXWkM= *A1Gt
Ld.nՌ޶5EdwQ3f/QTn
V>
/}
M^E{):/QH	NrEnc`DQ	XG1z\!|-<G%?LB3
Mar 20 15:38:59 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 20 15:38:59 crc restorecon[4687]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:38:59 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 15:39:00 crc restorecon[4687]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 15:39:00 crc restorecon[4687]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Mar 20 15:39:01 crc kubenswrapper[4730]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 20 15:39:01 crc kubenswrapper[4730]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 20 15:39:01 crc kubenswrapper[4730]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 20 15:39:01 crc kubenswrapper[4730]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 20 15:39:01 crc kubenswrapper[4730]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 20 15:39:01 crc kubenswrapper[4730]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.225436    4730 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232730    4730 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232755    4730 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232763    4730 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232770    4730 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232777    4730 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232784    4730 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232789    4730 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232796    4730 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232804    4730 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232811    4730 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232815    4730 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232820    4730 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232825    4730 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232832    4730 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232837    4730 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232841    4730 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232846    4730 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232853    4730 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232859    4730 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232864    4730 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232870    4730 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232875    4730 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232880    4730 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232884    4730 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232890    4730 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232895    4730 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232900    4730 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232904    4730 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232909    4730 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232914    4730 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232919    4730 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232924    4730 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232929    4730 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232934    4730 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232938    4730 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232943    4730 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232948    4730 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232952    4730 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232957    4730 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232965    4730 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232972    4730 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232979    4730 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232984    4730 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232989    4730 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232995    4730 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.232999    4730 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.233004    4730 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.233009    4730 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.233013    4730 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.233020    4730 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.233025    4730 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.233030    4730 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.233035    4730 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.233040    4730 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.233044    4730 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.233049    4730 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.233054    4730 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.233059    4730 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.233064    4730 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.233068    4730 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.233073    4730 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.233077    4730 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.233082    4730 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.233087    4730 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.233091    4730 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.233096    4730 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.233100    4730 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.233107    4730 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.233111    4730 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.233116    4730 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.233120    4730 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236111    4730 flags.go:64] FLAG: --address="0.0.0.0"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236136    4730 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236167    4730 flags.go:64] FLAG: --anonymous-auth="true"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236184    4730 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236191    4730 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236197    4730 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236206    4730 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236213    4730 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236219    4730 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236225    4730 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236232    4730 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236238    4730 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236257    4730 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236262    4730 flags.go:64] FLAG: --cgroup-root=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236267    4730 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236274    4730 flags.go:64] FLAG: --client-ca-file=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236280    4730 flags.go:64] FLAG: --cloud-config=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236286    4730 flags.go:64] FLAG: --cloud-provider=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236291    4730 flags.go:64] FLAG: --cluster-dns="[]"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236299    4730 flags.go:64] FLAG: --cluster-domain=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236304    4730 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236310    4730 flags.go:64] FLAG: --config-dir=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236317    4730 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236323    4730 flags.go:64] FLAG: --container-log-max-files="5"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236329    4730 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236334    4730 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236338    4730 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236343    4730 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236348    4730 flags.go:64] FLAG: --contention-profiling="false"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236352    4730 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236357    4730 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236363    4730 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236367    4730 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236373    4730 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236377    4730 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236381    4730 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236386    4730 flags.go:64] FLAG: --enable-load-reader="false"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236392    4730 flags.go:64] FLAG: --enable-server="true"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236397    4730 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236404    4730 flags.go:64] FLAG: --event-burst="100"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236409    4730 flags.go:64] FLAG: --event-qps="50"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236414    4730 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236419    4730 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236425    4730 flags.go:64] FLAG: --eviction-hard=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236432    4730 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236437    4730 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236442    4730 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236446    4730 flags.go:64] FLAG: --eviction-soft=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236451    4730 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236455    4730 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236459    4730 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236463    4730 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236468    4730 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236472    4730 flags.go:64] FLAG: --fail-swap-on="true"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236476    4730 flags.go:64] FLAG: --feature-gates=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236481    4730 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236486    4730 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236490    4730 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236494    4730 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236499    4730 flags.go:64] FLAG: --healthz-port="10248"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236503    4730 flags.go:64] FLAG: --help="false"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236507    4730 flags.go:64] FLAG: --hostname-override=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236512    4730 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236517    4730 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236521    4730 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236525    4730 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236529    4730 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236533    4730 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236537    4730 flags.go:64] FLAG: --image-service-endpoint=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236542    4730 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236546    4730 flags.go:64] FLAG: --kube-api-burst="100"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236550    4730 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236554    4730 flags.go:64] FLAG: --kube-api-qps="50"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236565    4730 flags.go:64] FLAG: --kube-reserved=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236569    4730 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236573    4730 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236577    4730 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236581    4730 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236585    4730 flags.go:64] FLAG: --lock-file=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236589    4730 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236594    4730 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236597    4730 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236605    4730 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236609    4730 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236614    4730 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236618    4730 flags.go:64] FLAG: --logging-format="text"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236622    4730 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236627    4730 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236631    4730 flags.go:64] FLAG: --manifest-url=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236635    4730 flags.go:64] FLAG: --manifest-url-header=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236641    4730 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236646    4730 flags.go:64] FLAG: --max-open-files="1000000"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236651    4730 flags.go:64] FLAG: --max-pods="110"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236656    4730 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236664    4730 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236668    4730 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236673    4730 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236677    4730 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236682    4730 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236686    4730 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236698    4730 flags.go:64] FLAG: --node-status-max-images="50"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236702    4730 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236706    4730 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236710    4730 flags.go:64] FLAG: --pod-cidr=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236714    4730 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236726    4730 flags.go:64] FLAG: --pod-manifest-path=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236730    4730 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236735    4730 flags.go:64] FLAG: --pods-per-core="0"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236739    4730 flags.go:64] FLAG: --port="10250"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236744    4730 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236750    4730 flags.go:64] FLAG: --provider-id=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236754    4730 flags.go:64] FLAG: --qos-reserved=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236758    4730 flags.go:64] FLAG: --read-only-port="10255"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236763    4730 flags.go:64] FLAG: --register-node="true"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236768    4730 flags.go:64] FLAG: --register-schedulable="true"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236773    4730 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236780    4730 flags.go:64] FLAG: --registry-burst="10"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236784    4730 flags.go:64] FLAG: --registry-qps="5"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236788    4730 flags.go:64] FLAG: --reserved-cpus=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236792    4730 flags.go:64] FLAG: --reserved-memory=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236798    4730 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236802    4730 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236806    4730 flags.go:64] FLAG: --rotate-certificates="false"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236811    4730 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236815    4730 flags.go:64] FLAG: --runonce="false"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236819    4730 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236824    4730 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236829    4730 flags.go:64] FLAG: --seccomp-default="false"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236833    4730 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236837    4730 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236841    4730 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236846    4730 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236850    4730 flags.go:64] FLAG: --storage-driver-password="root"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236854    4730 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236858    4730 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236862    4730 flags.go:64] FLAG: --storage-driver-user="root"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236866    4730 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236870    4730 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236874    4730 flags.go:64] FLAG: --system-cgroups=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236878    4730 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236885    4730 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236890    4730 flags.go:64] FLAG: --tls-cert-file=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236894    4730 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236899    4730 flags.go:64] FLAG: --tls-min-version=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236902    4730 flags.go:64] FLAG: --tls-private-key-file=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236911    4730 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236915    4730 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236919    4730 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236923    4730 flags.go:64] FLAG: --v="2"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236929    4730 flags.go:64] FLAG: --version="false"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236936    4730 flags.go:64] FLAG: --vmodule=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236942    4730 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.236949    4730 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237091    4730 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237097    4730 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237101    4730 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237105    4730 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237109    4730 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237112    4730 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237116    4730 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237121    4730 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237125    4730 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237128    4730 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237132    4730 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237136    4730 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237140    4730 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237144    4730 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237148    4730 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237152    4730 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237156    4730 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237160    4730 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237165    4730 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237170    4730 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237175    4730 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237179    4730 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237184    4730 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237187    4730 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237194    4730 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237198    4730 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237202    4730 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237206    4730 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237211    4730 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237215    4730 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237219    4730 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237223    4730 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237227    4730 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237231    4730 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237236    4730 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237241    4730 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237268    4730 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237272    4730 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237276    4730 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237280    4730 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237283    4730 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237287    4730 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237291    4730 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237294    4730 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237298    4730 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237301    4730 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237305    4730 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237309    4730 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237312    4730 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237316    4730 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237319    4730 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237323    4730 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237327    4730 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237335    4730 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237339    4730 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237342    4730 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237348    4730 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237352    4730 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237356    4730 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237359    4730 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237363    4730 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237367    4730 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237371    4730 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237374    4730 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237380    4730 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237384    4730 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237388    4730 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237392    4730 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237396    4730 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237400    4730 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.237404    4730 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.237410    4730 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.248996    4730 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.249046    4730 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249120    4730 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249129    4730 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249134    4730 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249138    4730 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249143    4730 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249147    4730 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249150    4730 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249154    4730 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249159    4730 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249167    4730 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249171    4730 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249175    4730 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249180    4730 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249184    4730 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249187    4730 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249190    4730 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249194    4730 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249198    4730 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249201    4730 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249205    4730 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249208    4730 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249212    4730 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249215    4730 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249219    4730 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249222    4730 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249226    4730 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249230    4730 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249234    4730 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249240    4730 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249270    4730 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249275    4730 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249281    4730 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249286    4730 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249294    4730 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249300    4730 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249305    4730 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249310    4730 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249317    4730 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249323    4730 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249328    4730 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249333    4730 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249337    4730 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249340    4730 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249344    4730 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249348    4730 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249352    4730 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249355    4730 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249359    4730 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249364    4730 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249369    4730 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249373    4730 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249377    4730 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249381    4730 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249385    4730 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249389    4730 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249392    4730 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249396    4730 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249399    4730 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249403    4730 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249406    4730 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249410    4730 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249413    4730 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249417    4730 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249420    4730 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249424    4730 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249428    4730 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249432    4730 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249435    4730 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249438    4730 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249442    4730 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249448    4730 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.249455    4730 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249605    4730 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249613    4730 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249617    4730 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249621    4730 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249625    4730 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249629    4730 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249632    4730 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249637    4730 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249640    4730 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249644    4730 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249648    4730 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249651    4730 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249655    4730 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249659    4730 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249662    4730 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249666    4730 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249669    4730 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249673    4730 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249676    4730 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249680    4730 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249683    4730 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249686    4730 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249691    4730 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249696    4730 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249701    4730 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249705    4730 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249708    4730 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249712    4730 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249715    4730 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249719    4730 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249722    4730 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249726    4730 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249729    4730 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249733    4730 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249736    4730 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249740    4730 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249743    4730 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249747    4730 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249750    4730 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249754    4730 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249757    4730 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249761    4730 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249764    4730 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249768    4730 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249772    4730 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249776    4730 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249780    4730 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249784    4730 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249789    4730 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249793    4730 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249796    4730 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249800    4730 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249805    4730 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249809    4730 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249814    4730 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249818    4730 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249822    4730 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249826    4730 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249830    4730 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249834    4730 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249838    4730 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249841    4730 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249845    4730 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249849    4730 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249853    4730 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249856    4730 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249860    4730 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249863    4730 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249867    4730 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249871    4730 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.249875    4730 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.249880    4730 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.251142    4730 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 20 15:39:01 crc kubenswrapper[4730]: E0320 15:39:01.255017    4730 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.258048    4730 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.258140    4730 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.260407    4730 server.go:997] "Starting client certificate rotation"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.260440    4730 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.260629    4730 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.296125    4730 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 20 15:39:01 crc kubenswrapper[4730]: E0320 15:39:01.300865    4730 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.302170    4730 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.325005    4730 log.go:25] "Validated CRI v1 runtime API"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.367596    4730 log.go:25] "Validated CRI v1 image API"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.370327    4730 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.381992    4730 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-20-15-33-45-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.382083    4730 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:41 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.423322    4730 manager.go:217] Machine: {Timestamp:2026-03-20 15:39:01.41875618 +0000 UTC m=+0.632127629 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:dfe7d645-fe91-432e-8360-ef4633bfea29 BootID:666d62d4-aa52-41cc-be79-8c9a068e7752 Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:41 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:78:58:e2 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:78:58:e2 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:0e:90:fd Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:c7:54:5a Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:46:f0:23 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:2a:00:b3 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:76:6f:3d:8a:c5:84 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:da:13:98:05:cc:72 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.423716    4730 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.423992    4730 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.426762    4730 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.427051    4730 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.427100    4730 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.427452    4730 topology_manager.go:138] "Creating topology manager with none policy"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.427469    4730 container_manager_linux.go:303] "Creating device plugin manager"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.428174    4730 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.428208    4730 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.428529    4730 state_mem.go:36] "Initialized new in-memory state store"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.428652    4730 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.433818    4730 kubelet.go:418] "Attempting to sync node with API server"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.433867    4730 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.433890    4730 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.433911    4730 kubelet.go:324] "Adding apiserver pod source"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.433929    4730 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.441724    4730 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.448832    4730 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused
Mar 20 15:39:01 crc kubenswrapper[4730]: E0320 15:39:01.450755    4730 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.451912    4730 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.449174    4730 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused
Mar 20 15:39:01 crc kubenswrapper[4730]: E0320 15:39:01.452179    4730 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.455610    4730 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.458336    4730 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.458534    4730 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.458621    4730 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.458674    4730 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.458730    4730 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.458788    4730 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.458861    4730 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.458928    4730 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.458984    4730 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.459046    4730 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.459106    4730 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.459155    4730 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.462725    4730 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.463820    4730 server.go:1280] "Started kubelet"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.464445    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.465134    4730 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.465280    4730 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 20 15:39:01 crc systemd[1]: Started Kubernetes Kubelet.
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.465927    4730 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.468344    4730 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.468418    4730 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 20 15:39:01 crc kubenswrapper[4730]: E0320 15:39:01.469314    4730 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.469469    4730 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.469514    4730 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.469590    4730 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 20 15:39:01 crc kubenswrapper[4730]: E0320 15:39:01.470344    4730 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="200ms"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.470814    4730 server.go:460] "Adding debug handlers to kubelet server"
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.473172    4730 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused
Mar 20 15:39:01 crc kubenswrapper[4730]: E0320 15:39:01.473282    4730 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError"
Mar 20 15:39:01 crc kubenswrapper[4730]: E0320 15:39:01.470241    4730 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.162:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e96d438fabf17  default    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:01.463744279 +0000 UTC m=+0.677115668,LastTimestamp:2026-03-20 15:39:01.463744279 +0000 UTC m=+0.677115668,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.478395    4730 factory.go:55] Registering systemd factory
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.478809    4730 factory.go:221] Registration of the systemd container factory successfully
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.480030    4730 factory.go:153] Registering CRI-O factory
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.480077    4730 factory.go:221] Registration of the crio container factory successfully
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.480169    4730 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.480218    4730 factory.go:103] Registering Raw factory
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.480244    4730 manager.go:1196] Started watching for new ooms in manager
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.481641    4730 manager.go:319] Starting recovery of all containers
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492092    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492163    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492184    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492197    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492211    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492224    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492237    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492271    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492290    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492303    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492317    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492334    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492347    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492362    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492374    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492386    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492403    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492415    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492431    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492454    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492477    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492495    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492514    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492567    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492580    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492612    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492628    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492642    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492679    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492694    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492707    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492723    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492748    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492767    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492783    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492801    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492815    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492831    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492846    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492865    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492891    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492908    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492929    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492946    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492964    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492980    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.492999    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493016    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493031    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493045    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493059    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493073    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493098    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493113    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493129    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493141    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493155    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493171    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493184    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493197    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493210    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493235    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493265    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493278    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493291    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493304    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493316    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493354    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493365    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493382    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493397    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493409    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493421    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493437    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493450    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493464    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493484    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493507    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493526    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.493543    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496008    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496041    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496053    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496071    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496083    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496094    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496105    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496118    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496139    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496154    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496166    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496182    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496193    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496211    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496224    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496237    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496270    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496284    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496298    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496312    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496325    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496339    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496351    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496443    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496470    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496497    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496521    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496545    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496557    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496573    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496587    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496601    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496612    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496625    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496644    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496657    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.496670    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.498803    4730 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.498861    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.498881    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.498894    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.498908    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.498921    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.498937    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.498950    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.498969    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.498981    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.498994    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499008    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499019    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499033    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499045    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499057    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499067    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499078    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499092    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499103    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499115    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499133    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499146    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499161    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499174    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499186    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499239    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499269    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499280    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499292    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499306    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499323    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499333    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499345    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499357    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499369    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499385    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499396    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499411    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499426    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499439    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499455    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499482    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499494    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499540    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499554    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499589    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499601    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499613    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499627    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499639    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499651    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499662    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499695    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499781    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499797    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499808    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499821    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499832    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499844    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499878    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499890    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499923    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499933    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499974    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.499988    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.500001    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.500012    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.500024    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.500034    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.500177    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.500188    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.500201    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.500212    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.500223    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.500234    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.500271    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.500283    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.500319    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.500330    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.500346    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.500360    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.500374    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.500388    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.500399    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.500415    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.500429    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.500441    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.500454    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.500467    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.500479    4730 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.500490    4730 reconstruct.go:97] "Volume reconstruction finished"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.500500    4730 reconciler.go:26] "Reconciler: start to sync state"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.508794    4730 manager.go:324] Recovery completed
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.521830    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.523859    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.524098    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.524111    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.526517    4730 cpu_manager.go:225] "Starting CPU manager" policy="none"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.526549    4730 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.526576    4730 state_mem.go:36] "Initialized new in-memory state store"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.529073    4730 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.531749    4730 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.531793    4730 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.531831    4730 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 20 15:39:01 crc kubenswrapper[4730]: E0320 15:39:01.531989    4730 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 20 15:39:01 crc kubenswrapper[4730]: W0320 15:39:01.533974    4730 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused
Mar 20 15:39:01 crc kubenswrapper[4730]: E0320 15:39:01.534062    4730 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.545233    4730 policy_none.go:49] "None policy: Start"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.546616    4730 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.546654    4730 state_mem.go:35] "Initializing new in-memory state store"
Mar 20 15:39:01 crc kubenswrapper[4730]: E0320 15:39:01.569871    4730 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.599416    4730 manager.go:334] "Starting Device Plugin manager"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.599759    4730 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.600008    4730 server.go:79] "Starting device plugin registration server"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.601327    4730 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.601376    4730 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.601847    4730 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.601994    4730 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.602011    4730 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 20 15:39:01 crc kubenswrapper[4730]: E0320 15:39:01.608984    4730 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.632366    4730 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"]
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.632564    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.635111    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.635162    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.635173    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.635379    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.636060    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.636127    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.636868    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.636923    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.636940    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.637113    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.637152    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.637179    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.637199    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.637296    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.637329    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.640697    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.640734    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.640747    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.641536    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.641660    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.641746    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.642673    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.643030    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.643090    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.643570    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.643599    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.643610    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.643753    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.643887    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.643902    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.643967    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.643935    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.644031    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.644344    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.644374    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.644387    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.644528    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.644557    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.644657    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.644679    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.644689    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.645155    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.645182    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.645192    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:01 crc kubenswrapper[4730]: E0320 15:39:01.671196    4730 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="400ms"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.701551    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.702857    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.702907    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.702922    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.702958    4730 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.703358    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.703404    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.703434    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.703464    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.703497    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.703603    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: E0320 15:39:01.703588    4730 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.162:6443: connect: connection refused" node="crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.703646    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.703717    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.703738    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.703757    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.703777    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.703795    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.703812    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.703830    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.703848    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.805474    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.805573    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.805620    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.805656    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.805666    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.805754    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.805782    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.805760    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.805818    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.805844    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.805813    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.805903    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.805920    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.805858    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.805888    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.805868    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.806075    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.806107    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.806138    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.806205    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.806212    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.806231    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.806235    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.806272    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.806349    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.806352    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.806308    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.806327    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.806417    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.806505    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.904470    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.905872    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.905923    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.905935    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.905960    4730 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: E0320 15:39:01.906463    4730 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.162:6443: connect: connection refused" node="crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.968583    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.989209    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 15:39:01 crc kubenswrapper[4730]: I0320 15:39:01.993909    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 20 15:39:02 crc kubenswrapper[4730]: I0320 15:39:02.013886    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:39:02 crc kubenswrapper[4730]: I0320 15:39:02.018196    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 15:39:02 crc kubenswrapper[4730]: W0320 15:39:02.024528    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-172e91192de34734b9c2e8902af3d4b109d0533e261f0b7ed31be05be88e4e78 WatchSource:0}: Error finding container 172e91192de34734b9c2e8902af3d4b109d0533e261f0b7ed31be05be88e4e78: Status 404 returned error can't find the container with id 172e91192de34734b9c2e8902af3d4b109d0533e261f0b7ed31be05be88e4e78
Mar 20 15:39:02 crc kubenswrapper[4730]: W0320 15:39:02.034655    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-bd426064ccccf15a8df4852a4b40519cf64976fc7ebe6bc291d251d0a7197c4d WatchSource:0}: Error finding container bd426064ccccf15a8df4852a4b40519cf64976fc7ebe6bc291d251d0a7197c4d: Status 404 returned error can't find the container with id bd426064ccccf15a8df4852a4b40519cf64976fc7ebe6bc291d251d0a7197c4d
Mar 20 15:39:02 crc kubenswrapper[4730]: W0320 15:39:02.042031    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-3ed0cd305128c289c3df06620362422df1b61abcaa253a3b7f2c1127abf63e06 WatchSource:0}: Error finding container 3ed0cd305128c289c3df06620362422df1b61abcaa253a3b7f2c1127abf63e06: Status 404 returned error can't find the container with id 3ed0cd305128c289c3df06620362422df1b61abcaa253a3b7f2c1127abf63e06
Mar 20 15:39:02 crc kubenswrapper[4730]: W0320 15:39:02.045585    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-77c3d7fd2c641d481cfce2566ad31514c958b204f1bd837dc5ec076235f799f6 WatchSource:0}: Error finding container 77c3d7fd2c641d481cfce2566ad31514c958b204f1bd837dc5ec076235f799f6: Status 404 returned error can't find the container with id 77c3d7fd2c641d481cfce2566ad31514c958b204f1bd837dc5ec076235f799f6
Mar 20 15:39:02 crc kubenswrapper[4730]: E0320 15:39:02.072780    4730 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="800ms"
Mar 20 15:39:02 crc kubenswrapper[4730]: I0320 15:39:02.306611    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:02 crc kubenswrapper[4730]: I0320 15:39:02.308864    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:02 crc kubenswrapper[4730]: I0320 15:39:02.308927    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:02 crc kubenswrapper[4730]: I0320 15:39:02.308952    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:02 crc kubenswrapper[4730]: I0320 15:39:02.309016    4730 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 15:39:02 crc kubenswrapper[4730]: E0320 15:39:02.309576    4730 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.162:6443: connect: connection refused" node="crc"
Mar 20 15:39:02 crc kubenswrapper[4730]: I0320 15:39:02.465786    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused
Mar 20 15:39:02 crc kubenswrapper[4730]: W0320 15:39:02.508498    4730 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused
Mar 20 15:39:02 crc kubenswrapper[4730]: E0320 15:39:02.508572    4730 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError"
Mar 20 15:39:02 crc kubenswrapper[4730]: W0320 15:39:02.526361    4730 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused
Mar 20 15:39:02 crc kubenswrapper[4730]: E0320 15:39:02.526401    4730 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError"
Mar 20 15:39:02 crc kubenswrapper[4730]: I0320 15:39:02.537764    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bd426064ccccf15a8df4852a4b40519cf64976fc7ebe6bc291d251d0a7197c4d"}
Mar 20 15:39:02 crc kubenswrapper[4730]: I0320 15:39:02.538858    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"172e91192de34734b9c2e8902af3d4b109d0533e261f0b7ed31be05be88e4e78"}
Mar 20 15:39:02 crc kubenswrapper[4730]: I0320 15:39:02.539856    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"77c3d7fd2c641d481cfce2566ad31514c958b204f1bd837dc5ec076235f799f6"}
Mar 20 15:39:02 crc kubenswrapper[4730]: I0320 15:39:02.540653    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3ed0cd305128c289c3df06620362422df1b61abcaa253a3b7f2c1127abf63e06"}
Mar 20 15:39:02 crc kubenswrapper[4730]: I0320 15:39:02.541377    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"fd13fe13babdf950576c8680a07e2b41551f544e08704402c8c31a760d4d230c"}
Mar 20 15:39:02 crc kubenswrapper[4730]: W0320 15:39:02.575386    4730 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused
Mar 20 15:39:02 crc kubenswrapper[4730]: E0320 15:39:02.575500    4730 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError"
Mar 20 15:39:02 crc kubenswrapper[4730]: E0320 15:39:02.873830    4730 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="1.6s"
Mar 20 15:39:03 crc kubenswrapper[4730]: W0320 15:39:03.070515    4730 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused
Mar 20 15:39:03 crc kubenswrapper[4730]: E0320 15:39:03.070617    4730 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError"
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.110001    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.111178    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.111222    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.111236    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.111284    4730 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 15:39:03 crc kubenswrapper[4730]: E0320 15:39:03.111822    4730 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.162:6443: connect: connection refused" node="crc"
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.465160    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.469242    4730 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 15:39:03 crc kubenswrapper[4730]: E0320 15:39:03.470065    4730 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError"
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.548691    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.548681    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"899dbd6715433cfe5141851019e164daea952552c26706648245fd6319168685"}
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.548822    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"99093fe46696a888b221d24d1b42226d0ff16bab6b3fb2a718d055cf97066a69"}
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.548846    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8b34522460ebd4556ce4291e5c5132788387cf45b0be3b9535af9262948b71ac"}
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.548857    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1aee2dcf43ecf6df4a1615aa6e468921053ccb529d3c6dbc2c2ad641e264e606"}
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.549873    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.549957    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.549985    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.550977    4730 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e" exitCode=0
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.551093    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e"}
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.551211    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.552422    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.552454    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.552464    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.553225    4730 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f" exitCode=0
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.553316    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.553317    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f"}
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.553972    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.554138    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.554182    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.554200    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.554706    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.554752    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.554766    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.556204    4730 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2" exitCode=0
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.556229    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2"}
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.556299    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.557132    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.557195    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.557223    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.558542    4730 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="2e8cd87f56c4a70c698163de2d3f364420981943d389a3cc9b64401bb5fbf08e" exitCode=0
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.558577    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"2e8cd87f56c4a70c698163de2d3f364420981943d389a3cc9b64401bb5fbf08e"}
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.558651    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.560224    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.560259    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:03 crc kubenswrapper[4730]: I0320 15:39:03.560268    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:04 crc kubenswrapper[4730]: W0320 15:39:04.394077    4730 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused
Mar 20 15:39:04 crc kubenswrapper[4730]: E0320 15:39:04.394191    4730 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError"
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.424788    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.466446    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused
Mar 20 15:39:04 crc kubenswrapper[4730]: E0320 15:39:04.475437    4730 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="3.2s"
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.565325    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf"}
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.565371    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b"}
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.565385    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007"}
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.565395    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4"}
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.568161    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.568146    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3"}
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.568832    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.568863    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.568873    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.570023    4730 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c" exitCode=0
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.570097    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c"}
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.570113    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.571113    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.571152    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.571166    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.573550    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.573642    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.573533    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f7f50a49e995c2647a19bd3dedd3ca85f1d7d0279df106c153af39641af9ea83"}
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.573747    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"badcec1b25a9d088fe7e563366ee7568adcabfe9c29a536db19fe3119b10f229"}
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.573769    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9a2006b33d30cf2fd57843f3df0fb087253dd116f48a4d807c31260ce7508b9e"}
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.574524    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.574555    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.574568    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.574857    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.574891    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.574900    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.712135    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.713516    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.713576    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.713592    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:04 crc kubenswrapper[4730]: I0320 15:39:04.713629    4730 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 15:39:04 crc kubenswrapper[4730]: E0320 15:39:04.714421    4730 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.162:6443: connect: connection refused" node="crc"
Mar 20 15:39:04 crc kubenswrapper[4730]: W0320 15:39:04.920397    4730 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.162:6443: connect: connection refused
Mar 20 15:39:04 crc kubenswrapper[4730]: E0320 15:39:04.920483    4730 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.162:6443: connect: connection refused" logger="UnhandledError"
Mar 20 15:39:05 crc kubenswrapper[4730]: E0320 15:39:05.295409    4730 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.162:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e96d438fabf17  default    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:01.463744279 +0000 UTC m=+0.677115668,LastTimestamp:2026-03-20 15:39:01.463744279 +0000 UTC m=+0.677115668,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:05 crc kubenswrapper[4730]: I0320 15:39:05.591126    4730 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506" exitCode=0
Mar 20 15:39:05 crc kubenswrapper[4730]: I0320 15:39:05.591205    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506"}
Mar 20 15:39:05 crc kubenswrapper[4730]: I0320 15:39:05.591318    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:05 crc kubenswrapper[4730]: I0320 15:39:05.592889    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:05 crc kubenswrapper[4730]: I0320 15:39:05.592929    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:05 crc kubenswrapper[4730]: I0320 15:39:05.592940    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:05 crc kubenswrapper[4730]: I0320 15:39:05.595704    4730 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 15:39:05 crc kubenswrapper[4730]: I0320 15:39:05.595746    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:05 crc kubenswrapper[4730]: I0320 15:39:05.596237    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:05 crc kubenswrapper[4730]: I0320 15:39:05.596587    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"de82f104aff72ff62f6c5387f4f4d337127c7abc347bf4bf1df18031dc2cf509"}
Mar 20 15:39:05 crc kubenswrapper[4730]: I0320 15:39:05.596693    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:05 crc kubenswrapper[4730]: I0320 15:39:05.597042    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:05 crc kubenswrapper[4730]: I0320 15:39:05.597841    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:05 crc kubenswrapper[4730]: I0320 15:39:05.597869    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:05 crc kubenswrapper[4730]: I0320 15:39:05.597896    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:05 crc kubenswrapper[4730]: I0320 15:39:05.598358    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:05 crc kubenswrapper[4730]: I0320 15:39:05.598378    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:05 crc kubenswrapper[4730]: I0320 15:39:05.598386    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:05 crc kubenswrapper[4730]: I0320 15:39:05.598770    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:05 crc kubenswrapper[4730]: I0320 15:39:05.598790    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:05 crc kubenswrapper[4730]: I0320 15:39:05.598799    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:05 crc kubenswrapper[4730]: I0320 15:39:05.598998    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:05 crc kubenswrapper[4730]: I0320 15:39:05.599057    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:05 crc kubenswrapper[4730]: I0320 15:39:05.599079    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:06 crc kubenswrapper[4730]: I0320 15:39:06.019443    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 15:39:06 crc kubenswrapper[4730]: I0320 15:39:06.602410    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:06 crc kubenswrapper[4730]: I0320 15:39:06.602533    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2d029581dfd248d1298c93e18a5352686b5e89fb730ede58a29400c62deb2bc3"}
Mar 20 15:39:06 crc kubenswrapper[4730]: I0320 15:39:06.602594    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"631f7e03c5476d53c6d6a3a3bf6a1d45ac40443386f072fc7c01129e9eaf7619"}
Mar 20 15:39:06 crc kubenswrapper[4730]: I0320 15:39:06.602623    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dfacdb41d11bf0b03bede308c267faddfd0241d99a4cbda2603c6539bc3a433a"}
Mar 20 15:39:06 crc kubenswrapper[4730]: I0320 15:39:06.602673    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:06 crc kubenswrapper[4730]: I0320 15:39:06.602880    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:39:06 crc kubenswrapper[4730]: I0320 15:39:06.603785    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:06 crc kubenswrapper[4730]: I0320 15:39:06.603815    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:06 crc kubenswrapper[4730]: I0320 15:39:06.603825    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:06 crc kubenswrapper[4730]: I0320 15:39:06.603903    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:06 crc kubenswrapper[4730]: I0320 15:39:06.603968    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:06 crc kubenswrapper[4730]: I0320 15:39:06.603986    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:07 crc kubenswrapper[4730]: I0320 15:39:07.290518    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 15:39:07 crc kubenswrapper[4730]: I0320 15:39:07.290684    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:07 crc kubenswrapper[4730]: I0320 15:39:07.291999    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:07 crc kubenswrapper[4730]: I0320 15:39:07.292034    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:07 crc kubenswrapper[4730]: I0320 15:39:07.292047    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:07 crc kubenswrapper[4730]: I0320 15:39:07.425358    4730 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 15:39:07 crc kubenswrapper[4730]: I0320 15:39:07.425457    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 20 15:39:07 crc kubenswrapper[4730]: I0320 15:39:07.481496    4730 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 15:39:07 crc kubenswrapper[4730]: I0320 15:39:07.610814    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:07 crc kubenswrapper[4730]: I0320 15:39:07.611568    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:07 crc kubenswrapper[4730]: I0320 15:39:07.612041    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"77a81d62ce9d72579cc4dd4778d22cf442919e08c7d83209fde12edc8a338684"}
Mar 20 15:39:07 crc kubenswrapper[4730]: I0320 15:39:07.612100    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5b365c72e5e05c29295ba289a5bf795353c5221607dadd2ab8b4f642272d99fa"}
Mar 20 15:39:07 crc kubenswrapper[4730]: I0320 15:39:07.612609    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:07 crc kubenswrapper[4730]: I0320 15:39:07.612640    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:07 crc kubenswrapper[4730]: I0320 15:39:07.612651    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:07 crc kubenswrapper[4730]: I0320 15:39:07.612616    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:07 crc kubenswrapper[4730]: I0320 15:39:07.612698    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:07 crc kubenswrapper[4730]: I0320 15:39:07.612724    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:07 crc kubenswrapper[4730]: I0320 15:39:07.915071    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:07 crc kubenswrapper[4730]: I0320 15:39:07.921211    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:07 crc kubenswrapper[4730]: I0320 15:39:07.921274    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:07 crc kubenswrapper[4730]: I0320 15:39:07.921284    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:07 crc kubenswrapper[4730]: I0320 15:39:07.921309    4730 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 15:39:08 crc kubenswrapper[4730]: I0320 15:39:08.613907    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:08 crc kubenswrapper[4730]: I0320 15:39:08.614663    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:08 crc kubenswrapper[4730]: I0320 15:39:08.614697    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:08 crc kubenswrapper[4730]: I0320 15:39:08.614711    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:09 crc kubenswrapper[4730]: I0320 15:39:09.241891    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:39:09 crc kubenswrapper[4730]: I0320 15:39:09.242116    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:09 crc kubenswrapper[4730]: I0320 15:39:09.243203    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:09 crc kubenswrapper[4730]: I0320 15:39:09.243293    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:09 crc kubenswrapper[4730]: I0320 15:39:09.243324    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:10 crc kubenswrapper[4730]: I0320 15:39:10.343954    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Mar 20 15:39:10 crc kubenswrapper[4730]: I0320 15:39:10.344322    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:10 crc kubenswrapper[4730]: I0320 15:39:10.345732    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:10 crc kubenswrapper[4730]: I0320 15:39:10.345776    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:10 crc kubenswrapper[4730]: I0320 15:39:10.345788    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:10 crc kubenswrapper[4730]: I0320 15:39:10.380777    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:39:10 crc kubenswrapper[4730]: I0320 15:39:10.380984    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:10 crc kubenswrapper[4730]: I0320 15:39:10.382416    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:10 crc kubenswrapper[4730]: I0320 15:39:10.382456    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:10 crc kubenswrapper[4730]: I0320 15:39:10.382465    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:10 crc kubenswrapper[4730]: I0320 15:39:10.652183    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 15:39:10 crc kubenswrapper[4730]: I0320 15:39:10.652409    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:10 crc kubenswrapper[4730]: I0320 15:39:10.653480    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:10 crc kubenswrapper[4730]: I0320 15:39:10.653716    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:10 crc kubenswrapper[4730]: I0320 15:39:10.653750    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:11 crc kubenswrapper[4730]: E0320 15:39:11.609084    4730 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 15:39:12 crc kubenswrapper[4730]: I0320 15:39:12.119094    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 15:39:12 crc kubenswrapper[4730]: I0320 15:39:12.119313    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:12 crc kubenswrapper[4730]: I0320 15:39:12.120558    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:12 crc kubenswrapper[4730]: I0320 15:39:12.120602    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:12 crc kubenswrapper[4730]: I0320 15:39:12.120612    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:12 crc kubenswrapper[4730]: I0320 15:39:12.123331    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 15:39:12 crc kubenswrapper[4730]: I0320 15:39:12.252825    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Mar 20 15:39:12 crc kubenswrapper[4730]: I0320 15:39:12.253011    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:12 crc kubenswrapper[4730]: I0320 15:39:12.254044    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:12 crc kubenswrapper[4730]: I0320 15:39:12.254072    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:12 crc kubenswrapper[4730]: I0320 15:39:12.254081    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:12 crc kubenswrapper[4730]: I0320 15:39:12.623959    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:12 crc kubenswrapper[4730]: I0320 15:39:12.624987    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:12 crc kubenswrapper[4730]: I0320 15:39:12.625020    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:12 crc kubenswrapper[4730]: I0320 15:39:12.625033    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:12 crc kubenswrapper[4730]: I0320 15:39:12.628606    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 15:39:13 crc kubenswrapper[4730]: I0320 15:39:13.625699    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:13 crc kubenswrapper[4730]: I0320 15:39:13.626453    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:13 crc kubenswrapper[4730]: I0320 15:39:13.626490    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:13 crc kubenswrapper[4730]: I0320 15:39:13.626502    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:15 crc kubenswrapper[4730]: I0320 15:39:15.466413    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Mar 20 15:39:15 crc kubenswrapper[4730]: W0320 15:39:15.626639    4730 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 20 15:39:15 crc kubenswrapper[4730]: I0320 15:39:15.626751    4730 trace.go:236] Trace[2045556304]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 15:39:05.624) (total time: 10002ms):
Mar 20 15:39:15 crc kubenswrapper[4730]: Trace[2045556304]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (15:39:15.626)
Mar 20 15:39:15 crc kubenswrapper[4730]: Trace[2045556304]: [10.002049706s] [10.002049706s] END
Mar 20 15:39:15 crc kubenswrapper[4730]: E0320 15:39:15.626777    4730 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 20 15:39:16 crc kubenswrapper[4730]: W0320 15:39:16.187828    4730 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 20 15:39:16 crc kubenswrapper[4730]: I0320 15:39:16.187973    4730 trace.go:236] Trace[2123123142]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 15:39:06.186) (total time: 10001ms):
Mar 20 15:39:16 crc kubenswrapper[4730]: Trace[2123123142]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (15:39:16.187)
Mar 20 15:39:16 crc kubenswrapper[4730]: Trace[2123123142]: [10.001919921s] [10.001919921s] END
Mar 20 15:39:16 crc kubenswrapper[4730]: E0320 15:39:16.188032    4730 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 20 15:39:16 crc kubenswrapper[4730]: I0320 15:39:16.633802    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 20 15:39:16 crc kubenswrapper[4730]: I0320 15:39:16.637164    4730 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="de82f104aff72ff62f6c5387f4f4d337127c7abc347bf4bf1df18031dc2cf509" exitCode=255
Mar 20 15:39:16 crc kubenswrapper[4730]: I0320 15:39:16.637241    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"de82f104aff72ff62f6c5387f4f4d337127c7abc347bf4bf1df18031dc2cf509"}
Mar 20 15:39:16 crc kubenswrapper[4730]: I0320 15:39:16.637593    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:16 crc kubenswrapper[4730]: I0320 15:39:16.639463    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:16 crc kubenswrapper[4730]: I0320 15:39:16.639513    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:16 crc kubenswrapper[4730]: I0320 15:39:16.639529    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:16 crc kubenswrapper[4730]: I0320 15:39:16.640458    4730 scope.go:117] "RemoveContainer" containerID="de82f104aff72ff62f6c5387f4f4d337127c7abc347bf4bf1df18031dc2cf509"
Mar 20 15:39:17 crc kubenswrapper[4730]: E0320 15:39:17.008822    4730 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:17Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 15:39:17 crc kubenswrapper[4730]: W0320 15:39:17.010345    4730 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:17Z is after 2026-02-23T05:33:13Z
Mar 20 15:39:17 crc kubenswrapper[4730]: E0320 15:39:17.010409    4730 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:17Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 15:39:17 crc kubenswrapper[4730]: E0320 15:39:17.010749    4730 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:17Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 20 15:39:17 crc kubenswrapper[4730]: I0320 15:39:17.010957    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:17Z is after 2026-02-23T05:33:13Z
Mar 20 15:39:17 crc kubenswrapper[4730]: E0320 15:39:17.012969    4730 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:17Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e96d438fabf17  default    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:01.463744279 +0000 UTC m=+0.677115668,LastTimestamp:2026-03-20 15:39:01.463744279 +0000 UTC m=+0.677115668,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:17 crc kubenswrapper[4730]: E0320 15:39:17.013155    4730 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:17Z is after 2026-02-23T05:33:13Z" interval="6.4s"
Mar 20 15:39:17 crc kubenswrapper[4730]: I0320 15:39:17.013944    4730 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 20 15:39:17 crc kubenswrapper[4730]: I0320 15:39:17.013989    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 20 15:39:17 crc kubenswrapper[4730]: W0320 15:39:17.016406    4730 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:17Z is after 2026-02-23T05:33:13Z
Mar 20 15:39:17 crc kubenswrapper[4730]: E0320 15:39:17.016470    4730 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:17Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 15:39:17 crc kubenswrapper[4730]: I0320 15:39:17.018008    4730 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403}
Mar 20 15:39:17 crc kubenswrapper[4730]: I0320 15:39:17.018073    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 20 15:39:17 crc kubenswrapper[4730]: I0320 15:39:17.426897    4730 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 15:39:17 crc kubenswrapper[4730]: I0320 15:39:17.427033    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 15:39:17 crc kubenswrapper[4730]: I0320 15:39:17.469898    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:17Z is after 2026-02-23T05:33:13Z
Mar 20 15:39:17 crc kubenswrapper[4730]: I0320 15:39:17.641744    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 20 15:39:17 crc kubenswrapper[4730]: I0320 15:39:17.643842    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"23136ed264fc1e174d9978df43b71e3437a1217258a6e710b82d8c27d1478149"}
Mar 20 15:39:17 crc kubenswrapper[4730]: I0320 15:39:17.643986    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:17 crc kubenswrapper[4730]: I0320 15:39:17.644994    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:17 crc kubenswrapper[4730]: I0320 15:39:17.645025    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:17 crc kubenswrapper[4730]: I0320 15:39:17.645038    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:18 crc kubenswrapper[4730]: I0320 15:39:18.469295    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:18Z is after 2026-02-23T05:33:13Z
Mar 20 15:39:18 crc kubenswrapper[4730]: I0320 15:39:18.648477    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 20 15:39:18 crc kubenswrapper[4730]: I0320 15:39:18.648897    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 20 15:39:18 crc kubenswrapper[4730]: I0320 15:39:18.650472    4730 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="23136ed264fc1e174d9978df43b71e3437a1217258a6e710b82d8c27d1478149" exitCode=255
Mar 20 15:39:18 crc kubenswrapper[4730]: I0320 15:39:18.650539    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"23136ed264fc1e174d9978df43b71e3437a1217258a6e710b82d8c27d1478149"}
Mar 20 15:39:18 crc kubenswrapper[4730]: I0320 15:39:18.650591    4730 scope.go:117] "RemoveContainer" containerID="de82f104aff72ff62f6c5387f4f4d337127c7abc347bf4bf1df18031dc2cf509"
Mar 20 15:39:18 crc kubenswrapper[4730]: I0320 15:39:18.650859    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:18 crc kubenswrapper[4730]: I0320 15:39:18.652211    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:18 crc kubenswrapper[4730]: I0320 15:39:18.652352    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:18 crc kubenswrapper[4730]: I0320 15:39:18.652448    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:18 crc kubenswrapper[4730]: I0320 15:39:18.653054    4730 scope.go:117] "RemoveContainer" containerID="23136ed264fc1e174d9978df43b71e3437a1217258a6e710b82d8c27d1478149"
Mar 20 15:39:18 crc kubenswrapper[4730]: E0320 15:39:18.653363    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 15:39:19 crc kubenswrapper[4730]: I0320 15:39:19.468422    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:19Z is after 2026-02-23T05:33:13Z
Mar 20 15:39:19 crc kubenswrapper[4730]: I0320 15:39:19.658174    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 20 15:39:19 crc kubenswrapper[4730]: W0320 15:39:19.667306    4730 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:19Z is after 2026-02-23T05:33:13Z
Mar 20 15:39:19 crc kubenswrapper[4730]: E0320 15:39:19.667384    4730 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:19Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 15:39:20 crc kubenswrapper[4730]: I0320 15:39:20.389851    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Mar 20 15:39:20 crc kubenswrapper[4730]: I0320 15:39:20.390173    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:20 crc kubenswrapper[4730]: I0320 15:39:20.392127    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:20 crc kubenswrapper[4730]: I0320 15:39:20.392179    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:20 crc kubenswrapper[4730]: I0320 15:39:20.392201    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:20 crc kubenswrapper[4730]: I0320 15:39:20.394426    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:39:20 crc kubenswrapper[4730]: I0320 15:39:20.394796    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:20 crc kubenswrapper[4730]: I0320 15:39:20.397274    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:20 crc kubenswrapper[4730]: I0320 15:39:20.397460    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:20 crc kubenswrapper[4730]: I0320 15:39:20.397600    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:20 crc kubenswrapper[4730]: I0320 15:39:20.398738    4730 scope.go:117] "RemoveContainer" containerID="23136ed264fc1e174d9978df43b71e3437a1217258a6e710b82d8c27d1478149"
Mar 20 15:39:20 crc kubenswrapper[4730]: E0320 15:39:20.399141    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 15:39:20 crc kubenswrapper[4730]: I0320 15:39:20.401961    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:39:20 crc kubenswrapper[4730]: I0320 15:39:20.414810    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Mar 20 15:39:20 crc kubenswrapper[4730]: I0320 15:39:20.471687    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:20Z is after 2026-02-23T05:33:13Z
Mar 20 15:39:20 crc kubenswrapper[4730]: I0320 15:39:20.666089    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:20 crc kubenswrapper[4730]: I0320 15:39:20.666101    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:20 crc kubenswrapper[4730]: I0320 15:39:20.667932    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:20 crc kubenswrapper[4730]: I0320 15:39:20.668045    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:20 crc kubenswrapper[4730]: I0320 15:39:20.668068    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:20 crc kubenswrapper[4730]: I0320 15:39:20.668725    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:20 crc kubenswrapper[4730]: I0320 15:39:20.668786    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:20 crc kubenswrapper[4730]: I0320 15:39:20.668803    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:20 crc kubenswrapper[4730]: I0320 15:39:20.669192    4730 scope.go:117] "RemoveContainer" containerID="23136ed264fc1e174d9978df43b71e3437a1217258a6e710b82d8c27d1478149"
Mar 20 15:39:20 crc kubenswrapper[4730]: E0320 15:39:20.669588    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 15:39:21 crc kubenswrapper[4730]: I0320 15:39:21.468278    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:21Z is after 2026-02-23T05:33:13Z
Mar 20 15:39:21 crc kubenswrapper[4730]: E0320 15:39:21.609222    4730 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 15:39:21 crc kubenswrapper[4730]: W0320 15:39:21.947945    4730 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:21Z is after 2026-02-23T05:33:13Z
Mar 20 15:39:21 crc kubenswrapper[4730]: E0320 15:39:21.948047    4730 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:21Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 15:39:22 crc kubenswrapper[4730]: I0320 15:39:22.469480    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:22Z is after 2026-02-23T05:33:13Z
Mar 20 15:39:22 crc kubenswrapper[4730]: I0320 15:39:22.865796    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:39:22 crc kubenswrapper[4730]: I0320 15:39:22.866179    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:22 crc kubenswrapper[4730]: I0320 15:39:22.867891    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:22 crc kubenswrapper[4730]: I0320 15:39:22.867927    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:22 crc kubenswrapper[4730]: I0320 15:39:22.867939    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:22 crc kubenswrapper[4730]: I0320 15:39:22.868712    4730 scope.go:117] "RemoveContainer" containerID="23136ed264fc1e174d9978df43b71e3437a1217258a6e710b82d8c27d1478149"
Mar 20 15:39:22 crc kubenswrapper[4730]: E0320 15:39:22.868951    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 15:39:23 crc kubenswrapper[4730]: I0320 15:39:23.411144    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:23 crc kubenswrapper[4730]: I0320 15:39:23.412460    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:23 crc kubenswrapper[4730]: I0320 15:39:23.412566    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:23 crc kubenswrapper[4730]: I0320 15:39:23.412590    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:23 crc kubenswrapper[4730]: I0320 15:39:23.412634    4730 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 15:39:23 crc kubenswrapper[4730]: E0320 15:39:23.415561    4730 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:23Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 20 15:39:23 crc kubenswrapper[4730]: E0320 15:39:23.417664    4730 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:23Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 20 15:39:23 crc kubenswrapper[4730]: I0320 15:39:23.468095    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:23Z is after 2026-02-23T05:33:13Z
Mar 20 15:39:23 crc kubenswrapper[4730]: I0320 15:39:23.967280    4730 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:39:23 crc kubenswrapper[4730]: I0320 15:39:23.967501    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:23 crc kubenswrapper[4730]: I0320 15:39:23.968934    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:23 crc kubenswrapper[4730]: I0320 15:39:23.968998    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:23 crc kubenswrapper[4730]: I0320 15:39:23.969014    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:23 crc kubenswrapper[4730]: I0320 15:39:23.969775    4730 scope.go:117] "RemoveContainer" containerID="23136ed264fc1e174d9978df43b71e3437a1217258a6e710b82d8c27d1478149"
Mar 20 15:39:23 crc kubenswrapper[4730]: E0320 15:39:23.970010    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 15:39:24 crc kubenswrapper[4730]: I0320 15:39:24.468193    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:24Z is after 2026-02-23T05:33:13Z
Mar 20 15:39:25 crc kubenswrapper[4730]: I0320 15:39:25.386359    4730 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 15:39:25 crc kubenswrapper[4730]: E0320 15:39:25.389333    4730 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:25Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 15:39:25 crc kubenswrapper[4730]: I0320 15:39:25.471152    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:25Z is after 2026-02-23T05:33:13Z
Mar 20 15:39:26 crc kubenswrapper[4730]: W0320 15:39:26.217493    4730 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:26Z is after 2026-02-23T05:33:13Z
Mar 20 15:39:26 crc kubenswrapper[4730]: E0320 15:39:26.217592    4730 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:26Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 15:39:26 crc kubenswrapper[4730]: W0320 15:39:26.264420    4730 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:26Z is after 2026-02-23T05:33:13Z
Mar 20 15:39:26 crc kubenswrapper[4730]: E0320 15:39:26.264511    4730 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:26Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 15:39:26 crc kubenswrapper[4730]: I0320 15:39:26.468195    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:26Z is after 2026-02-23T05:33:13Z
Mar 20 15:39:27 crc kubenswrapper[4730]: E0320 15:39:27.017791    4730 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:27Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e96d438fabf17  default    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:01.463744279 +0000 UTC m=+0.677115668,LastTimestamp:2026-03-20 15:39:01.463744279 +0000 UTC m=+0.677115668,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:27 crc kubenswrapper[4730]: I0320 15:39:27.425353    4730 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 15:39:27 crc kubenswrapper[4730]: I0320 15:39:27.425544    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 15:39:27 crc kubenswrapper[4730]: I0320 15:39:27.425656    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 15:39:27 crc kubenswrapper[4730]: I0320 15:39:27.425903    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:27 crc kubenswrapper[4730]: I0320 15:39:27.427938    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:27 crc kubenswrapper[4730]: I0320 15:39:27.428012    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:27 crc kubenswrapper[4730]: I0320 15:39:27.428031    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:27 crc kubenswrapper[4730]: I0320 15:39:27.432695    4730 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"8b34522460ebd4556ce4291e5c5132788387cf45b0be3b9535af9262948b71ac"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 20 15:39:27 crc kubenswrapper[4730]: I0320 15:39:27.433773    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://8b34522460ebd4556ce4291e5c5132788387cf45b0be3b9535af9262948b71ac" gracePeriod=30
Mar 20 15:39:27 crc kubenswrapper[4730]: I0320 15:39:27.467944    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:27Z is after 2026-02-23T05:33:13Z
Mar 20 15:39:27 crc kubenswrapper[4730]: W0320 15:39:27.622772    4730 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:27Z is after 2026-02-23T05:33:13Z
Mar 20 15:39:27 crc kubenswrapper[4730]: E0320 15:39:27.622913    4730 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:27Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 15:39:27 crc kubenswrapper[4730]: I0320 15:39:27.687810    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 20 15:39:27 crc kubenswrapper[4730]: I0320 15:39:27.688504    4730 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="8b34522460ebd4556ce4291e5c5132788387cf45b0be3b9535af9262948b71ac" exitCode=255
Mar 20 15:39:27 crc kubenswrapper[4730]: I0320 15:39:27.688595    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"8b34522460ebd4556ce4291e5c5132788387cf45b0be3b9535af9262948b71ac"}
Mar 20 15:39:28 crc kubenswrapper[4730]: I0320 15:39:28.470860    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:28Z is after 2026-02-23T05:33:13Z
Mar 20 15:39:28 crc kubenswrapper[4730]: I0320 15:39:28.694674    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 20 15:39:28 crc kubenswrapper[4730]: I0320 15:39:28.695305    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"620070760ce503ee2102ce0880913637feb032124892ce1a1e2060939f38e050"}
Mar 20 15:39:28 crc kubenswrapper[4730]: I0320 15:39:28.695523    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:28 crc kubenswrapper[4730]: I0320 15:39:28.696679    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:28 crc kubenswrapper[4730]: I0320 15:39:28.696714    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:28 crc kubenswrapper[4730]: I0320 15:39:28.696727    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:29 crc kubenswrapper[4730]: I0320 15:39:29.471499    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:29Z is after 2026-02-23T05:33:13Z
Mar 20 15:39:29 crc kubenswrapper[4730]: I0320 15:39:29.697819    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:29 crc kubenswrapper[4730]: I0320 15:39:29.699036    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:29 crc kubenswrapper[4730]: I0320 15:39:29.699104    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:29 crc kubenswrapper[4730]: I0320 15:39:29.699124    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:30 crc kubenswrapper[4730]: I0320 15:39:30.416372    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:30 crc kubenswrapper[4730]: I0320 15:39:30.418234    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:30 crc kubenswrapper[4730]: I0320 15:39:30.418332    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:30 crc kubenswrapper[4730]: I0320 15:39:30.418348    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:30 crc kubenswrapper[4730]: I0320 15:39:30.418393    4730 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 15:39:30 crc kubenswrapper[4730]: E0320 15:39:30.421222    4730 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:30Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 20 15:39:30 crc kubenswrapper[4730]: E0320 15:39:30.421937    4730 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:30Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 20 15:39:30 crc kubenswrapper[4730]: I0320 15:39:30.468544    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:30Z is after 2026-02-23T05:33:13Z
Mar 20 15:39:30 crc kubenswrapper[4730]: I0320 15:39:30.652902    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 15:39:30 crc kubenswrapper[4730]: I0320 15:39:30.700597    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:30 crc kubenswrapper[4730]: I0320 15:39:30.701589    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:30 crc kubenswrapper[4730]: I0320 15:39:30.701797    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:30 crc kubenswrapper[4730]: I0320 15:39:30.701900    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:31 crc kubenswrapper[4730]: I0320 15:39:31.468687    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:31Z is after 2026-02-23T05:33:13Z
Mar 20 15:39:31 crc kubenswrapper[4730]: E0320 15:39:31.609404    4730 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 15:39:32 crc kubenswrapper[4730]: W0320 15:39:32.187820    4730 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:32Z is after 2026-02-23T05:33:13Z
Mar 20 15:39:32 crc kubenswrapper[4730]: E0320 15:39:32.187970    4730 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:32Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 15:39:32 crc kubenswrapper[4730]: I0320 15:39:32.467920    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:32Z is after 2026-02-23T05:33:13Z
Mar 20 15:39:33 crc kubenswrapper[4730]: I0320 15:39:33.469432    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:33Z is after 2026-02-23T05:33:13Z
Mar 20 15:39:34 crc kubenswrapper[4730]: I0320 15:39:34.425526    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 15:39:34 crc kubenswrapper[4730]: I0320 15:39:34.425679    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:34 crc kubenswrapper[4730]: I0320 15:39:34.426583    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:34 crc kubenswrapper[4730]: I0320 15:39:34.426614    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:34 crc kubenswrapper[4730]: I0320 15:39:34.426624    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:34 crc kubenswrapper[4730]: I0320 15:39:34.468610    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:34Z is after 2026-02-23T05:33:13Z
Mar 20 15:39:35 crc kubenswrapper[4730]: I0320 15:39:35.468224    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:35Z is after 2026-02-23T05:33:13Z
Mar 20 15:39:35 crc kubenswrapper[4730]: I0320 15:39:35.533339    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:35 crc kubenswrapper[4730]: I0320 15:39:35.534669    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:35 crc kubenswrapper[4730]: I0320 15:39:35.534752    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:35 crc kubenswrapper[4730]: I0320 15:39:35.534765    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:35 crc kubenswrapper[4730]: I0320 15:39:35.535289    4730 scope.go:117] "RemoveContainer" containerID="23136ed264fc1e174d9978df43b71e3437a1217258a6e710b82d8c27d1478149"
Mar 20 15:39:36 crc kubenswrapper[4730]: I0320 15:39:36.468713    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:36Z is after 2026-02-23T05:33:13Z
Mar 20 15:39:36 crc kubenswrapper[4730]: I0320 15:39:36.714963    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 20 15:39:36 crc kubenswrapper[4730]: I0320 15:39:36.715499    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 20 15:39:36 crc kubenswrapper[4730]: I0320 15:39:36.717550    4730 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e2a38d11938583eb373fcd731b30daf31d61c71c2ace80683efb60769ab0b694" exitCode=255
Mar 20 15:39:36 crc kubenswrapper[4730]: I0320 15:39:36.717599    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e2a38d11938583eb373fcd731b30daf31d61c71c2ace80683efb60769ab0b694"}
Mar 20 15:39:36 crc kubenswrapper[4730]: I0320 15:39:36.717639    4730 scope.go:117] "RemoveContainer" containerID="23136ed264fc1e174d9978df43b71e3437a1217258a6e710b82d8c27d1478149"
Mar 20 15:39:36 crc kubenswrapper[4730]: I0320 15:39:36.717776    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:36 crc kubenswrapper[4730]: I0320 15:39:36.718620    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:36 crc kubenswrapper[4730]: I0320 15:39:36.718654    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:36 crc kubenswrapper[4730]: I0320 15:39:36.718668    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:36 crc kubenswrapper[4730]: I0320 15:39:36.719063    4730 scope.go:117] "RemoveContainer" containerID="e2a38d11938583eb373fcd731b30daf31d61c71c2ace80683efb60769ab0b694"
Mar 20 15:39:36 crc kubenswrapper[4730]: E0320 15:39:36.719199    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 15:39:37 crc kubenswrapper[4730]: E0320 15:39:37.022868    4730 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:37Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e96d438fabf17  default    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:01.463744279 +0000 UTC m=+0.677115668,LastTimestamp:2026-03-20 15:39:01.463744279 +0000 UTC m=+0.677115668,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:37 crc kubenswrapper[4730]: I0320 15:39:37.422043    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:37 crc kubenswrapper[4730]: I0320 15:39:37.423241    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:37 crc kubenswrapper[4730]: I0320 15:39:37.423304    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:37 crc kubenswrapper[4730]: I0320 15:39:37.423352    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:37 crc kubenswrapper[4730]: I0320 15:39:37.423379    4730 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 15:39:37 crc kubenswrapper[4730]: E0320 15:39:37.424530    4730 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:37Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 20 15:39:37 crc kubenswrapper[4730]: I0320 15:39:37.425594    4730 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 15:39:37 crc kubenswrapper[4730]: I0320 15:39:37.425664    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 15:39:37 crc kubenswrapper[4730]: E0320 15:39:37.426391    4730 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:37Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 20 15:39:37 crc kubenswrapper[4730]: I0320 15:39:37.468177    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:37Z is after 2026-02-23T05:33:13Z
Mar 20 15:39:37 crc kubenswrapper[4730]: I0320 15:39:37.723692    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 20 15:39:38 crc kubenswrapper[4730]: I0320 15:39:38.479889    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 15:39:39 crc kubenswrapper[4730]: I0320 15:39:39.473438    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 15:39:40 crc kubenswrapper[4730]: I0320 15:39:40.473407    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 15:39:41 crc kubenswrapper[4730]: I0320 15:39:41.471794    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 15:39:41 crc kubenswrapper[4730]: E0320 15:39:41.609938    4730 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 15:39:42 crc kubenswrapper[4730]: I0320 15:39:42.471668    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 15:39:42 crc kubenswrapper[4730]: I0320 15:39:42.801370    4730 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 15:39:42 crc kubenswrapper[4730]: I0320 15:39:42.819667    4730 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 20 15:39:42 crc kubenswrapper[4730]: I0320 15:39:42.865149    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:39:42 crc kubenswrapper[4730]: I0320 15:39:42.865380    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:42 crc kubenswrapper[4730]: I0320 15:39:42.866457    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:42 crc kubenswrapper[4730]: I0320 15:39:42.866633    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:42 crc kubenswrapper[4730]: I0320 15:39:42.866713    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:42 crc kubenswrapper[4730]: I0320 15:39:42.867341    4730 scope.go:117] "RemoveContainer" containerID="e2a38d11938583eb373fcd731b30daf31d61c71c2ace80683efb60769ab0b694"
Mar 20 15:39:42 crc kubenswrapper[4730]: E0320 15:39:42.867572    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 15:39:43 crc kubenswrapper[4730]: I0320 15:39:43.469500    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 15:39:43 crc kubenswrapper[4730]: I0320 15:39:43.966706    4730 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:39:43 crc kubenswrapper[4730]: I0320 15:39:43.966969    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:43 crc kubenswrapper[4730]: I0320 15:39:43.968535    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:43 crc kubenswrapper[4730]: I0320 15:39:43.968581    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:43 crc kubenswrapper[4730]: I0320 15:39:43.968597    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:43 crc kubenswrapper[4730]: I0320 15:39:43.969272    4730 scope.go:117] "RemoveContainer" containerID="e2a38d11938583eb373fcd731b30daf31d61c71c2ace80683efb60769ab0b694"
Mar 20 15:39:43 crc kubenswrapper[4730]: E0320 15:39:43.969500    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 15:39:44 crc kubenswrapper[4730]: I0320 15:39:44.426685    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:44 crc kubenswrapper[4730]: I0320 15:39:44.428417    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:44 crc kubenswrapper[4730]: I0320 15:39:44.428496    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:44 crc kubenswrapper[4730]: I0320 15:39:44.428521    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:44 crc kubenswrapper[4730]: I0320 15:39:44.428575    4730 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 15:39:44 crc kubenswrapper[4730]: E0320 15:39:44.432690    4730 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 20 15:39:44 crc kubenswrapper[4730]: E0320 15:39:44.433132    4730 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 20 15:39:44 crc kubenswrapper[4730]: I0320 15:39:44.466460    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 15:39:45 crc kubenswrapper[4730]: W0320 15:39:45.282879    4730 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Mar 20 15:39:45 crc kubenswrapper[4730]: E0320 15:39:45.283735    4730 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 20 15:39:45 crc kubenswrapper[4730]: I0320 15:39:45.466402    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 15:39:46 crc kubenswrapper[4730]: I0320 15:39:46.469323    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 15:39:47 crc kubenswrapper[4730]: W0320 15:39:47.014691    4730 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.014737    4730 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.028550    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e96d438fabf17  default    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:01.463744279 +0000 UTC m=+0.677115668,LastTimestamp:2026-03-20 15:39:01.463744279 +0000 UTC m=+0.677115668,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.032378    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e96d43c937c06  default    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:01.524085766 +0000 UTC m=+0.737457135,LastTimestamp:2026-03-20 15:39:01.524085766 +0000 UTC m=+0.737457135,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.036689    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e96d43c93c92e  default    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:01.524105518 +0000 UTC m=+0.737476887,LastTimestamp:2026-03-20 15:39:01.524105518 +0000 UTC m=+0.737476887,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.041063    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e96d43c93f625  default    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:01.524117029 +0000 UTC m=+0.737488398,LastTimestamp:2026-03-20 15:39:01.524117029 +0000 UTC m=+0.737488398,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.044728    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e96d4413de3b1  default    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:01.602362289 +0000 UTC m=+0.815733668,LastTimestamp:2026-03-20 15:39:01.602362289 +0000 UTC m=+0.815733668,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.049429    4730 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e96d43c937c06\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e96d43c937c06  default    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:01.524085766 +0000 UTC m=+0.737457135,LastTimestamp:2026-03-20 15:39:01.635147837 +0000 UTC m=+0.848519206,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.055218    4730 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e96d43c93c92e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e96d43c93c92e  default    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:01.524105518 +0000 UTC m=+0.737476887,LastTimestamp:2026-03-20 15:39:01.635169658 +0000 UTC m=+0.848541027,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.058070    4730 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e96d43c93f625\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e96d43c93f625  default    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:01.524117029 +0000 UTC m=+0.737488398,LastTimestamp:2026-03-20 15:39:01.635178349 +0000 UTC m=+0.848549718,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.059391    4730 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e96d43c937c06\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e96d43c937c06  default    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:01.524085766 +0000 UTC m=+0.737457135,LastTimestamp:2026-03-20 15:39:01.636910564 +0000 UTC m=+0.850281933,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.064091    4730 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e96d43c93c92e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e96d43c93c92e  default    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:01.524105518 +0000 UTC m=+0.737476887,LastTimestamp:2026-03-20 15:39:01.636933695 +0000 UTC m=+0.850305064,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.068591    4730 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e96d43c93f625\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e96d43c93f625  default    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:01.524117029 +0000 UTC m=+0.737488398,LastTimestamp:2026-03-20 15:39:01.636947536 +0000 UTC m=+0.850318905,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.072180    4730 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e96d43c937c06\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e96d43c937c06  default    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:01.524085766 +0000 UTC m=+0.737457135,LastTimestamp:2026-03-20 15:39:01.637163852 +0000 UTC m=+0.850535251,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.075666    4730 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e96d43c93c92e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e96d43c93c92e  default    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:01.524105518 +0000 UTC m=+0.737476887,LastTimestamp:2026-03-20 15:39:01.637192864 +0000 UTC m=+0.850564253,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.078955    4730 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e96d43c93f625\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e96d43c93f625  default    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:01.524117029 +0000 UTC m=+0.737488398,LastTimestamp:2026-03-20 15:39:01.637208855 +0000 UTC m=+0.850580244,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.081947    4730 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e96d43c937c06\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e96d43c937c06  default    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:01.524085766 +0000 UTC m=+0.737457135,LastTimestamp:2026-03-20 15:39:01.640724649 +0000 UTC m=+0.854096018,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.085762    4730 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e96d43c93c92e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e96d43c93c92e  default    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:01.524105518 +0000 UTC m=+0.737476887,LastTimestamp:2026-03-20 15:39:01.64074046 +0000 UTC m=+0.854111829,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.088797    4730 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e96d43c93f625\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e96d43c93f625  default    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:01.524117029 +0000 UTC m=+0.737488398,LastTimestamp:2026-03-20 15:39:01.640752911 +0000 UTC m=+0.854124270,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.092099    4730 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e96d43c937c06\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e96d43c937c06  default    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:01.524085766 +0000 UTC m=+0.737457135,LastTimestamp:2026-03-20 15:39:01.641634175 +0000 UTC m=+0.855005544,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.095440    4730 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e96d43c93c92e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e96d43c93c92e  default    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:01.524105518 +0000 UTC m=+0.737476887,LastTimestamp:2026-03-20 15:39:01.641737922 +0000 UTC m=+0.855109291,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.098534    4730 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e96d43c93f625\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e96d43c93f625  default    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:01.524117029 +0000 UTC m=+0.737488398,LastTimestamp:2026-03-20 15:39:01.642359627 +0000 UTC m=+0.855730996,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.102644    4730 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e96d43c937c06\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e96d43c937c06  default    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:01.524085766 +0000 UTC m=+0.737457135,LastTimestamp:2026-03-20 15:39:01.643592886 +0000 UTC m=+0.856964255,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.106957    4730 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e96d43c93c92e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e96d43c93c92e  default    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:01.524105518 +0000 UTC m=+0.737476887,LastTimestamp:2026-03-20 15:39:01.643606567 +0000 UTC m=+0.856977926,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.111804    4730 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e96d43c93f625\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e96d43c93f625  default    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:01.524117029 +0000 UTC m=+0.737488398,LastTimestamp:2026-03-20 15:39:01.643615728 +0000 UTC m=+0.856987097,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.115794    4730 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e96d43c937c06\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e96d43c937c06  default    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:01.524085766 +0000 UTC m=+0.737457135,LastTimestamp:2026-03-20 15:39:01.64392597 +0000 UTC m=+0.857297339,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.119756    4730 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e96d43c93c92e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e96d43c93c92e  default    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:01.524105518 +0000 UTC m=+0.737476887,LastTimestamp:2026-03-20 15:39:01.644022117 +0000 UTC m=+0.857393486,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.125459    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e96d45af0db80  openshift-kube-scheduler    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:02.033521536 +0000 UTC m=+1.246892905,LastTimestamp:2026-03-20 15:39:02.033521536 +0000 UTC m=+1.246892905,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.136958    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e96d45b419b97  openshift-etcd    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:02.038813591 +0000 UTC m=+1.252184950,LastTimestamp:2026-03-20 15:39:02.038813591 +0000 UTC m=+1.252184950,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.141723    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e96d45b747955  openshift-machine-config-operator    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:02.042147157 +0000 UTC m=+1.255518526,LastTimestamp:2026-03-20 15:39:02.042147157 +0000 UTC m=+1.255518526,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.145116    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e96d45b9b224f  openshift-kube-apiserver    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:02.044680783 +0000 UTC m=+1.258052152,LastTimestamp:2026-03-20 15:39:02.044680783 +0000 UTC m=+1.258052152,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.149784    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e96d45bf4c1f5  openshift-kube-controller-manager    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:02.050554357 +0000 UTC m=+1.263925726,LastTimestamp:2026-03-20 15:39:02.050554357 +0000 UTC m=+1.263925726,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.153942    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e96d4806b763b  openshift-kube-controller-manager    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:02.662313531 +0000 UTC m=+1.875684900,LastTimestamp:2026-03-20 15:39:02.662313531 +0000 UTC m=+1.875684900,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.157667    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e96d4806b76bd  openshift-kube-scheduler    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:02.662313661 +0000 UTC m=+1.875685050,LastTimestamp:2026-03-20 15:39:02.662313661 +0000 UTC m=+1.875685050,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.159994    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e96d4815c756b  openshift-etcd    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:02.678107499 +0000 UTC m=+1.891478868,LastTimestamp:2026-03-20 15:39:02.678107499 +0000 UTC m=+1.891478868,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.161636    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e96d48161ba15  openshift-kube-controller-manager    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:02.678452757 +0000 UTC m=+1.891824136,LastTimestamp:2026-03-20 15:39:02.678452757 +0000 UTC m=+1.891824136,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.164872    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e96d48165b406  openshift-kube-scheduler    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:02.67871335 +0000 UTC m=+1.892084719,LastTimestamp:2026-03-20 15:39:02.67871335 +0000 UTC m=+1.892084719,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.168524    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e96d48165d116  openshift-machine-config-operator    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:02.67872079 +0000 UTC m=+1.892092159,LastTimestamp:2026-03-20 15:39:02.67872079 +0000 UTC m=+1.892092159,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.171761    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e96d481664cb0  openshift-kube-apiserver    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:02.678752432 +0000 UTC m=+1.892123801,LastTimestamp:2026-03-20 15:39:02.678752432 +0000 UTC m=+1.892123801,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.175821    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e96d48179d95a  openshift-kube-controller-manager    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:02.680033626 +0000 UTC m=+1.893404995,LastTimestamp:2026-03-20 15:39:02.680033626 +0000 UTC m=+1.893404995,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.180224    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e96d4824a35c9  openshift-machine-config-operator    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:02.693688777 +0000 UTC m=+1.907060156,LastTimestamp:2026-03-20 15:39:02.693688777 +0000 UTC m=+1.907060156,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.183674    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e96d4830348df  openshift-kube-apiserver    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:02.705817823 +0000 UTC m=+1.919189192,LastTimestamp:2026-03-20 15:39:02.705817823 +0000 UTC m=+1.919189192,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.187178    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e96d48309835d  openshift-etcd    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:02.706226013 +0000 UTC m=+1.919597382,LastTimestamp:2026-03-20 15:39:02.706226013 +0000 UTC m=+1.919597382,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.190703    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e96d492a67a90  openshift-kube-controller-manager    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:02.968171152 +0000 UTC m=+2.181542521,LastTimestamp:2026-03-20 15:39:02.968171152 +0000 UTC m=+2.181542521,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.193871    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e96d49342e742  openshift-kube-controller-manager    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:02.978422594 +0000 UTC m=+2.191793963,LastTimestamp:2026-03-20 15:39:02.978422594 +0000 UTC m=+2.191793963,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.197269    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e96d49354c19a  openshift-kube-controller-manager    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:02.979592602 +0000 UTC m=+2.192963971,LastTimestamp:2026-03-20 15:39:02.979592602 +0000 UTC m=+2.192963971,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.200630    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e96d49cae779d  openshift-kube-controller-manager    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:03.136466845 +0000 UTC m=+2.349838224,LastTimestamp:2026-03-20 15:39:03.136466845 +0000 UTC m=+2.349838224,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.203735    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e96d49d6b57a3  openshift-kube-controller-manager    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:03.148844963 +0000 UTC m=+2.362216342,LastTimestamp:2026-03-20 15:39:03.148844963 +0000 UTC m=+2.362216342,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.207317    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e96d49d7a40db  openshift-kube-controller-manager    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:03.149822171 +0000 UTC m=+2.363193540,LastTimestamp:2026-03-20 15:39:03.149822171 +0000 UTC m=+2.363193540,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.210753    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e96d4a6fa0dbe  openshift-kube-controller-manager    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:03.309192638 +0000 UTC m=+2.522564007,LastTimestamp:2026-03-20 15:39:03.309192638 +0000 UTC m=+2.522564007,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.214063    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e96d4a7dbd60d  openshift-kube-controller-manager    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:03.323989517 +0000 UTC m=+2.537360886,LastTimestamp:2026-03-20 15:39:03.323989517 +0000 UTC m=+2.537360886,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.217816    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e96d4b58f2306  openshift-kube-apiserver    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:03.553843974 +0000 UTC m=+2.767215343,LastTimestamp:2026-03-20 15:39:03.553843974 +0000 UTC m=+2.767215343,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.221231    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e96d4b5b248db  openshift-machine-config-operator    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:03.556147419 +0000 UTC m=+2.769518798,LastTimestamp:2026-03-20 15:39:03.556147419 +0000 UTC m=+2.769518798,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.224978    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e96d4b5eb0630  openshift-etcd    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:03.559865904 +0000 UTC m=+2.773237283,LastTimestamp:2026-03-20 15:39:03.559865904 +0000 UTC m=+2.773237283,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.229648    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e96d4b6710801  openshift-kube-scheduler    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:03.568648193 +0000 UTC m=+2.782019562,LastTimestamp:2026-03-20 15:39:03.568648193 +0000 UTC m=+2.782019562,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.232901    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e96d4c2b3986f  openshift-machine-config-operator    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:03.774337135 +0000 UTC m=+2.987708504,LastTimestamp:2026-03-20 15:39:03.774337135 +0000 UTC m=+2.987708504,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.236293    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e96d4c35b9035  openshift-machine-config-operator    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:03.785345077 +0000 UTC m=+2.998716446,LastTimestamp:2026-03-20 15:39:03.785345077 +0000 UTC m=+2.998716446,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.240016    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e96d4c39c53f5  openshift-etcd    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:03.789589493 +0000 UTC m=+3.002960862,LastTimestamp:2026-03-20 15:39:03.789589493 +0000 UTC m=+3.002960862,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.243366    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e96d4c400d504  openshift-kube-apiserver    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:03.796176132 +0000 UTC m=+3.009547501,LastTimestamp:2026-03-20 15:39:03.796176132 +0000 UTC m=+3.009547501,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.249595    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e96d4c400d518  openshift-kube-scheduler    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:03.796176152 +0000 UTC m=+3.009547521,LastTimestamp:2026-03-20 15:39:03.796176152 +0000 UTC m=+3.009547521,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.258360    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e96d4c4398633  openshift-etcd    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:03.799891507 +0000 UTC m=+3.013262876,LastTimestamp:2026-03-20 15:39:03.799891507 +0000 UTC m=+3.013262876,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.263868    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e96d4c4d99db9  openshift-kube-scheduler    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:03.810383289 +0000 UTC m=+3.023754658,LastTimestamp:2026-03-20 15:39:03.810383289 +0000 UTC m=+3.023754658,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.267271    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e96d4c4eda1de  openshift-kube-scheduler    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:03.81169507 +0000 UTC m=+3.025066449,LastTimestamp:2026-03-20 15:39:03.81169507 +0000 UTC m=+3.025066449,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.271528    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e96d4c56429c1  openshift-kube-apiserver    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:03.819463105 +0000 UTC m=+3.032834474,LastTimestamp:2026-03-20 15:39:03.819463105 +0000 UTC m=+3.032834474,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.275470    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e96d4c5851e4c  openshift-kube-apiserver    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:03.82162286 +0000 UTC m=+3.034994229,LastTimestamp:2026-03-20 15:39:03.82162286 +0000 UTC m=+3.034994229,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.278834    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e96d4d2549def  openshift-kube-apiserver    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:04.036548079 +0000 UTC m=+3.249919448,LastTimestamp:2026-03-20 15:39:04.036548079 +0000 UTC m=+3.249919448,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.282082    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e96d4d2559e95  openshift-kube-scheduler    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:04.036613781 +0000 UTC m=+3.249985140,LastTimestamp:2026-03-20 15:39:04.036613781 +0000 UTC m=+3.249985140,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.285536    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e96d4d3415863  openshift-kube-scheduler    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:04.052062307 +0000 UTC m=+3.265433676,LastTimestamp:2026-03-20 15:39:04.052062307 +0000 UTC m=+3.265433676,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.289089    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e96d4d3573c3d  openshift-kube-scheduler    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:04.053496893 +0000 UTC m=+3.266868252,LastTimestamp:2026-03-20 15:39:04.053496893 +0000 UTC m=+3.266868252,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.292652    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e96d4d36b99ce  openshift-kube-apiserver    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:04.054831566 +0000 UTC m=+3.268202935,LastTimestamp:2026-03-20 15:39:04.054831566 +0000 UTC m=+3.268202935,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.296030    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e96d4d37a9e7c  openshift-kube-apiserver    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:04.055815804 +0000 UTC m=+3.269187193,LastTimestamp:2026-03-20 15:39:04.055815804 +0000 UTC m=+3.269187193,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.299361    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e96d4deacfffd  openshift-kube-apiserver    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:04.243666941 +0000 UTC m=+3.457038330,LastTimestamp:2026-03-20 15:39:04.243666941 +0000 UTC m=+3.457038330,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.302358    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e96d4debd2ac8  openshift-kube-scheduler    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:04.244726472 +0000 UTC m=+3.458097841,LastTimestamp:2026-03-20 15:39:04.244726472 +0000 UTC m=+3.458097841,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.305793    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e96d4e00207a5  openshift-kube-scheduler    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:04.266016677 +0000 UTC m=+3.479388046,LastTimestamp:2026-03-20 15:39:04.266016677 +0000 UTC m=+3.479388046,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.309407    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e96d4e0079a39  openshift-kube-apiserver    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:04.266381881 +0000 UTC m=+3.479753250,LastTimestamp:2026-03-20 15:39:04.266381881 +0000 UTC m=+3.479753250,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.313668    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e96d4e01ddd51  openshift-kube-apiserver    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:04.267840849 +0000 UTC m=+3.481212228,LastTimestamp:2026-03-20 15:39:04.267840849 +0000 UTC m=+3.481212228,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.317197    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e96d4ec911e79  openshift-kube-apiserver    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:04.476720761 +0000 UTC m=+3.690092130,LastTimestamp:2026-03-20 15:39:04.476720761 +0000 UTC m=+3.690092130,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.320335    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e96d4ed859673  openshift-kube-apiserver    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:04.492742259 +0000 UTC m=+3.706113628,LastTimestamp:2026-03-20 15:39:04.492742259 +0000 UTC m=+3.706113628,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.323312    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e96d4ed978b3b  openshift-kube-apiserver    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:04.493919035 +0000 UTC m=+3.707290414,LastTimestamp:2026-03-20 15:39:04.493919035 +0000 UTC m=+3.707290414,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.326613    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e96d4f249f6b0  openshift-etcd    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:04.572720816 +0000 UTC m=+3.786092185,LastTimestamp:2026-03-20 15:39:04.572720816 +0000 UTC m=+3.786092185,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.330226    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e96d4fa00d046  openshift-kube-apiserver    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:04.702144582 +0000 UTC m=+3.915515951,LastTimestamp:2026-03-20 15:39:04.702144582 +0000 UTC m=+3.915515951,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.334156    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e96d4fb010252  openshift-kube-apiserver    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:04.71893461 +0000 UTC m=+3.932305979,LastTimestamp:2026-03-20 15:39:04.71893461 +0000 UTC m=+3.932305979,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.338107    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e96d502251847  openshift-etcd    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:04.838740039 +0000 UTC m=+4.052111408,LastTimestamp:2026-03-20 15:39:04.838740039 +0000 UTC m=+4.052111408,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.342172    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e96d505667dbd  openshift-etcd    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:04.893357501 +0000 UTC m=+4.106728870,LastTimestamp:2026-03-20 15:39:04.893357501 +0000 UTC m=+4.106728870,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.345562    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e96d52f3a4fee  openshift-etcd    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:05.595105262 +0000 UTC m=+4.808476631,LastTimestamp:2026-03-20 15:39:05.595105262 +0000 UTC m=+4.808476631,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.350069    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e96d53d930a68  openshift-etcd    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:05.835801192 +0000 UTC m=+5.049172601,LastTimestamp:2026-03-20 15:39:05.835801192 +0000 UTC m=+5.049172601,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.354336    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e96d53ee5b461  openshift-etcd    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:05.857995873 +0000 UTC m=+5.071367242,LastTimestamp:2026-03-20 15:39:05.857995873 +0000 UTC m=+5.071367242,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.357610    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e96d53f00aebe  openshift-etcd    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:05.859763902 +0000 UTC m=+5.073135311,LastTimestamp:2026-03-20 15:39:05.859763902 +0000 UTC m=+5.073135311,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.362197    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e96d54b95aed5  openshift-etcd    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:06.070855381 +0000 UTC m=+5.284226750,LastTimestamp:2026-03-20 15:39:06.070855381 +0000 UTC m=+5.284226750,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.366542    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e96d54c238795  openshift-etcd    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:06.080151445 +0000 UTC m=+5.293522814,LastTimestamp:2026-03-20 15:39:06.080151445 +0000 UTC m=+5.293522814,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.370682    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e96d54c386411  openshift-etcd    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:06.081518609 +0000 UTC m=+5.294889998,LastTimestamp:2026-03-20 15:39:06.081518609 +0000 UTC m=+5.294889998,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.374330    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e96d55bdedbff  openshift-etcd    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:06.344086527 +0000 UTC m=+5.557457896,LastTimestamp:2026-03-20 15:39:06.344086527 +0000 UTC m=+5.557457896,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.378794    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e96d55d9e9c05  openshift-etcd    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:06.373430277 +0000 UTC m=+5.586801646,LastTimestamp:2026-03-20 15:39:06.373430277 +0000 UTC m=+5.586801646,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.382234    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e96d55db41b71  openshift-etcd    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:06.374839153 +0000 UTC m=+5.588210522,LastTimestamp:2026-03-20 15:39:06.374839153 +0000 UTC m=+5.588210522,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.386591    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e96d56aa0085b  openshift-etcd    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:06.591627355 +0000 UTC m=+5.804998754,LastTimestamp:2026-03-20 15:39:06.591627355 +0000 UTC m=+5.804998754,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.390373    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e96d56ba550b8  openshift-etcd    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:06.608750776 +0000 UTC m=+5.822122145,LastTimestamp:2026-03-20 15:39:06.608750776 +0000 UTC m=+5.822122145,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.394989    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e96d56bb6c7b7  openshift-etcd    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:06.609895351 +0000 UTC m=+5.823266710,LastTimestamp:2026-03-20 15:39:06.609895351 +0000 UTC m=+5.823266710,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.398013    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e96d57624d8f4  openshift-etcd    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:06.784880884 +0000 UTC m=+5.998252263,LastTimestamp:2026-03-20 15:39:06.784880884 +0000 UTC m=+5.998252263,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.402095    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e96d576d5ea95  openshift-etcd    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:06.796485269 +0000 UTC m=+6.009856638,LastTimestamp:2026-03-20 15:39:06.796485269 +0000 UTC m=+6.009856638,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.405976    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 20 15:39:47 crc kubenswrapper[4730]:         &Event{ObjectMeta:{kube-controller-manager-crc.189e96d59c52d686  openshift-kube-controller-manager    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
Mar 20 15:39:47 crc kubenswrapper[4730]:         body:
Mar 20 15:39:47 crc kubenswrapper[4730]:         ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:07.425429126 +0000 UTC m=+6.638800495,LastTimestamp:2026-03-20 15:39:07.425429126 +0000 UTC m=+6.638800495,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 20 15:39:47 crc kubenswrapper[4730]:  >
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.409918    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e96d59c53d966  openshift-kube-controller-manager    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:07.425495398 +0000 UTC m=+6.638866767,LastTimestamp:2026-03-20 15:39:07.425495398 +0000 UTC m=+6.638866767,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.415281    4730 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e96d4ed978b3b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e96d4ed978b3b  openshift-kube-apiserver    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:04.493919035 +0000 UTC m=+3.707290414,LastTimestamp:2026-03-20 15:39:16.642042504 +0000 UTC m=+15.855413883,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.420149    4730 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e96d4fa00d046\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e96d4fa00d046  openshift-kube-apiserver    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:04.702144582 +0000 UTC m=+3.915515951,LastTimestamp:2026-03-20 15:39:16.941657268 +0000 UTC m=+16.155028637,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.423402    4730 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e96d4fb010252\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e96d4fb010252  openshift-kube-apiserver    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:04.71893461 +0000 UTC m=+3.932305979,LastTimestamp:2026-03-20 15:39:16.971223083 +0000 UTC m=+16.184594452,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: I0320 15:39:47.425091    4730 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 15:39:47 crc kubenswrapper[4730]: I0320 15:39:47.425131    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.426989    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=<
Mar 20 15:39:47 crc kubenswrapper[4730]:         &Event{ObjectMeta:{kube-apiserver-crc.189e96d7d7d86dbe  openshift-kube-apiserver    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403
Mar 20 15:39:47 crc kubenswrapper[4730]:         body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 20 15:39:47 crc kubenswrapper[4730]:         
Mar 20 15:39:47 crc kubenswrapper[4730]:         ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:17.013974462 +0000 UTC m=+16.227345821,LastTimestamp:2026-03-20 15:39:17.013974462 +0000 UTC m=+16.227345821,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 20 15:39:47 crc kubenswrapper[4730]:  >
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.430780    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e96d7d7d8f5a5  openshift-kube-apiserver    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:17.014009253 +0000 UTC m=+16.227380622,LastTimestamp:2026-03-20 15:39:17.014009253 +0000 UTC m=+16.227380622,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.433927    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=<
Mar 20 15:39:47 crc kubenswrapper[4730]:         &Event{ObjectMeta:{kube-apiserver-crc.189e96d7d816b0c7  openshift-kube-apiserver    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403
Mar 20 15:39:47 crc kubenswrapper[4730]:         body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403}
Mar 20 15:39:47 crc kubenswrapper[4730]:         
Mar 20 15:39:47 crc kubenswrapper[4730]:         ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:17.018054855 +0000 UTC m=+16.231426214,LastTimestamp:2026-03-20 15:39:17.018054855 +0000 UTC m=+16.231426214,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 20 15:39:47 crc kubenswrapper[4730]:  >
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.437181    4730 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e96d7d7d8f5a5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e96d7d7d8f5a5  openshift-kube-apiserver    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:17.014009253 +0000 UTC m=+16.227380622,LastTimestamp:2026-03-20 15:39:17.018097676 +0000 UTC m=+16.231469045,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.440275    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 20 15:39:47 crc kubenswrapper[4730]:         &Event{ObjectMeta:{kube-controller-manager-crc.189e96d7f076901f  openshift-kube-controller-manager    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 20 15:39:47 crc kubenswrapper[4730]:         body:
Mar 20 15:39:47 crc kubenswrapper[4730]:         ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:17.426991135 +0000 UTC m=+16.640362504,LastTimestamp:2026-03-20 15:39:17.426991135 +0000 UTC m=+16.640362504,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 20 15:39:47 crc kubenswrapper[4730]:  >
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.443382    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e96d7f0780e29  openshift-kube-controller-manager    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:17.427088937 +0000 UTC m=+16.640460296,LastTimestamp:2026-03-20 15:39:17.427088937 +0000 UTC m=+16.640460296,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.448031    4730 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e96d7f076901f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 20 15:39:47 crc kubenswrapper[4730]:         &Event{ObjectMeta:{kube-controller-manager-crc.189e96d7f076901f  openshift-kube-controller-manager    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 20 15:39:47 crc kubenswrapper[4730]:         body:
Mar 20 15:39:47 crc kubenswrapper[4730]:         ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:17.426991135 +0000 UTC m=+16.640362504,LastTimestamp:2026-03-20 15:39:27.425503088 +0000 UTC m=+26.638874487,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 20 15:39:47 crc kubenswrapper[4730]:  >
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.451490    4730 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e96d7f0780e29\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e96d7f0780e29  openshift-kube-controller-manager    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:17.427088937 +0000 UTC m=+16.640460296,LastTimestamp:2026-03-20 15:39:27.4256045 +0000 UTC m=+26.638975909,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.454973    4730 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e96da44e951c1  openshift-kube-controller-manager    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:27.433732545 +0000 UTC m=+26.647103954,LastTimestamp:2026-03-20 15:39:27.433732545 +0000 UTC m=+26.647103954,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.458548    4730 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e96d48179d95a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e96d48179d95a  openshift-kube-controller-manager    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:02.680033626 +0000 UTC m=+1.893404995,LastTimestamp:2026-03-20 15:39:27.587768103 +0000 UTC m=+26.801139502,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.461917    4730 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e96d492a67a90\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e96d492a67a90  openshift-kube-controller-manager    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:02.968171152 +0000 UTC m=+2.181542521,LastTimestamp:2026-03-20 15:39:27.826369865 +0000 UTC m=+27.039741244,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: I0320 15:39:47.465386    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.465431    4730 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e96d49342e742\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e96d49342e742  openshift-kube-controller-manager    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:02.978422594 +0000 UTC m=+2.191793963,LastTimestamp:2026-03-20 15:39:27.840012341 +0000 UTC m=+27.053383720,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.469975    4730 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e96d7f076901f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 20 15:39:47 crc kubenswrapper[4730]:         &Event{ObjectMeta:{kube-controller-manager-crc.189e96d7f076901f  openshift-kube-controller-manager    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 20 15:39:47 crc kubenswrapper[4730]:         body:
Mar 20 15:39:47 crc kubenswrapper[4730]:         ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:17.426991135 +0000 UTC m=+16.640362504,LastTimestamp:2026-03-20 15:39:37.425645017 +0000 UTC m=+36.639016406,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 20 15:39:47 crc kubenswrapper[4730]:  >
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.473106    4730 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e96d7f0780e29\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e96d7f0780e29  openshift-kube-controller-manager    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:17.427088937 +0000 UTC m=+16.640460296,LastTimestamp:2026-03-20 15:39:37.425691458 +0000 UTC m=+36.639062837,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.476913    4730 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e96d7f076901f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 20 15:39:47 crc kubenswrapper[4730]:         &Event{ObjectMeta:{kube-controller-manager-crc.189e96d7f076901f  openshift-kube-controller-manager    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 20 15:39:47 crc kubenswrapper[4730]:         body:
Mar 20 15:39:47 crc kubenswrapper[4730]:         ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:39:17.426991135 +0000 UTC m=+16.640362504,LastTimestamp:2026-03-20 15:39:47.42511929 +0000 UTC m=+46.638490659,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 20 15:39:47 crc kubenswrapper[4730]:  >
Mar 20 15:39:47 crc kubenswrapper[4730]: W0320 15:39:47.574193    4730 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Mar 20 15:39:47 crc kubenswrapper[4730]: E0320 15:39:47.574242    4730 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 20 15:39:48 crc kubenswrapper[4730]: I0320 15:39:48.470603    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 15:39:49 crc kubenswrapper[4730]: I0320 15:39:49.470801    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 15:39:50 crc kubenswrapper[4730]: W0320 15:39:50.248653    4730 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Mar 20 15:39:50 crc kubenswrapper[4730]: E0320 15:39:50.249005    4730 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 20 15:39:50 crc kubenswrapper[4730]: I0320 15:39:50.469791    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 15:39:51 crc kubenswrapper[4730]: I0320 15:39:51.433441    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:51 crc kubenswrapper[4730]: I0320 15:39:51.435060    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:51 crc kubenswrapper[4730]: I0320 15:39:51.435138    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:51 crc kubenswrapper[4730]: I0320 15:39:51.435160    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:51 crc kubenswrapper[4730]: I0320 15:39:51.435209    4730 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 15:39:51 crc kubenswrapper[4730]: E0320 15:39:51.440686    4730 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 20 15:39:51 crc kubenswrapper[4730]: E0320 15:39:51.440747    4730 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 20 15:39:51 crc kubenswrapper[4730]: I0320 15:39:51.468783    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 15:39:51 crc kubenswrapper[4730]: E0320 15:39:51.610838    4730 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 15:39:52 crc kubenswrapper[4730]: I0320 15:39:52.469613    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 15:39:53 crc kubenswrapper[4730]: I0320 15:39:53.470420    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 15:39:54 crc kubenswrapper[4730]: I0320 15:39:54.430774    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 15:39:54 crc kubenswrapper[4730]: I0320 15:39:54.430931    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:54 crc kubenswrapper[4730]: I0320 15:39:54.431943    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:54 crc kubenswrapper[4730]: I0320 15:39:54.431967    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:54 crc kubenswrapper[4730]: I0320 15:39:54.431977    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:54 crc kubenswrapper[4730]: I0320 15:39:54.436705    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 15:39:54 crc kubenswrapper[4730]: I0320 15:39:54.472449    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 15:39:54 crc kubenswrapper[4730]: I0320 15:39:54.777969    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:54 crc kubenswrapper[4730]: I0320 15:39:54.779220    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:54 crc kubenswrapper[4730]: I0320 15:39:54.779277    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:54 crc kubenswrapper[4730]: I0320 15:39:54.779293    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:55 crc kubenswrapper[4730]: I0320 15:39:55.471494    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 15:39:56 crc kubenswrapper[4730]: I0320 15:39:56.024781    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 15:39:56 crc kubenswrapper[4730]: I0320 15:39:56.024938    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:56 crc kubenswrapper[4730]: I0320 15:39:56.025907    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:56 crc kubenswrapper[4730]: I0320 15:39:56.025932    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:56 crc kubenswrapper[4730]: I0320 15:39:56.025941    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:56 crc kubenswrapper[4730]: I0320 15:39:56.469428    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 15:39:57 crc kubenswrapper[4730]: I0320 15:39:57.468608    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 15:39:58 crc kubenswrapper[4730]: I0320 15:39:58.441744    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:58 crc kubenswrapper[4730]: I0320 15:39:58.442902    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:58 crc kubenswrapper[4730]: I0320 15:39:58.442943    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:58 crc kubenswrapper[4730]: I0320 15:39:58.442956    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:58 crc kubenswrapper[4730]: I0320 15:39:58.442982    4730 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 15:39:58 crc kubenswrapper[4730]: E0320 15:39:58.446169    4730 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 20 15:39:58 crc kubenswrapper[4730]: E0320 15:39:58.446422    4730 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 20 15:39:58 crc kubenswrapper[4730]: I0320 15:39:58.467010    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 15:39:58 crc kubenswrapper[4730]: I0320 15:39:58.532858    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:58 crc kubenswrapper[4730]: I0320 15:39:58.534019    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:58 crc kubenswrapper[4730]: I0320 15:39:58.534061    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:58 crc kubenswrapper[4730]: I0320 15:39:58.534070    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:58 crc kubenswrapper[4730]: I0320 15:39:58.534648    4730 scope.go:117] "RemoveContainer" containerID="e2a38d11938583eb373fcd731b30daf31d61c71c2ace80683efb60769ab0b694"
Mar 20 15:39:58 crc kubenswrapper[4730]: I0320 15:39:58.789122    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 20 15:39:58 crc kubenswrapper[4730]: I0320 15:39:58.790764    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5"}
Mar 20 15:39:58 crc kubenswrapper[4730]: I0320 15:39:58.790908    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:58 crc kubenswrapper[4730]: I0320 15:39:58.793412    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:58 crc kubenswrapper[4730]: I0320 15:39:58.793437    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:58 crc kubenswrapper[4730]: I0320 15:39:58.793445    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:59 crc kubenswrapper[4730]: I0320 15:39:59.470962    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 15:39:59 crc kubenswrapper[4730]: I0320 15:39:59.794374    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 20 15:39:59 crc kubenswrapper[4730]: I0320 15:39:59.794862    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 20 15:39:59 crc kubenswrapper[4730]: I0320 15:39:59.797160    4730 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5" exitCode=255
Mar 20 15:39:59 crc kubenswrapper[4730]: I0320 15:39:59.797201    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5"}
Mar 20 15:39:59 crc kubenswrapper[4730]: I0320 15:39:59.797239    4730 scope.go:117] "RemoveContainer" containerID="e2a38d11938583eb373fcd731b30daf31d61c71c2ace80683efb60769ab0b694"
Mar 20 15:39:59 crc kubenswrapper[4730]: I0320 15:39:59.797439    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:39:59 crc kubenswrapper[4730]: I0320 15:39:59.798589    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:39:59 crc kubenswrapper[4730]: I0320 15:39:59.798702    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:39:59 crc kubenswrapper[4730]: I0320 15:39:59.798731    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:39:59 crc kubenswrapper[4730]: I0320 15:39:59.799630    4730 scope.go:117] "RemoveContainer" containerID="688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5"
Mar 20 15:39:59 crc kubenswrapper[4730]: E0320 15:39:59.799921    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 15:40:00 crc kubenswrapper[4730]: I0320 15:40:00.469830    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 15:40:00 crc kubenswrapper[4730]: I0320 15:40:00.800841    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 20 15:40:01 crc kubenswrapper[4730]: I0320 15:40:01.469712    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 15:40:01 crc kubenswrapper[4730]: E0320 15:40:01.611775    4730 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 15:40:02 crc kubenswrapper[4730]: I0320 15:40:02.471077    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 15:40:02 crc kubenswrapper[4730]: I0320 15:40:02.864939    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:40:02 crc kubenswrapper[4730]: I0320 15:40:02.865412    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:40:02 crc kubenswrapper[4730]: I0320 15:40:02.866889    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:02 crc kubenswrapper[4730]: I0320 15:40:02.866994    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:02 crc kubenswrapper[4730]: I0320 15:40:02.867057    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:02 crc kubenswrapper[4730]: I0320 15:40:02.867677    4730 scope.go:117] "RemoveContainer" containerID="688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5"
Mar 20 15:40:02 crc kubenswrapper[4730]: E0320 15:40:02.867902    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 15:40:03 crc kubenswrapper[4730]: I0320 15:40:03.470047    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 15:40:03 crc kubenswrapper[4730]: I0320 15:40:03.967369    4730 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:40:03 crc kubenswrapper[4730]: I0320 15:40:03.967567    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:40:03 crc kubenswrapper[4730]: I0320 15:40:03.968689    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:03 crc kubenswrapper[4730]: I0320 15:40:03.968738    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:03 crc kubenswrapper[4730]: I0320 15:40:03.968756    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:03 crc kubenswrapper[4730]: I0320 15:40:03.969569    4730 scope.go:117] "RemoveContainer" containerID="688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5"
Mar 20 15:40:03 crc kubenswrapper[4730]: E0320 15:40:03.969863    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 15:40:04 crc kubenswrapper[4730]: I0320 15:40:04.469975    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 15:40:05 crc kubenswrapper[4730]: I0320 15:40:05.446230    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:40:05 crc kubenswrapper[4730]: I0320 15:40:05.447120    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:05 crc kubenswrapper[4730]: I0320 15:40:05.447170    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:05 crc kubenswrapper[4730]: I0320 15:40:05.447184    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:05 crc kubenswrapper[4730]: I0320 15:40:05.447212    4730 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 15:40:05 crc kubenswrapper[4730]: E0320 15:40:05.450876    4730 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 20 15:40:05 crc kubenswrapper[4730]: E0320 15:40:05.450920    4730 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 20 15:40:05 crc kubenswrapper[4730]: I0320 15:40:05.468341    4730 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 15:40:06 crc kubenswrapper[4730]: I0320 15:40:06.086341    4730 csr.go:261] certificate signing request csr-72rl6 is approved, waiting to be issued
Mar 20 15:40:06 crc kubenswrapper[4730]: I0320 15:40:06.093759    4730 csr.go:257] certificate signing request csr-72rl6 is issued
Mar 20 15:40:06 crc kubenswrapper[4730]: I0320 15:40:06.151617    4730 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 20 15:40:06 crc kubenswrapper[4730]: I0320 15:40:06.261059    4730 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 20 15:40:07 crc kubenswrapper[4730]: I0320 15:40:07.095156    4730 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-09 20:16:13.685260601 +0000 UTC
Mar 20 15:40:07 crc kubenswrapper[4730]: I0320 15:40:07.095202    4730 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6340h36m6.5900616s for next certificate rotation
Mar 20 15:40:11 crc kubenswrapper[4730]: E0320 15:40:11.612862    4730 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.451722    4730 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.453022    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.453059    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.453071    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.453183    4730 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.456930    4730 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.460727    4730 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.461093    4730 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.462760    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.462795    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.462807    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.462826    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.462838    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:12Z","lastTransitionTime":"2026-03-20T15:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.480794    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.485527    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.485652    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.485678    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.485708    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.485731    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:12Z","lastTransitionTime":"2026-03-20T15:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.493571    4730 apiserver.go:52] "Watching apiserver"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.501861    4730 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.502698    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-6r2kn","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-operator/iptables-alerter-4ln5h","openshift-machine-config-operator/machine-config-daemon-p5qvf","openshift-image-registry/node-ca-n4w74","openshift-multus/multus-additional-cni-plugins-49hht","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh","openshift-dns/node-resolver-69fnw","openshift-ovn-kubernetes/ovnkube-node-qj97f"]
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.504463    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.504998    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.505520    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.505706    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.506084    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.506271    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.506506    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.507018    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.507674    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.507782    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.507915    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.508679    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-n4w74"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.509212    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.509488    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.509653    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.509788    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-69fnw"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.508689    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-49hht"
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.510745    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.512802    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.512844    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.512910    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.513394    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.513795    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.514160    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.515297    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.515491    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.515520    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.515652    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.515775    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.518102    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.524662    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.524821    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.525269    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.525521    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.525808    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.526180    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.525406    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.526198    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.526459    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.526495    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.526549    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.526578    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.526626    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.526575    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.526882    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.526933    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.527798    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.528547    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.528722    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.528812    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.528855    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.528911    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.529011    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.529309    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.530184    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.532967    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.533643    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.533748    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.533819    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.533881    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:12Z","lastTransitionTime":"2026-03-20T15:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.552244    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.556212    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.563770    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.563828    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.563856    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.563891    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.563916    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:12Z","lastTransitionTime":"2026-03-20T15:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.571327    4730 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.572998    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.582060    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.585613    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.585643    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.585652    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.585665    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.585675    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:12Z","lastTransitionTime":"2026-03-20T15:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.588187    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.607484    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.607639    4730 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.607887    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.611377    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.611447    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.611474    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.611497    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.611518    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.611537    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.611557    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.611581    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.611623    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.611645    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.611670    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.611691    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.611710    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.611730    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.611751    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.611770    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.611785    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.611804    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.611826    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.611846    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.611867    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.611889    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.611912    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.611934    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.611955    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.611978    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.612000    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.612024    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.612044    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.612064    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.612085    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.612106    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.612128    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.612149    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.612171    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.612193    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.612213    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.612234    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.612295    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.612320    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.612341    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.612361    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.612380    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.612400    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.612422    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.612444    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.612466    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.612486    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.612506    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.612526    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.612546    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.612566    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.612586    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.612617    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.612640    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.612658    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.612679    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.612700    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.617066    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.617241    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.617654    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.617754    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.617986    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.618112    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.618174    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.618371    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.618749    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.618872    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.619073    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.619075    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.619271    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.619454    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.619728    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.619785    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.619923    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.620103    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.620111    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.620500    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.620468    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.620832    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.621746    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.622078    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.622506    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.622514    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.622562    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.622574    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.622588    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.622597    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:12Z","lastTransitionTime":"2026-03-20T15:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.622869    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.622914    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.623400    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.623415    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.623986    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.624072    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.624427    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.624707    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.624826    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.625146    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.625562    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.626188    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.626688    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.627633    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.627777    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.628032    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.628345    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.628594    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.628732    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.628835    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.628901    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.629194    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.629412    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.629564    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.629895    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.629981    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.630149    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.630207    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.630237    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.630271    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.630289    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.630306    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.630322    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.630338    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.630354    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.630369    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.630384    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.630394    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.630404    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.630519    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.630545    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.630753    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.630779    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.630799    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.630844    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.630866    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.630888    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.630905    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.630930    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.630951    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.630991    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.631014    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.631034    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.631076    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.631099    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.631144    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.631163    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.631225    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.631267    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.631290    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.631313    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.631350    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.631370    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.631391    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.631432    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.631451    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.631472    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.631512    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.631534    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.631552    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.631605    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.631626    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.631379    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.631968    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.632354    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.632403    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.632703    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.632742    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.632971    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.633009    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.633191    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.633212    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.633500    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.633514    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.633914    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.634025    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.634193    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.634486    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.634829    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.634818    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.635344    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.636406    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.636490    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.636717    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.636867    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.636980    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.637185    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.637228    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.637407    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.637736    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.638005    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.638022    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.638011    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.638082    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.638199    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.638444    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.638580    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.638851    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.638720    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.638993    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.639372    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.639465    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.640611    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.640658    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.640686    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.646012    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.645946    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.646173    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.646351    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.646389    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.646411    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.646430    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.646690    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.646728    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.646747    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.647217    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.647385    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.647549    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.647621    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.647702    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.647783    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.647856    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.647934    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.648009    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.648085    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.648151    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.648222    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.648322    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.648392    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.648491    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.648584    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.648671    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.648761    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.648851    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.648926    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.648993    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.649060    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.649200    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.649284    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.649360    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.649431    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.649571    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.649661    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.649737    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.649808    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.649875    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.649939    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.650099    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.650166    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.650349    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.650447    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.650545    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.650625    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.650700    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.650771    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.650838    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.650909    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.651033    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.651100    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.651166    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.651232    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.651338    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.651405    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.651470    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.651540    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.651608    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.651680    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.651783    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.651881    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.651973    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.652072    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.652171    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.652299    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.652416    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.652527    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.652637    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.652743    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.652850    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.652955    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.653081    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.653225    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.653456    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.653586    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.653692    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.653793    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.653890    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.653997    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.654113    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.654213    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.654332    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.654458    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.654646    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.654750    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.654851    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.654950    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.655045    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.655145    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.655274    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.655434    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.655801    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.647233    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.648330    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.649080    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.649978    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.652089    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.652421    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.652635    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.652913    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.654042    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.654236    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.654446    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.654774    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.655065    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.655339    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.655978    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.657063    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.657560    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.657789    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.657833    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.657852    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.658127    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.658305    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.658429    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.658477    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.658596    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.658679    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.658666    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.658942    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.658974    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.659063    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.659569    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-host-run-netns\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.659622    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c4b4e0e8-af33-491e-b1d1-31079d90c656-ovn-node-metrics-cert\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.659671    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.658889    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.660179    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.660282    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dbb015c0-3a11-48bf-a59f-22bc03ca2fb9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-49hht\" (UID: \"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\") " pod="openshift-multus/multus-additional-cni-plugins-49hht"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.660374    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.660403    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.660485    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.660528    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.660642    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.660858    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.663090    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.663401    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.663434    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.663530    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.663571    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.663589    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.663737    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.663767    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.663790    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.663918    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.664196    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.664277    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:40:13.164238304 +0000 UTC m=+72.377609753 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.664326    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.664203    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.664453    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-system-cni-dir\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.664751    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.664818    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7fcd3db3-55f1-4c23-8fa9-78844495cea3-rootfs\") pod \"machine-config-daemon-p5qvf\" (UID: \"7fcd3db3-55f1-4c23-8fa9-78844495cea3\") " pod="openshift-machine-config-operator/machine-config-daemon-p5qvf"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.665076    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.665399    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.665424    4730 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.665517    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.665425    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.665632    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.665658    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.665688    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-cni-bin\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.665716    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-p47zh\" (UID: \"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.665743    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-multus-daemon-config\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.665778    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.665787    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.665801    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-run-netns\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.665828    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-var-lib-openvswitch\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.665855    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-cni-binary-copy\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.665867    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.665880    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-cnibin\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.665902    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0-env-overrides\") pod \"ovnkube-control-plane-749d76644c-p47zh\" (UID: \"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.665926    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-etc-openvswitch\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.665956    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz64b\" (UniqueName: \"kubernetes.io/projected/c4b4e0e8-af33-491e-b1d1-31079d90c656-kube-api-access-mz64b\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.665986    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dbb015c0-3a11-48bf-a59f-22bc03ca2fb9-cni-binary-copy\") pod \"multus-additional-cni-plugins-49hht\" (UID: \"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\") " pod="openshift-multus/multus-additional-cni-plugins-49hht"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.666009    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.666043    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-run-ovn\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.666102    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvthz\" (UniqueName: \"kubernetes.io/projected/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-kube-api-access-vvthz\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.666130    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4xpw\" (UniqueName: \"kubernetes.io/projected/a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0-kube-api-access-d4xpw\") pod \"ovnkube-control-plane-749d76644c-p47zh\" (UID: \"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.666158    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.666192    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.666210    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-multus-socket-dir-parent\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.666232    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.666237    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-host-var-lib-cni-bin\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.666350    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-host-run-multus-certs\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.666375    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-kubelet\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.666398    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dbb015c0-3a11-48bf-a59f-22bc03ca2fb9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-49hht\" (UID: \"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\") " pod="openshift-multus/multus-additional-cni-plugins-49hht"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.666523    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-p47zh\" (UID: \"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.666566    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/102cb977-7291-453e-9282-20572071afee-hosts-file\") pod \"node-resolver-69fnw\" (UID: \"102cb977-7291-453e-9282-20572071afee\") " pod="openshift-dns/node-resolver-69fnw"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.666617    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.666642    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-run-ovn-kubernetes\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.666721    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.666822    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-cni-netd\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.666831    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.666846    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4b4e0e8-af33-491e-b1d1-31079d90c656-env-overrides\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.666870    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qtg2\" (UniqueName: \"kubernetes.io/projected/dbb015c0-3a11-48bf-a59f-22bc03ca2fb9-kube-api-access-4qtg2\") pod \"multus-additional-cni-plugins-49hht\" (UID: \"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\") " pod="openshift-multus/multus-additional-cni-plugins-49hht"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.666902    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-multus-cni-dir\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.666921    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-os-release\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.666934    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.666942    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-multus-conf-dir\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.667137    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-node-log\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.667172    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dbb015c0-3a11-48bf-a59f-22bc03ca2fb9-os-release\") pod \"multus-additional-cni-plugins-49hht\" (UID: \"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\") " pod="openshift-multus/multus-additional-cni-plugins-49hht"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.667196    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7fcd3db3-55f1-4c23-8fa9-78844495cea3-mcd-auth-proxy-config\") pod \"machine-config-daemon-p5qvf\" (UID: \"7fcd3db3-55f1-4c23-8fa9-78844495cea3\") " pod="openshift-machine-config-operator/machine-config-daemon-p5qvf"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.667217    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2ee8d55f-90bd-4484-8455-933de455efea-serviceca\") pod \"node-ca-n4w74\" (UID: \"2ee8d55f-90bd-4484-8455-933de455efea\") " pod="openshift-image-registry/node-ca-n4w74"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.667258    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.667283    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.667291    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.667305    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-slash\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.667331    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7fcd3db3-55f1-4c23-8fa9-78844495cea3-proxy-tls\") pod \"machine-config-daemon-p5qvf\" (UID: \"7fcd3db3-55f1-4c23-8fa9-78844495cea3\") " pod="openshift-machine-config-operator/machine-config-daemon-p5qvf"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.667351    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-host-var-lib-cni-multus\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.667373    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-hostroot\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.667393    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-run-systemd\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.667412    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-run-openvswitch\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.667432    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dbb015c0-3a11-48bf-a59f-22bc03ca2fb9-cnibin\") pod \"multus-additional-cni-plugins-49hht\" (UID: \"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\") " pod="openshift-multus/multus-additional-cni-plugins-49hht"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.667454    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-host-var-lib-kubelet\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.667478    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-systemd-units\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.667497    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-log-socket\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.667520    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.667548    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.667572    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dbb015c0-3a11-48bf-a59f-22bc03ca2fb9-system-cni-dir\") pod \"multus-additional-cni-plugins-49hht\" (UID: \"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\") " pod="openshift-multus/multus-additional-cni-plugins-49hht"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.667373    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.667432    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.667588    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.667567    4730 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.667658    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:40:13.167642267 +0000 UTC m=+72.381013636 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.667684    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.667762    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-etc-kubernetes\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.667803    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c4b4e0e8-af33-491e-b1d1-31079d90c656-ovnkube-config\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.667821    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plthx\" (UniqueName: \"kubernetes.io/projected/102cb977-7291-453e-9282-20572071afee-kube-api-access-plthx\") pod \"node-resolver-69fnw\" (UID: \"102cb977-7291-453e-9282-20572071afee\") " pod="openshift-dns/node-resolver-69fnw"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.667808    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.667862    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.667923    4730 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.667933    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.667974    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:40:13.167960866 +0000 UTC m=+72.381332365 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.668004    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2ee8d55f-90bd-4484-8455-933de455efea-host\") pod \"node-ca-n4w74\" (UID: \"2ee8d55f-90bd-4484-8455-933de455efea\") " pod="openshift-image-registry/node-ca-n4w74"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.668036    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fvg6\" (UniqueName: \"kubernetes.io/projected/2ee8d55f-90bd-4484-8455-933de455efea-kube-api-access-2fvg6\") pod \"node-ca-n4w74\" (UID: \"2ee8d55f-90bd-4484-8455-933de455efea\") " pod="openshift-image-registry/node-ca-n4w74"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.668062    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzk8j\" (UniqueName: \"kubernetes.io/projected/7fcd3db3-55f1-4c23-8fa9-78844495cea3-kube-api-access-lzk8j\") pod \"machine-config-daemon-p5qvf\" (UID: \"7fcd3db3-55f1-4c23-8fa9-78844495cea3\") " pod="openshift-machine-config-operator/machine-config-daemon-p5qvf"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.668090    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.668114    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c4b4e0e8-af33-491e-b1d1-31079d90c656-ovnkube-script-lib\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.668141    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.668155    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.668172    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-host-run-k8s-cni-cncf-io\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.668318    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.668415    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.668608    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.669171    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.669198    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.669235    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.669731    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.670042    4730 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.670060    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.670071    4730 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.670082    4730 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.670083    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.670084    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.670095    4730 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.670134    4730 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.670148    4730 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.670160    4730 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.670171    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.670181    4730 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.670190    4730 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.670199    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.670209    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.670219    4730 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.670228    4730 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.670236    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.660819    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.670271    4730 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.670299    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.670310    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.670323    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.670343    4730 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.670353    4730 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.670363    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.670374    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.670384    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.670394    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.670433    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.670389    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.670489    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.670694    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.660921    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.660952    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.671178    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.661310    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.674947    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675023    4730 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675041    4730 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675051    4730 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675071    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675090    4730 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675101    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675111    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675120    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675131    4730 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675140    4730 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675149    4730 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675162    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675171    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675180    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675239    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675292    4730 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675306    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675314    4730 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675323    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675332    4730 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675341    4730 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675351    4730 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675360    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675368    4730 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675379    4730 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675388    4730 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675397    4730 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675406    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675415    4730 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675423    4730 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675433    4730 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675440    4730 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675448    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675457    4730 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675465    4730 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675473    4730 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675489    4730 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675498    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675506    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.675515    4730 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676091    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676211    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676222    4730 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676231    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676239    4730 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676261    4730 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676269    4730 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676282    4730 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676295    4730 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676308    4730 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676320    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676333    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676344    4730 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676355    4730 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676365    4730 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676373    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676382    4730 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676390    4730 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676398    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676407    4730 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676415    4730 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676423    4730 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676431    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676439    4730 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676446    4730 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676455    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676463    4730 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676471    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676481    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676489    4730 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676498    4730 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676507    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676516    4730 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676524    4730 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676533    4730 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676541    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676550    4730 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676559    4730 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676566    4730 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676574    4730 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676582    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676589    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676600    4730 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676608    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676616    4730 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676626    4730 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676634    4730 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676643    4730 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676651    4730 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676659    4730 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676669    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676678    4730 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676686    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676695    4730 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676703    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676711    4730 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676719    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676727    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676735    4730 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676743    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676752    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676760    4730 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676768    4730 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676776    4730 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676785    4730 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676794    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676803    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676811    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676820    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676828    4730 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676837    4730 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676844    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676853    4730 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676862    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676873    4730 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676883    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676892    4730 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676901    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676909    4730 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676918    4730 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676926    4730 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676934    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676944    4730 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676952    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676960    4730 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676969    4730 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676977    4730 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676985    4730 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.676993    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.677001    4730 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.677010    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.677018    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.677027    4730 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.677035    4730 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.677044    4730 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.677052    4730 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.677061    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.677069    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.677079    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.677086    4730 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.677094    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.677102    4730 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.678131    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.681349    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.681493    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.681509    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.681520    4730 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.681571    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 15:40:13.181554725 +0000 UTC m=+72.394926094 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.681993    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.682901    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.683340    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.683366    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.683580    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.684002    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.684105    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.684125    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.684142    4730 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.684194    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 15:40:13.184175446 +0000 UTC m=+72.397546815 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.684233    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.684875    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.685604    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.687384    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.691504    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.693682    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.696818    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.700183    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.703504    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.708629    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.710981    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.719808    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.724934    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.724967    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.724977    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.724992    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.725002    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:12Z","lastTransitionTime":"2026-03-20T15:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.727585    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.741428    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.751149    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.755854    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-2prfn"]
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.756376    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.756431    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.759731    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.768303    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.777025    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.777441    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-multus-daemon-config\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.777483    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-run-netns\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.777507    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-var-lib-openvswitch\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.777528    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-cnibin\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.777543    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-cni-binary-copy\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.777558    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dbb015c0-3a11-48bf-a59f-22bc03ca2fb9-cni-binary-copy\") pod \"multus-additional-cni-plugins-49hht\" (UID: \"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\") " pod="openshift-multus/multus-additional-cni-plugins-49hht"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.777575    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0-env-overrides\") pod \"ovnkube-control-plane-749d76644c-p47zh\" (UID: \"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.777589    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-etc-openvswitch\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.777605    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz64b\" (UniqueName: \"kubernetes.io/projected/c4b4e0e8-af33-491e-b1d1-31079d90c656-kube-api-access-mz64b\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.777619    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.777636    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvthz\" (UniqueName: \"kubernetes.io/projected/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-kube-api-access-vvthz\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.777655    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-run-ovn\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.777672    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-kubelet\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.777691    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dbb015c0-3a11-48bf-a59f-22bc03ca2fb9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-49hht\" (UID: \"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\") " pod="openshift-multus/multus-additional-cni-plugins-49hht"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.777709    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-p47zh\" (UID: \"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.777724    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4xpw\" (UniqueName: \"kubernetes.io/projected/a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0-kube-api-access-d4xpw\") pod \"ovnkube-control-plane-749d76644c-p47zh\" (UID: \"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.777740    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-multus-socket-dir-parent\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.777739    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-run-netns\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.777792    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-host-var-lib-cni-bin\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.777755    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-host-var-lib-cni-bin\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.777831    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-etc-openvswitch\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.777852    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-host-run-multus-certs\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.777926    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs\") pod \"network-metrics-daemon-2prfn\" (UID: \"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\") " pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.777977    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qtg2\" (UniqueName: \"kubernetes.io/projected/dbb015c0-3a11-48bf-a59f-22bc03ca2fb9-kube-api-access-4qtg2\") pod \"multus-additional-cni-plugins-49hht\" (UID: \"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\") " pod="openshift-multus/multus-additional-cni-plugins-49hht"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778012    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/102cb977-7291-453e-9282-20572071afee-hosts-file\") pod \"node-resolver-69fnw\" (UID: \"102cb977-7291-453e-9282-20572071afee\") " pod="openshift-dns/node-resolver-69fnw"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778051    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-run-ovn-kubernetes\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778055    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-host-run-multus-certs\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778081    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-cni-netd\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778117    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-cni-netd\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778140    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4b4e0e8-af33-491e-b1d1-31079d90c656-env-overrides\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778201    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dbb015c0-3a11-48bf-a59f-22bc03ca2fb9-os-release\") pod \"multus-additional-cni-plugins-49hht\" (UID: \"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\") " pod="openshift-multus/multus-additional-cni-plugins-49hht"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778213    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0-env-overrides\") pod \"ovnkube-control-plane-749d76644c-p47zh\" (UID: \"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778226    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778221    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7fcd3db3-55f1-4c23-8fa9-78844495cea3-mcd-auth-proxy-config\") pod \"machine-config-daemon-p5qvf\" (UID: \"7fcd3db3-55f1-4c23-8fa9-78844495cea3\") " pod="openshift-machine-config-operator/machine-config-daemon-p5qvf"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778295    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-multus-cni-dir\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778351    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-os-release\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778370    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-multus-conf-dir\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778387    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-node-log\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778433    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7xml\" (UniqueName: \"kubernetes.io/projected/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-kube-api-access-m7xml\") pod \"network-metrics-daemon-2prfn\" (UID: \"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\") " pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778437    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-run-ovn\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778457    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2ee8d55f-90bd-4484-8455-933de455efea-serviceca\") pod \"node-ca-n4w74\" (UID: \"2ee8d55f-90bd-4484-8455-933de455efea\") " pod="openshift-image-registry/node-ca-n4w74"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778453    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-run-ovn-kubernetes\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778480    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-kubelet\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778465    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/102cb977-7291-453e-9282-20572071afee-hosts-file\") pod \"node-resolver-69fnw\" (UID: \"102cb977-7291-453e-9282-20572071afee\") " pod="openshift-dns/node-resolver-69fnw"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778508    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-slash\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778527    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dbb015c0-3a11-48bf-a59f-22bc03ca2fb9-cnibin\") pod \"multus-additional-cni-plugins-49hht\" (UID: \"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\") " pod="openshift-multus/multus-additional-cni-plugins-49hht"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778549    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7fcd3db3-55f1-4c23-8fa9-78844495cea3-proxy-tls\") pod \"machine-config-daemon-p5qvf\" (UID: \"7fcd3db3-55f1-4c23-8fa9-78844495cea3\") " pod="openshift-machine-config-operator/machine-config-daemon-p5qvf"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778664    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-multus-socket-dir-parent\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778682    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-host-var-lib-cni-multus\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778707    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-hostroot\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778752    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-run-systemd\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778776    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-run-openvswitch\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778830    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dbb015c0-3a11-48bf-a59f-22bc03ca2fb9-system-cni-dir\") pod \"multus-additional-cni-plugins-49hht\" (UID: \"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\") " pod="openshift-multus/multus-additional-cni-plugins-49hht"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778853    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-host-var-lib-kubelet\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778898    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-cni-binary-copy\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778915    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-p47zh\" (UID: \"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778956    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-systemd-units\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778987    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-host-var-lib-cni-multus\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.779017    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-multus-conf-dir\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.779018    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-multus-cni-dir\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.779043    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-node-log\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.779060    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-log-socket\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.779079    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dbb015c0-3a11-48bf-a59f-22bc03ca2fb9-os-release\") pod \"multus-additional-cni-plugins-49hht\" (UID: \"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\") " pod="openshift-multus/multus-additional-cni-plugins-49hht"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.779081    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.779121    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plthx\" (UniqueName: \"kubernetes.io/projected/102cb977-7291-453e-9282-20572071afee-kube-api-access-plthx\") pod \"node-resolver-69fnw\" (UID: \"102cb977-7291-453e-9282-20572071afee\") " pod="openshift-dns/node-resolver-69fnw"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.779157    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-etc-kubernetes\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.779177    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c4b4e0e8-af33-491e-b1d1-31079d90c656-ovnkube-config\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.779198    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c4b4e0e8-af33-491e-b1d1-31079d90c656-ovnkube-script-lib\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.779217    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4b4e0e8-af33-491e-b1d1-31079d90c656-env-overrides\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.779225    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2ee8d55f-90bd-4484-8455-933de455efea-host\") pod \"node-ca-n4w74\" (UID: \"2ee8d55f-90bd-4484-8455-933de455efea\") " pod="openshift-image-registry/node-ca-n4w74"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778300    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-cnibin\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.779265    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fvg6\" (UniqueName: \"kubernetes.io/projected/2ee8d55f-90bd-4484-8455-933de455efea-kube-api-access-2fvg6\") pod \"node-ca-n4w74\" (UID: \"2ee8d55f-90bd-4484-8455-933de455efea\") " pod="openshift-image-registry/node-ca-n4w74"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.779291    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzk8j\" (UniqueName: \"kubernetes.io/projected/7fcd3db3-55f1-4c23-8fa9-78844495cea3-kube-api-access-lzk8j\") pod \"machine-config-daemon-p5qvf\" (UID: \"7fcd3db3-55f1-4c23-8fa9-78844495cea3\") " pod="openshift-machine-config-operator/machine-config-daemon-p5qvf"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.779310    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-os-release\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.779329    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-host-run-k8s-cni-cncf-io\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.779354    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dbb015c0-3a11-48bf-a59f-22bc03ca2fb9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-49hht\" (UID: \"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\") " pod="openshift-multus/multus-additional-cni-plugins-49hht"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.779376    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-system-cni-dir\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.779398    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-host-run-netns\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.779421    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c4b4e0e8-af33-491e-b1d1-31079d90c656-ovn-node-metrics-cert\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.779446    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-p47zh\" (UID: \"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.779471    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7fcd3db3-55f1-4c23-8fa9-78844495cea3-rootfs\") pod \"machine-config-daemon-p5qvf\" (UID: \"7fcd3db3-55f1-4c23-8fa9-78844495cea3\") " pod="openshift-machine-config-operator/machine-config-daemon-p5qvf"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.779476    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.779505    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.779529    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.779542    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-cni-bin\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.779731    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.779741    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dbb015c0-3a11-48bf-a59f-22bc03ca2fb9-cni-binary-copy\") pod \"multus-additional-cni-plugins-49hht\" (UID: \"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\") " pod="openshift-multus/multus-additional-cni-plugins-49hht"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.779771    4730 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.778351    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-multus-daemon-config\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.777676    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-var-lib-openvswitch\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.779933    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dbb015c0-3a11-48bf-a59f-22bc03ca2fb9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-49hht\" (UID: \"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\") " pod="openshift-multus/multus-additional-cni-plugins-49hht"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.779937    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2ee8d55f-90bd-4484-8455-933de455efea-serviceca\") pod \"node-ca-n4w74\" (UID: \"2ee8d55f-90bd-4484-8455-933de455efea\") " pod="openshift-image-registry/node-ca-n4w74"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.780100    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.780123    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-etc-kubernetes\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.780150    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2ee8d55f-90bd-4484-8455-933de455efea-host\") pod \"node-ca-n4w74\" (UID: \"2ee8d55f-90bd-4484-8455-933de455efea\") " pod="openshift-image-registry/node-ca-n4w74"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.780147    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7fcd3db3-55f1-4c23-8fa9-78844495cea3-mcd-auth-proxy-config\") pod \"machine-config-daemon-p5qvf\" (UID: \"7fcd3db3-55f1-4c23-8fa9-78844495cea3\") " pod="openshift-machine-config-operator/machine-config-daemon-p5qvf"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.780176    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7fcd3db3-55f1-4c23-8fa9-78844495cea3-rootfs\") pod \"machine-config-daemon-p5qvf\" (UID: \"7fcd3db3-55f1-4c23-8fa9-78844495cea3\") " pod="openshift-machine-config-operator/machine-config-daemon-p5qvf"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.780175    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-host-run-k8s-cni-cncf-io\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.780325    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-system-cni-dir\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.780489    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-cni-bin\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.780486    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-log-socket\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.780528    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-host-run-netns\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.780538    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-host-var-lib-kubelet\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.780550    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-systemd-units\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.780561    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-slash\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.780565    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-hostroot\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.780586    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dbb015c0-3a11-48bf-a59f-22bc03ca2fb9-cnibin\") pod \"multus-additional-cni-plugins-49hht\" (UID: \"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\") " pod="openshift-multus/multus-additional-cni-plugins-49hht"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.780592    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-run-systemd\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.780609    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-run-openvswitch\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.781065    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c4b4e0e8-af33-491e-b1d1-31079d90c656-ovnkube-config\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.780712    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dbb015c0-3a11-48bf-a59f-22bc03ca2fb9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-49hht\" (UID: \"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\") " pod="openshift-multus/multus-additional-cni-plugins-49hht"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.782055    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c4b4e0e8-af33-491e-b1d1-31079d90c656-ovnkube-script-lib\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.782547    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.782571    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.782663    4730 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.789652    4730 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.789693    4730 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.789705    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.789733    4730 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.789757    4730 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.789767    4730 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.789801    4730 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.789814    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.790166    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.790185    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.790195    4730 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.790203    4730 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.790213    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.790335    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.790358    4730 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.790368    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.790378    4730 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.790392    4730 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.790405    4730 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.790418    4730 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.790433    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.790446    4730 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.790458    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.786030    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7fcd3db3-55f1-4c23-8fa9-78844495cea3-proxy-tls\") pod \"machine-config-daemon-p5qvf\" (UID: \"7fcd3db3-55f1-4c23-8fa9-78844495cea3\") " pod="openshift-machine-config-operator/machine-config-daemon-p5qvf"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.789908    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-p47zh\" (UID: \"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.783439    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c4b4e0e8-af33-491e-b1d1-31079d90c656-ovn-node-metrics-cert\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.782026    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dbb015c0-3a11-48bf-a59f-22bc03ca2fb9-system-cni-dir\") pod \"multus-additional-cni-plugins-49hht\" (UID: \"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\") " pod="openshift-multus/multus-additional-cni-plugins-49hht"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.792637    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzk8j\" (UniqueName: \"kubernetes.io/projected/7fcd3db3-55f1-4c23-8fa9-78844495cea3-kube-api-access-lzk8j\") pod \"machine-config-daemon-p5qvf\" (UID: \"7fcd3db3-55f1-4c23-8fa9-78844495cea3\") " pod="openshift-machine-config-operator/machine-config-daemon-p5qvf"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.794855    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qtg2\" (UniqueName: \"kubernetes.io/projected/dbb015c0-3a11-48bf-a59f-22bc03ca2fb9-kube-api-access-4qtg2\") pod \"multus-additional-cni-plugins-49hht\" (UID: \"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\") " pod="openshift-multus/multus-additional-cni-plugins-49hht"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.795738    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz64b\" (UniqueName: \"kubernetes.io/projected/c4b4e0e8-af33-491e-b1d1-31079d90c656-kube-api-access-mz64b\") pod \"ovnkube-node-qj97f\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") " pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.796021    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.796550    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4xpw\" (UniqueName: \"kubernetes.io/projected/a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0-kube-api-access-d4xpw\") pod \"ovnkube-control-plane-749d76644c-p47zh\" (UID: \"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.799565    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fvg6\" (UniqueName: \"kubernetes.io/projected/2ee8d55f-90bd-4484-8455-933de455efea-kube-api-access-2fvg6\") pod \"node-ca-n4w74\" (UID: \"2ee8d55f-90bd-4484-8455-933de455efea\") " pod="openshift-image-registry/node-ca-n4w74"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.799648    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvthz\" (UniqueName: \"kubernetes.io/projected/6f97b1f1-1fad-44ec-8253-17dd6a5eee54-kube-api-access-vvthz\") pod \"multus-6r2kn\" (UID: \"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\") " pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.807665    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plthx\" (UniqueName: \"kubernetes.io/projected/102cb977-7291-453e-9282-20572071afee-kube-api-access-plthx\") pod \"node-resolver-69fnw\" (UID: \"102cb977-7291-453e-9282-20572071afee\") " pod="openshift-dns/node-resolver-69fnw"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.807397    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.817919    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.826093    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.827786    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.827816    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.827826    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.827839    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.827849    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:12Z","lastTransitionTime":"2026-03-20T15:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.831471    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.837655    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.839271    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.845960    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.847362    4730 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 15:40:12 crc kubenswrapper[4730]:         container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash
Mar 20 15:40:12 crc kubenswrapper[4730]:         set -euo pipefail
Mar 20 15:40:12 crc kubenswrapper[4730]:         TLS_PK=/etc/pki/tls/metrics-cert/tls.key
Mar 20 15:40:12 crc kubenswrapper[4730]:         TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt
Mar 20 15:40:12 crc kubenswrapper[4730]:         # As the secret mount is optional we must wait for the files to be present.
Mar 20 15:40:12 crc kubenswrapper[4730]:         # The service is created in monitor.yaml and this is created in sdn.yaml.
Mar 20 15:40:12 crc kubenswrapper[4730]:         TS=$(date +%s)
Mar 20 15:40:12 crc kubenswrapper[4730]:         WARN_TS=$(( ${TS} + $(( 20 * 60)) ))
Mar 20 15:40:12 crc kubenswrapper[4730]:         HAS_LOGGED_INFO=0
Mar 20 15:40:12 crc kubenswrapper[4730]:         
Mar 20 15:40:12 crc kubenswrapper[4730]:         log_missing_certs(){
Mar 20 15:40:12 crc kubenswrapper[4730]:             CUR_TS=$(date +%s)
Mar 20 15:40:12 crc kubenswrapper[4730]:             if [[ "${CUR_TS}" -gt "WARN_TS"  ]]; then
Mar 20 15:40:12 crc kubenswrapper[4730]:               echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes.
Mar 20 15:40:12 crc kubenswrapper[4730]:             elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then
Mar 20 15:40:12 crc kubenswrapper[4730]:               echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes.
Mar 20 15:40:12 crc kubenswrapper[4730]:               HAS_LOGGED_INFO=1
Mar 20 15:40:12 crc kubenswrapper[4730]:             fi
Mar 20 15:40:12 crc kubenswrapper[4730]:         }
Mar 20 15:40:12 crc kubenswrapper[4730]:         while [[ ! -f "${TLS_PK}" ||  ! -f "${TLS_CERT}" ]] ; do
Mar 20 15:40:12 crc kubenswrapper[4730]:           log_missing_certs
Mar 20 15:40:12 crc kubenswrapper[4730]:           sleep 5
Mar 20 15:40:12 crc kubenswrapper[4730]:         done
Mar 20 15:40:12 crc kubenswrapper[4730]:         
Mar 20 15:40:12 crc kubenswrapper[4730]:         echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy
Mar 20 15:40:12 crc kubenswrapper[4730]:         exec /usr/bin/kube-rbac-proxy \
Mar 20 15:40:12 crc kubenswrapper[4730]:           --logtostderr \
Mar 20 15:40:12 crc kubenswrapper[4730]:           --secure-listen-address=:9108 \
Mar 20 15:40:12 crc kubenswrapper[4730]:           --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \
Mar 20 15:40:12 crc kubenswrapper[4730]:           --upstream=http://127.0.0.1:29108/ \
Mar 20 15:40:12 crc kubenswrapper[4730]:           --tls-private-key-file=${TLS_PK} \
Mar 20 15:40:12 crc kubenswrapper[4730]:           --tls-cert-file=${TLS_CERT}
Mar 20 15:40:12 crc kubenswrapper[4730]:         ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d4xpw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-p47zh_openshift-ovn-kubernetes(a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 15:40:12 crc kubenswrapper[4730]:  > logger="UnhandledError"
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.849826    4730 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 15:40:12 crc kubenswrapper[4730]:         container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe
Mar 20 15:40:12 crc kubenswrapper[4730]:         if [[ -f "/env/_master" ]]; then
Mar 20 15:40:12 crc kubenswrapper[4730]:           set -o allexport
Mar 20 15:40:12 crc kubenswrapper[4730]:           source "/env/_master"
Mar 20 15:40:12 crc kubenswrapper[4730]:           set +o allexport
Mar 20 15:40:12 crc kubenswrapper[4730]:         fi
Mar 20 15:40:12 crc kubenswrapper[4730]:         
Mar 20 15:40:12 crc kubenswrapper[4730]:         ovn_v4_join_subnet_opt=
Mar 20 15:40:12 crc kubenswrapper[4730]:         if [[ "" != "" ]]; then
Mar 20 15:40:12 crc kubenswrapper[4730]:           ovn_v4_join_subnet_opt="--gateway-v4-join-subnet "
Mar 20 15:40:12 crc kubenswrapper[4730]:         fi
Mar 20 15:40:12 crc kubenswrapper[4730]:         ovn_v6_join_subnet_opt=
Mar 20 15:40:12 crc kubenswrapper[4730]:         if [[ "" != "" ]]; then
Mar 20 15:40:12 crc kubenswrapper[4730]:           ovn_v6_join_subnet_opt="--gateway-v6-join-subnet "
Mar 20 15:40:12 crc kubenswrapper[4730]:         fi
Mar 20 15:40:12 crc kubenswrapper[4730]:         
Mar 20 15:40:12 crc kubenswrapper[4730]:         ovn_v4_transit_switch_subnet_opt=
Mar 20 15:40:12 crc kubenswrapper[4730]:         if [[ "" != "" ]]; then
Mar 20 15:40:12 crc kubenswrapper[4730]:           ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet "
Mar 20 15:40:12 crc kubenswrapper[4730]:         fi
Mar 20 15:40:12 crc kubenswrapper[4730]:         ovn_v6_transit_switch_subnet_opt=
Mar 20 15:40:12 crc kubenswrapper[4730]:         if [[ "" != "" ]]; then
Mar 20 15:40:12 crc kubenswrapper[4730]:           ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet "
Mar 20 15:40:12 crc kubenswrapper[4730]:         fi
Mar 20 15:40:12 crc kubenswrapper[4730]:         
Mar 20 15:40:12 crc kubenswrapper[4730]:         dns_name_resolver_enabled_flag=
Mar 20 15:40:12 crc kubenswrapper[4730]:         if [[ "false" == "true" ]]; then
Mar 20 15:40:12 crc kubenswrapper[4730]:           dns_name_resolver_enabled_flag="--enable-dns-name-resolver"
Mar 20 15:40:12 crc kubenswrapper[4730]:         fi
Mar 20 15:40:12 crc kubenswrapper[4730]:         
Mar 20 15:40:12 crc kubenswrapper[4730]:         persistent_ips_enabled_flag=
Mar 20 15:40:12 crc kubenswrapper[4730]:         if [[ "true" == "true" ]]; then
Mar 20 15:40:12 crc kubenswrapper[4730]:           persistent_ips_enabled_flag="--enable-persistent-ips"
Mar 20 15:40:12 crc kubenswrapper[4730]:         fi
Mar 20 15:40:12 crc kubenswrapper[4730]:         
Mar 20 15:40:12 crc kubenswrapper[4730]:         # This is needed so that converting clusters from GA to TP
Mar 20 15:40:12 crc kubenswrapper[4730]:         # will rollout control plane pods as well
Mar 20 15:40:12 crc kubenswrapper[4730]:         network_segmentation_enabled_flag=
Mar 20 15:40:12 crc kubenswrapper[4730]:         multi_network_enabled_flag=
Mar 20 15:40:12 crc kubenswrapper[4730]:         if [[ "true" == "true" ]]; then
Mar 20 15:40:12 crc kubenswrapper[4730]:           multi_network_enabled_flag="--enable-multi-network"
Mar 20 15:40:12 crc kubenswrapper[4730]:           network_segmentation_enabled_flag="--enable-network-segmentation"
Mar 20 15:40:12 crc kubenswrapper[4730]:         fi
Mar 20 15:40:12 crc kubenswrapper[4730]:         
Mar 20 15:40:12 crc kubenswrapper[4730]:         echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}"
Mar 20 15:40:12 crc kubenswrapper[4730]:         exec /usr/bin/ovnkube \
Mar 20 15:40:12 crc kubenswrapper[4730]:           --enable-interconnect \
Mar 20 15:40:12 crc kubenswrapper[4730]:           --init-cluster-manager "${K8S_NODE}" \
Mar 20 15:40:12 crc kubenswrapper[4730]:           --config-file=/run/ovnkube-config/ovnkube.conf \
Mar 20 15:40:12 crc kubenswrapper[4730]:           --loglevel "${OVN_KUBE_LOG_LEVEL}" \
Mar 20 15:40:12 crc kubenswrapper[4730]:           --metrics-bind-address "127.0.0.1:29108" \
Mar 20 15:40:12 crc kubenswrapper[4730]:           --metrics-enable-pprof \
Mar 20 15:40:12 crc kubenswrapper[4730]:           --metrics-enable-config-duration \
Mar 20 15:40:12 crc kubenswrapper[4730]:           ${ovn_v4_join_subnet_opt} \
Mar 20 15:40:12 crc kubenswrapper[4730]:           ${ovn_v6_join_subnet_opt} \
Mar 20 15:40:12 crc kubenswrapper[4730]:           ${ovn_v4_transit_switch_subnet_opt} \
Mar 20 15:40:12 crc kubenswrapper[4730]:           ${ovn_v6_transit_switch_subnet_opt} \
Mar 20 15:40:12 crc kubenswrapper[4730]:           ${dns_name_resolver_enabled_flag} \
Mar 20 15:40:12 crc kubenswrapper[4730]:           ${persistent_ips_enabled_flag} \
Mar 20 15:40:12 crc kubenswrapper[4730]:           ${multi_network_enabled_flag} \
Mar 20 15:40:12 crc kubenswrapper[4730]:           ${network_segmentation_enabled_flag}
Mar 20 15:40:12 crc kubenswrapper[4730]:         ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d4xpw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-p47zh_openshift-ovn-kubernetes(a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 15:40:12 crc kubenswrapper[4730]:  > logger="UnhandledError"
Mar 20 15:40:12 crc kubenswrapper[4730]: W0320 15:40:12.850350    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-0d4eaf48ca38e2c780071b4c2cd083f62f9df885d079ed59049f71027240efb4 WatchSource:0}: Error finding container 0d4eaf48ca38e2c780071b4c2cd083f62f9df885d079ed59049f71027240efb4: Status 404 returned error can't find the container with id 0d4eaf48ca38e2c780071b4c2cd083f62f9df885d079ed59049f71027240efb4
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.850901    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" podUID="a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0"
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.854737    4730 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 15:40:12 crc kubenswrapper[4730]:         container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash
Mar 20 15:40:12 crc kubenswrapper[4730]:         set -o allexport
Mar 20 15:40:12 crc kubenswrapper[4730]:         if [[ -f /etc/kubernetes/apiserver-url.env ]]; then
Mar 20 15:40:12 crc kubenswrapper[4730]:           source /etc/kubernetes/apiserver-url.env
Mar 20 15:40:12 crc kubenswrapper[4730]:         else
Mar 20 15:40:12 crc kubenswrapper[4730]:           echo "Error: /etc/kubernetes/apiserver-url.env is missing"
Mar 20 15:40:12 crc kubenswrapper[4730]:           exit 1
Mar 20 15:40:12 crc kubenswrapper[4730]:         fi
Mar 20 15:40:12 crc kubenswrapper[4730]:         exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104
Mar 20 15:40:12 crc kubenswrapper[4730]:         ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 15:40:12 crc kubenswrapper[4730]:  > logger="UnhandledError"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.856008    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.856126    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.863209    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.865832    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.875480    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: W0320 15:40:12.876269    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-aaa236df0208971cf48de3f9bf06671b82fce6170fb3ad4236c9cb521096380b WatchSource:0}: Error finding container aaa236df0208971cf48de3f9bf06671b82fce6170fb3ad4236c9cb521096380b: Status 404 returned error can't find the container with id aaa236df0208971cf48de3f9bf06671b82fce6170fb3ad4236c9cb521096380b
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.876308    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.879094    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf"
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.881630    4730 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 15:40:12 crc kubenswrapper[4730]:         container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe
Mar 20 15:40:12 crc kubenswrapper[4730]:         if [[ -f "/env/_master" ]]; then
Mar 20 15:40:12 crc kubenswrapper[4730]:           set -o allexport
Mar 20 15:40:12 crc kubenswrapper[4730]:           source "/env/_master"
Mar 20 15:40:12 crc kubenswrapper[4730]:           set +o allexport
Mar 20 15:40:12 crc kubenswrapper[4730]:         fi
Mar 20 15:40:12 crc kubenswrapper[4730]:         # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled.
Mar 20 15:40:12 crc kubenswrapper[4730]:         # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791
Mar 20 15:40:12 crc kubenswrapper[4730]:         ho_enable="--enable-hybrid-overlay"
Mar 20 15:40:12 crc kubenswrapper[4730]:         echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook"
Mar 20 15:40:12 crc kubenswrapper[4730]:         # extra-allowed-user: service account `ovn-kubernetes-control-plane`
Mar 20 15:40:12 crc kubenswrapper[4730]:         # sets pod annotations in multi-homing layer3 network controller (cluster-manager)
Mar 20 15:40:12 crc kubenswrapper[4730]:         exec /usr/bin/ovnkube-identity  --k8s-apiserver=https://api-int.crc.testing:6443 \
Mar 20 15:40:12 crc kubenswrapper[4730]:             --webhook-cert-dir="/etc/webhook-cert" \
Mar 20 15:40:12 crc kubenswrapper[4730]:             --webhook-host=127.0.0.1 \
Mar 20 15:40:12 crc kubenswrapper[4730]:             --webhook-port=9743 \
Mar 20 15:40:12 crc kubenswrapper[4730]:             ${ho_enable} \
Mar 20 15:40:12 crc kubenswrapper[4730]:             --enable-interconnect \
Mar 20 15:40:12 crc kubenswrapper[4730]:             --disable-approver \
Mar 20 15:40:12 crc kubenswrapper[4730]:             --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \
Mar 20 15:40:12 crc kubenswrapper[4730]:             --wait-for-kubernetes-api=200s \
Mar 20 15:40:12 crc kubenswrapper[4730]:             --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \
Mar 20 15:40:12 crc kubenswrapper[4730]:             --loglevel="${LOGLEVEL}"
Mar 20 15:40:12 crc kubenswrapper[4730]:         ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 15:40:12 crc kubenswrapper[4730]:  > logger="UnhandledError"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.883488    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.884216    4730 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 15:40:12 crc kubenswrapper[4730]:         container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe
Mar 20 15:40:12 crc kubenswrapper[4730]:         if [[ -f "/env/_master" ]]; then
Mar 20 15:40:12 crc kubenswrapper[4730]:           set -o allexport
Mar 20 15:40:12 crc kubenswrapper[4730]:           source "/env/_master"
Mar 20 15:40:12 crc kubenswrapper[4730]:           set +o allexport
Mar 20 15:40:12 crc kubenswrapper[4730]:         fi
Mar 20 15:40:12 crc kubenswrapper[4730]:         
Mar 20 15:40:12 crc kubenswrapper[4730]:         echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver"
Mar 20 15:40:12 crc kubenswrapper[4730]:         exec /usr/bin/ovnkube-identity  --k8s-apiserver=https://api-int.crc.testing:6443 \
Mar 20 15:40:12 crc kubenswrapper[4730]:             --disable-webhook \
Mar 20 15:40:12 crc kubenswrapper[4730]:             --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \
Mar 20 15:40:12 crc kubenswrapper[4730]:             --loglevel="${LOGLEVEL}"
Mar 20 15:40:12 crc kubenswrapper[4730]:         ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 15:40:12 crc kubenswrapper[4730]:  > logger="UnhandledError"
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.885444    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.886393    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-n4w74"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.891082    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.891402    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs\") pod \"network-metrics-daemon-2prfn\" (UID: \"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\") " pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.891437    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7xml\" (UniqueName: \"kubernetes.io/projected/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-kube-api-access-m7xml\") pod \"network-metrics-daemon-2prfn\" (UID: \"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\") " pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.891569    4730 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.891633    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs podName:db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a nodeName:}" failed. No retries permitted until 2026-03-20 15:40:13.391615862 +0000 UTC m=+72.604987241 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs") pod "network-metrics-daemon-2prfn" (UID: "db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 15:40:12 crc kubenswrapper[4730]: W0320 15:40:12.896099    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-52ab0a486f857a8f6da3f2dfd99d1bad7f101147f963e683628056e2b95a1a8b WatchSource:0}: Error finding container 52ab0a486f857a8f6da3f2dfd99d1bad7f101147f963e683628056e2b95a1a8b: Status 404 returned error can't find the container with id 52ab0a486f857a8f6da3f2dfd99d1bad7f101147f963e683628056e2b95a1a8b
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.898067    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in 
pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError"
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.899283    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.901559    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:12 crc kubenswrapper[4730]: W0320 15:40:12.904285    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ee8d55f_90bd_4484_8455_933de455efea.slice/crio-c6cedbaeead04f0a723ee0c341b7f6751c3ad80c472b32b9905de4d6b0d54e0b WatchSource:0}: Error finding container c6cedbaeead04f0a723ee0c341b7f6751c3ad80c472b32b9905de4d6b0d54e0b: Status 404 returned error can't find the container with id c6cedbaeead04f0a723ee0c341b7f6751c3ad80c472b32b9905de4d6b0d54e0b
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.904459    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lzk8j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError"
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.906469    4730 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 15:40:12 crc kubenswrapper[4730]:         container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM
Mar 20 15:40:12 crc kubenswrapper[4730]:         while [ true ];
Mar 20 15:40:12 crc kubenswrapper[4730]:         do
Mar 20 15:40:12 crc kubenswrapper[4730]:           for f in $(ls /tmp/serviceca); do
Mar 20 15:40:12 crc kubenswrapper[4730]:               echo $f
Mar 20 15:40:12 crc kubenswrapper[4730]:               ca_file_path="/tmp/serviceca/${f}"
Mar 20 15:40:12 crc kubenswrapper[4730]:               f=$(echo $f | sed  -r 's/(.*)\.\./\1:/')
Mar 20 15:40:12 crc kubenswrapper[4730]:               reg_dir_path="/etc/docker/certs.d/${f}"
Mar 20 15:40:12 crc kubenswrapper[4730]:               if [ -e "${reg_dir_path}" ]; then
Mar 20 15:40:12 crc kubenswrapper[4730]:                   cp -u $ca_file_path $reg_dir_path/ca.crt
Mar 20 15:40:12 crc kubenswrapper[4730]:               else
Mar 20 15:40:12 crc kubenswrapper[4730]:                   mkdir $reg_dir_path
Mar 20 15:40:12 crc kubenswrapper[4730]:                   cp $ca_file_path $reg_dir_path/ca.crt
Mar 20 15:40:12 crc kubenswrapper[4730]:               fi
Mar 20 15:40:12 crc kubenswrapper[4730]:           done
Mar 20 15:40:12 crc kubenswrapper[4730]:           for d in $(ls /etc/docker/certs.d); do
Mar 20 15:40:12 crc kubenswrapper[4730]:               echo $d
Mar 20 15:40:12 crc kubenswrapper[4730]:               dp=$(echo $d | sed  -r 's/(.*):/\1\.\./')
Mar 20 15:40:12 crc kubenswrapper[4730]:               reg_conf_path="/tmp/serviceca/${dp}"
Mar 20 15:40:12 crc kubenswrapper[4730]:               if [ ! -e "${reg_conf_path}" ]; then
Mar 20 15:40:12 crc kubenswrapper[4730]:                   rm -rf /etc/docker/certs.d/$d
Mar 20 15:40:12 crc kubenswrapper[4730]:               fi
Mar 20 15:40:12 crc kubenswrapper[4730]:           done
Mar 20 15:40:12 crc kubenswrapper[4730]:           sleep 60 & wait ${!}
Mar 20 15:40:12 crc kubenswrapper[4730]:         done
Mar 20 15:40:12 crc kubenswrapper[4730]:         ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2fvg6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-n4w74_openshift-image-registry(2ee8d55f-90bd-4484-8455-933de455efea): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 15:40:12 crc kubenswrapper[4730]:  > logger="UnhandledError"
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.906590    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lzk8j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError"
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.908132    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-n4w74" podUID="2ee8d55f-90bd-4484-8455-933de455efea"
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.908217    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.908715    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7xml\" (UniqueName: \"kubernetes.io/projected/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-kube-api-access-m7xml\") pod \"network-metrics-daemon-2prfn\" (UID: \"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\") " pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.923201    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6r2kn"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.929489    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.929527    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.929536    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.929565    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.929575    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:12Z","lastTransitionTime":"2026-03-20T15:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:12 crc kubenswrapper[4730]: W0320 15:40:12.931928    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f97b1f1_1fad_44ec_8253_17dd6a5eee54.slice/crio-fc4a02f9622b344573aecaba86f050f3013a2ce63ed59201d29d31b7fdfc4c52 WatchSource:0}: Error finding container fc4a02f9622b344573aecaba86f050f3013a2ce63ed59201d29d31b7fdfc4c52: Status 404 returned error can't find the container with id fc4a02f9622b344573aecaba86f050f3013a2ce63ed59201d29d31b7fdfc4c52
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.933638    4730 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 15:40:12 crc kubenswrapper[4730]:         container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT=""
Mar 20 15:40:12 crc kubenswrapper[4730]:         /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT
Mar 20 15:40:12 crc kubenswrapper[4730]:         ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vvthz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-6r2kn_openshift-multus(6f97b1f1-1fad-44ec-8253-17dd6a5eee54): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 15:40:12 crc kubenswrapper[4730]:  > logger="UnhandledError"
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.934841    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-6r2kn" podUID="6f97b1f1-1fad-44ec-8253-17dd6a5eee54"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.942393    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.952924    4730 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 15:40:12 crc kubenswrapper[4730]:         init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig
Mar 20 15:40:12 crc kubenswrapper[4730]:         apiVersion: v1
Mar 20 15:40:12 crc kubenswrapper[4730]:         clusters:
Mar 20 15:40:12 crc kubenswrapper[4730]:           - cluster:
Mar 20 15:40:12 crc kubenswrapper[4730]:               certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt
Mar 20 15:40:12 crc kubenswrapper[4730]:               server: https://api-int.crc.testing:6443
Mar 20 15:40:12 crc kubenswrapper[4730]:             name: default-cluster
Mar 20 15:40:12 crc kubenswrapper[4730]:         contexts:
Mar 20 15:40:12 crc kubenswrapper[4730]:           - context:
Mar 20 15:40:12 crc kubenswrapper[4730]:               cluster: default-cluster
Mar 20 15:40:12 crc kubenswrapper[4730]:               namespace: default
Mar 20 15:40:12 crc kubenswrapper[4730]:               user: default-auth
Mar 20 15:40:12 crc kubenswrapper[4730]:             name: default-context
Mar 20 15:40:12 crc kubenswrapper[4730]:         current-context: default-context
Mar 20 15:40:12 crc kubenswrapper[4730]:         kind: Config
Mar 20 15:40:12 crc kubenswrapper[4730]:         preferences: {}
Mar 20 15:40:12 crc kubenswrapper[4730]:         users:
Mar 20 15:40:12 crc kubenswrapper[4730]:           - name: default-auth
Mar 20 15:40:12 crc kubenswrapper[4730]:             user:
Mar 20 15:40:12 crc kubenswrapper[4730]:               client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem
Mar 20 15:40:12 crc kubenswrapper[4730]:               client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem
Mar 20 15:40:12 crc kubenswrapper[4730]:         EOF
Mar 20 15:40:12 crc kubenswrapper[4730]:         ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mz64b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-qj97f_openshift-ovn-kubernetes(c4b4e0e8-af33-491e-b1d1-31079d90c656): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 15:40:12 crc kubenswrapper[4730]:  > logger="UnhandledError"
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.954092    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.954101    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-69fnw"
Mar 20 15:40:12 crc kubenswrapper[4730]: W0320 15:40:12.963545    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod102cb977_7291_453e_9282_20572071afee.slice/crio-e6f0997dd31e8e344f7ef497d6d54e32ae4978ad19519923f1968f9227693b16 WatchSource:0}: Error finding container e6f0997dd31e8e344f7ef497d6d54e32ae4978ad19519923f1968f9227693b16: Status 404 returned error can't find the container with id e6f0997dd31e8e344f7ef497d6d54e32ae4978ad19519923f1968f9227693b16
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.965232    4730 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 15:40:12 crc kubenswrapper[4730]:         container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash
Mar 20 15:40:12 crc kubenswrapper[4730]:         set -uo pipefail
Mar 20 15:40:12 crc kubenswrapper[4730]:         
Mar 20 15:40:12 crc kubenswrapper[4730]:         trap 'jobs -p | xargs kill || true; wait; exit 0' TERM
Mar 20 15:40:12 crc kubenswrapper[4730]:         
Mar 20 15:40:12 crc kubenswrapper[4730]:         OPENSHIFT_MARKER="openshift-generated-node-resolver"
Mar 20 15:40:12 crc kubenswrapper[4730]:         HOSTS_FILE="/etc/hosts"
Mar 20 15:40:12 crc kubenswrapper[4730]:         TEMP_FILE="/etc/hosts.tmp"
Mar 20 15:40:12 crc kubenswrapper[4730]:         
Mar 20 15:40:12 crc kubenswrapper[4730]:         IFS=', ' read -r -a services <<< "${SERVICES}"
Mar 20 15:40:12 crc kubenswrapper[4730]:         
Mar 20 15:40:12 crc kubenswrapper[4730]:         # Make a temporary file with the old hosts file's attributes.
Mar 20 15:40:12 crc kubenswrapper[4730]:         if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then
Mar 20 15:40:12 crc kubenswrapper[4730]:           echo "Failed to preserve hosts file. Exiting."
Mar 20 15:40:12 crc kubenswrapper[4730]:           exit 1
Mar 20 15:40:12 crc kubenswrapper[4730]:         fi
Mar 20 15:40:12 crc kubenswrapper[4730]:         
Mar 20 15:40:12 crc kubenswrapper[4730]:         while true; do
Mar 20 15:40:12 crc kubenswrapper[4730]:           declare -A svc_ips
Mar 20 15:40:12 crc kubenswrapper[4730]:           for svc in "${services[@]}"; do
Mar 20 15:40:12 crc kubenswrapper[4730]:             # Fetch service IP from cluster dns if present. We make several tries
Mar 20 15:40:12 crc kubenswrapper[4730]:             # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones
Mar 20 15:40:12 crc kubenswrapper[4730]:             # are for deployments with Kuryr on older OpenStack (OSP13) - those do not
Mar 20 15:40:12 crc kubenswrapper[4730]:             # support UDP loadbalancers and require reaching DNS through TCP.
Mar 20 15:40:12 crc kubenswrapper[4730]:             cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"'
Mar 20 15:40:12 crc kubenswrapper[4730]:                   'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"'
Mar 20 15:40:12 crc kubenswrapper[4730]:                   'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"'
Mar 20 15:40:12 crc kubenswrapper[4730]:                   'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"')
Mar 20 15:40:12 crc kubenswrapper[4730]:             for i in ${!cmds[*]}
Mar 20 15:40:12 crc kubenswrapper[4730]:             do
Mar 20 15:40:12 crc kubenswrapper[4730]:               ips=($(eval "${cmds[i]}"))
Mar 20 15:40:12 crc kubenswrapper[4730]:               if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then
Mar 20 15:40:12 crc kubenswrapper[4730]:                 svc_ips["${svc}"]="${ips[@]}"
Mar 20 15:40:12 crc kubenswrapper[4730]:                 break
Mar 20 15:40:12 crc kubenswrapper[4730]:               fi
Mar 20 15:40:12 crc kubenswrapper[4730]:             done
Mar 20 15:40:12 crc kubenswrapper[4730]:           done
Mar 20 15:40:12 crc kubenswrapper[4730]:         
Mar 20 15:40:12 crc kubenswrapper[4730]:           # Update /etc/hosts only if we get valid service IPs
Mar 20 15:40:12 crc kubenswrapper[4730]:           # We will not update /etc/hosts when there is coredns service outage or api unavailability
Mar 20 15:40:12 crc kubenswrapper[4730]:           # Stale entries could exist in /etc/hosts if the service is deleted
Mar 20 15:40:12 crc kubenswrapper[4730]:           if [[ -n "${svc_ips[*]-}" ]]; then
Mar 20 15:40:12 crc kubenswrapper[4730]:             # Build a new hosts file from /etc/hosts with our custom entries filtered out
Mar 20 15:40:12 crc kubenswrapper[4730]:             if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then
Mar 20 15:40:12 crc kubenswrapper[4730]:               # Only continue rebuilding the hosts entries if its original content is preserved
Mar 20 15:40:12 crc kubenswrapper[4730]:               sleep 60 & wait
Mar 20 15:40:12 crc kubenswrapper[4730]:               continue
Mar 20 15:40:12 crc kubenswrapper[4730]:             fi
Mar 20 15:40:12 crc kubenswrapper[4730]:         
Mar 20 15:40:12 crc kubenswrapper[4730]:             # Append resolver entries for services
Mar 20 15:40:12 crc kubenswrapper[4730]:             rc=0
Mar 20 15:40:12 crc kubenswrapper[4730]:             for svc in "${!svc_ips[@]}"; do
Mar 20 15:40:12 crc kubenswrapper[4730]:               for ip in ${svc_ips[${svc}]}; do
Mar 20 15:40:12 crc kubenswrapper[4730]:                 echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$?
Mar 20 15:40:12 crc kubenswrapper[4730]:               done
Mar 20 15:40:12 crc kubenswrapper[4730]:             done
Mar 20 15:40:12 crc kubenswrapper[4730]:             if [[ $rc -ne 0 ]]; then
Mar 20 15:40:12 crc kubenswrapper[4730]:               sleep 60 & wait
Mar 20 15:40:12 crc kubenswrapper[4730]:               continue
Mar 20 15:40:12 crc kubenswrapper[4730]:             fi
Mar 20 15:40:12 crc kubenswrapper[4730]:         
Mar 20 15:40:12 crc kubenswrapper[4730]:         
Mar 20 15:40:12 crc kubenswrapper[4730]:             # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior
Mar 20 15:40:12 crc kubenswrapper[4730]:             # Replace /etc/hosts with our modified version if needed
Mar 20 15:40:12 crc kubenswrapper[4730]:             cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}"
Mar 20 15:40:12 crc kubenswrapper[4730]:             # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn
Mar 20 15:40:12 crc kubenswrapper[4730]:           fi
Mar 20 15:40:12 crc kubenswrapper[4730]:           sleep 60 & wait
Mar 20 15:40:12 crc kubenswrapper[4730]:           unset svc_ips
Mar 20 15:40:12 crc kubenswrapper[4730]:         done
Mar 20 15:40:12 crc kubenswrapper[4730]:         ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-plthx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-69fnw_openshift-dns(102cb977-7291-453e-9282-20572071afee): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 15:40:12 crc kubenswrapper[4730]:  > logger="UnhandledError"
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.966518    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-69fnw" podUID="102cb977-7291-453e-9282-20572071afee"
Mar 20 15:40:12 crc kubenswrapper[4730]: I0320 15:40:12.979964    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-49hht"
Mar 20 15:40:12 crc kubenswrapper[4730]: W0320 15:40:12.989978    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbb015c0_3a11_48bf_a59f_22bc03ca2fb9.slice/crio-f1d093488113ecc8c7f47886f76617cbd92ef51fd6c2516efe3977f71ad7a69a WatchSource:0}: Error finding container f1d093488113ecc8c7f47886f76617cbd92ef51fd6c2516efe3977f71ad7a69a: Status 404 returned error can't find the container with id f1d093488113ecc8c7f47886f76617cbd92ef51fd6c2516efe3977f71ad7a69a
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.992085    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4qtg2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-49hht_openshift-multus(dbb015c0-3a11-48bf-a59f-22bc03ca2fb9): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError"
Mar 20 15:40:12 crc kubenswrapper[4730]: E0320 15:40:12.993225    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-49hht" podUID="dbb015c0-3a11-48bf-a59f-22bc03ca2fb9"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.031988    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.032023    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.032035    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.032049    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.032058    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:13Z","lastTransitionTime":"2026-03-20T15:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.134615    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.134662    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.134674    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.134692    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.134707    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:13Z","lastTransitionTime":"2026-03-20T15:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.194356    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.194538    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:40:14.194501502 +0000 UTC m=+73.407872911 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.194613    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.194696    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.194763    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.194849    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.194861    4730 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.195049    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.195093    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.195107    4730 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.195140    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:40:14.195112178 +0000 UTC m=+73.408483587 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.195002    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.195228    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.195304    4730 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.194930    4730 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.195182    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 15:40:14.19516327 +0000 UTC m=+73.408534679 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.195431    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 15:40:14.195403516 +0000 UTC m=+73.408774935 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.195477    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:40:14.195460288 +0000 UTC m=+73.408831707 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.237190    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.237282    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.237295    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.237313    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.237325    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:13Z","lastTransitionTime":"2026-03-20T15:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.339882    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.339941    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.339953    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.339971    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.339985    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:13Z","lastTransitionTime":"2026-03-20T15:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.397048    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs\") pod \"network-metrics-daemon-2prfn\" (UID: \"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\") " pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.397177    4730 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.397268    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs podName:db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a nodeName:}" failed. No retries permitted until 2026-03-20 15:40:14.39723355 +0000 UTC m=+73.610604929 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs") pod "network-metrics-daemon-2prfn" (UID: "db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.441881    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.441919    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.441931    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.441951    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.441963    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:13Z","lastTransitionTime":"2026-03-20T15:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.536573    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.537084    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.538399    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.539021    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.539981    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.540500    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.541030    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.541893    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.542523    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.543530    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.543850    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.543917    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.543937    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.543961    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.543980    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:13Z","lastTransitionTime":"2026-03-20T15:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.544040    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.545157    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.545742    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.546314    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.547620    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.548127    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.549177    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.549636    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.550263    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.551464    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.551897    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.552910    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.553340    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.554330    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.554730    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.555350    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.556572    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.557121    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.558202    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.558821    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.559885    4730 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.560010    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.561884    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.562934    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.563501    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.564927    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.565833    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.566711    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.567369    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.568730    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.569278    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.570285    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.570928    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.571889    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.572396    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.573299    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.573794    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.574952    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.575434    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.576511    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.576947    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.577860    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.578460    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.578906    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.646839    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.646899    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.646913    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.646930    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.646946    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:13Z","lastTransitionTime":"2026-03-20T15:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.748731    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.748763    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.748774    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.748793    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.748805    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:13Z","lastTransitionTime":"2026-03-20T15:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.832847    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" event={"ID":"c4b4e0e8-af33-491e-b1d1-31079d90c656","Type":"ContainerStarted","Data":"f0bb8a04718d250ff389e424bacc9dc0320526af93827c03eb732b797d1a25fb"}
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.833909    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6r2kn" event={"ID":"6f97b1f1-1fad-44ec-8253-17dd6a5eee54","Type":"ContainerStarted","Data":"fc4a02f9622b344573aecaba86f050f3013a2ce63ed59201d29d31b7fdfc4c52"}
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.834771    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"aaa236df0208971cf48de3f9bf06671b82fce6170fb3ad4236c9cb521096380b"}
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.835021    4730 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 15:40:13 crc kubenswrapper[4730]:         init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig
Mar 20 15:40:13 crc kubenswrapper[4730]:         apiVersion: v1
Mar 20 15:40:13 crc kubenswrapper[4730]:         clusters:
Mar 20 15:40:13 crc kubenswrapper[4730]:           - cluster:
Mar 20 15:40:13 crc kubenswrapper[4730]:               certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt
Mar 20 15:40:13 crc kubenswrapper[4730]:               server: https://api-int.crc.testing:6443
Mar 20 15:40:13 crc kubenswrapper[4730]:             name: default-cluster
Mar 20 15:40:13 crc kubenswrapper[4730]:         contexts:
Mar 20 15:40:13 crc kubenswrapper[4730]:           - context:
Mar 20 15:40:13 crc kubenswrapper[4730]:               cluster: default-cluster
Mar 20 15:40:13 crc kubenswrapper[4730]:               namespace: default
Mar 20 15:40:13 crc kubenswrapper[4730]:               user: default-auth
Mar 20 15:40:13 crc kubenswrapper[4730]:             name: default-context
Mar 20 15:40:13 crc kubenswrapper[4730]:         current-context: default-context
Mar 20 15:40:13 crc kubenswrapper[4730]:         kind: Config
Mar 20 15:40:13 crc kubenswrapper[4730]:         preferences: {}
Mar 20 15:40:13 crc kubenswrapper[4730]:         users:
Mar 20 15:40:13 crc kubenswrapper[4730]:           - name: default-auth
Mar 20 15:40:13 crc kubenswrapper[4730]:             user:
Mar 20 15:40:13 crc kubenswrapper[4730]:               client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem
Mar 20 15:40:13 crc kubenswrapper[4730]:               client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem
Mar 20 15:40:13 crc kubenswrapper[4730]:         EOF
Mar 20 15:40:13 crc kubenswrapper[4730]:         ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mz64b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-qj97f_openshift-ovn-kubernetes(c4b4e0e8-af33-491e-b1d1-31079d90c656): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 15:40:13 crc kubenswrapper[4730]:  > logger="UnhandledError"
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.835903    4730 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 15:40:13 crc kubenswrapper[4730]:         container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT=""
Mar 20 15:40:13 crc kubenswrapper[4730]:         /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT
Mar 20 15:40:13 crc kubenswrapper[4730]:         ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vvthz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-6r2kn_openshift-multus(6f97b1f1-1fad-44ec-8253-17dd6a5eee54): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 15:40:13 crc kubenswrapper[4730]:  > logger="UnhandledError"
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.835999    4730 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 15:40:13 crc kubenswrapper[4730]:         container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe
Mar 20 15:40:13 crc kubenswrapper[4730]:         if [[ -f "/env/_master" ]]; then
Mar 20 15:40:13 crc kubenswrapper[4730]:           set -o allexport
Mar 20 15:40:13 crc kubenswrapper[4730]:           source "/env/_master"
Mar 20 15:40:13 crc kubenswrapper[4730]:           set +o allexport
Mar 20 15:40:13 crc kubenswrapper[4730]:         fi
Mar 20 15:40:13 crc kubenswrapper[4730]:         # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled.
Mar 20 15:40:13 crc kubenswrapper[4730]:         # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791
Mar 20 15:40:13 crc kubenswrapper[4730]:         ho_enable="--enable-hybrid-overlay"
Mar 20 15:40:13 crc kubenswrapper[4730]:         echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook"
Mar 20 15:40:13 crc kubenswrapper[4730]:         # extra-allowed-user: service account `ovn-kubernetes-control-plane`
Mar 20 15:40:13 crc kubenswrapper[4730]:         # sets pod annotations in multi-homing layer3 network controller (cluster-manager)
Mar 20 15:40:13 crc kubenswrapper[4730]:         exec /usr/bin/ovnkube-identity  --k8s-apiserver=https://api-int.crc.testing:6443 \
Mar 20 15:40:13 crc kubenswrapper[4730]:             --webhook-cert-dir="/etc/webhook-cert" \
Mar 20 15:40:13 crc kubenswrapper[4730]:             --webhook-host=127.0.0.1 \
Mar 20 15:40:13 crc kubenswrapper[4730]:             --webhook-port=9743 \
Mar 20 15:40:13 crc kubenswrapper[4730]:             ${ho_enable} \
Mar 20 15:40:13 crc kubenswrapper[4730]:             --enable-interconnect \
Mar 20 15:40:13 crc kubenswrapper[4730]:             --disable-approver \
Mar 20 15:40:13 crc kubenswrapper[4730]:             --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \
Mar 20 15:40:13 crc kubenswrapper[4730]:             --wait-for-kubernetes-api=200s \
Mar 20 15:40:13 crc kubenswrapper[4730]:             --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \
Mar 20 15:40:13 crc kubenswrapper[4730]:             --loglevel="${LOGLEVEL}"
Mar 20 15:40:13 crc kubenswrapper[4730]:         ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 15:40:13 crc kubenswrapper[4730]:  > logger="UnhandledError"
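The webhook and approver entrypoints dumped above both load `/env/_master` with the `set -o allexport` / `source` / `set +o allexport` idiom. A minimal, self-contained sketch of that idiom follows; the env file path and the `LOGLEVEL` value here are placeholders, not taken from the pod spec:

```shell
# Sketch of the allexport sourcing idiom used by the entrypoints above.
# The env file and its contents are stand-ins for /env/_master.
envfile="$(mktemp)"
echo 'LOGLEVEL=2' > "${envfile}"

set -o allexport    # every variable assigned while this is on is exported
source "${envfile}"
set +o allexport    # back to normal: plain assignments stay shell-local

bash -c 'echo "LOGLEVEL=${LOGLEVEL}"'   # child process inherits it: LOGLEVEL=2
```

Without allexport, sourcing the file would set `LOGLEVEL` only in the current shell, and the `exec`'d binary would not inherit it unless each variable were explicitly exported.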
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.836098    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656"
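The `CreateContainerConfigError` repeated throughout this window occurs because the kubelet refuses to start containers before its service informer has listed cluster services at least once: it cannot yet build the Docker-link-style per-service environment variables it injects. As an illustration only (the service name, IP, and port below are made up, echoing the cluster DNS IP seen elsewhere in this log), those variables have roughly this shape:

```shell
# Illustration of the per-service env vars the kubelet injects into
# containers -- the "envvars" it cannot construct above. Service name,
# IP, and port are placeholders for this sketch.
svc="IMAGE_REGISTRY"; ip="10.217.4.10"; port="5000"
echo "${svc}_SERVICE_HOST=${ip}"
echo "${svc}_SERVICE_PORT=${port}"
echo "${svc}_PORT=tcp://${ip}:${port}"
```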
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.836356    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-69fnw" event={"ID":"102cb977-7291-453e-9282-20572071afee","Type":"ContainerStarted","Data":"e6f0997dd31e8e344f7ef497d6d54e32ae4978ad19519923f1968f9227693b16"}
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.837133    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"0d4eaf48ca38e2c780071b4c2cd083f62f9df885d079ed59049f71027240efb4"}
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.837183    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-6r2kn" podUID="6f97b1f1-1fad-44ec-8253-17dd6a5eee54"
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.838157    4730 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 15:40:13 crc kubenswrapper[4730]:         container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash
Mar 20 15:40:13 crc kubenswrapper[4730]:         set -uo pipefail
Mar 20 15:40:13 crc kubenswrapper[4730]:         
Mar 20 15:40:13 crc kubenswrapper[4730]:         trap 'jobs -p | xargs kill || true; wait; exit 0' TERM
Mar 20 15:40:13 crc kubenswrapper[4730]:         
Mar 20 15:40:13 crc kubenswrapper[4730]:         OPENSHIFT_MARKER="openshift-generated-node-resolver"
Mar 20 15:40:13 crc kubenswrapper[4730]:         HOSTS_FILE="/etc/hosts"
Mar 20 15:40:13 crc kubenswrapper[4730]:         TEMP_FILE="/etc/hosts.tmp"
Mar 20 15:40:13 crc kubenswrapper[4730]:         
Mar 20 15:40:13 crc kubenswrapper[4730]:         IFS=', ' read -r -a services <<< "${SERVICES}"
Mar 20 15:40:13 crc kubenswrapper[4730]:         
Mar 20 15:40:13 crc kubenswrapper[4730]:         # Make a temporary file with the old hosts file's attributes.
Mar 20 15:40:13 crc kubenswrapper[4730]:         if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then
Mar 20 15:40:13 crc kubenswrapper[4730]:           echo "Failed to preserve hosts file. Exiting."
Mar 20 15:40:13 crc kubenswrapper[4730]:           exit 1
Mar 20 15:40:13 crc kubenswrapper[4730]:         fi
Mar 20 15:40:13 crc kubenswrapper[4730]:         
Mar 20 15:40:13 crc kubenswrapper[4730]:         while true; do
Mar 20 15:40:13 crc kubenswrapper[4730]:           declare -A svc_ips
Mar 20 15:40:13 crc kubenswrapper[4730]:           for svc in "${services[@]}"; do
Mar 20 15:40:13 crc kubenswrapper[4730]:             # Fetch service IP from cluster dns if present. We make several tries
Mar 20 15:40:13 crc kubenswrapper[4730]:             # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones
Mar 20 15:40:13 crc kubenswrapper[4730]:             # are for deployments with Kuryr on older OpenStack (OSP13) - those do not
Mar 20 15:40:13 crc kubenswrapper[4730]:             # support UDP loadbalancers and require reaching DNS through TCP.
Mar 20 15:40:13 crc kubenswrapper[4730]:             cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"'
Mar 20 15:40:13 crc kubenswrapper[4730]:                   'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"'
Mar 20 15:40:13 crc kubenswrapper[4730]:                   'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"'
Mar 20 15:40:13 crc kubenswrapper[4730]:                   'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"')
Mar 20 15:40:13 crc kubenswrapper[4730]:             for i in ${!cmds[*]}
Mar 20 15:40:13 crc kubenswrapper[4730]:             do
Mar 20 15:40:13 crc kubenswrapper[4730]:               ips=($(eval "${cmds[i]}"))
Mar 20 15:40:13 crc kubenswrapper[4730]:               if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then
Mar 20 15:40:13 crc kubenswrapper[4730]:                 svc_ips["${svc}"]="${ips[@]}"
Mar 20 15:40:13 crc kubenswrapper[4730]:                 break
Mar 20 15:40:13 crc kubenswrapper[4730]:               fi
Mar 20 15:40:13 crc kubenswrapper[4730]:             done
Mar 20 15:40:13 crc kubenswrapper[4730]:           done
Mar 20 15:40:13 crc kubenswrapper[4730]:         
Mar 20 15:40:13 crc kubenswrapper[4730]:           # Update /etc/hosts only if we get valid service IPs
Mar 20 15:40:13 crc kubenswrapper[4730]:           # We will not update /etc/hosts when there is coredns service outage or api unavailability
Mar 20 15:40:13 crc kubenswrapper[4730]:           # Stale entries could exist in /etc/hosts if the service is deleted
Mar 20 15:40:13 crc kubenswrapper[4730]:           if [[ -n "${svc_ips[*]-}" ]]; then
Mar 20 15:40:13 crc kubenswrapper[4730]:             # Build a new hosts file from /etc/hosts with our custom entries filtered out
Mar 20 15:40:13 crc kubenswrapper[4730]:             if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then
Mar 20 15:40:13 crc kubenswrapper[4730]:               # Only continue rebuilding the hosts entries if its original content is preserved
Mar 20 15:40:13 crc kubenswrapper[4730]:               sleep 60 & wait
Mar 20 15:40:13 crc kubenswrapper[4730]:               continue
Mar 20 15:40:13 crc kubenswrapper[4730]:             fi
Mar 20 15:40:13 crc kubenswrapper[4730]:         
Mar 20 15:40:13 crc kubenswrapper[4730]:             # Append resolver entries for services
Mar 20 15:40:13 crc kubenswrapper[4730]:             rc=0
Mar 20 15:40:13 crc kubenswrapper[4730]:             for svc in "${!svc_ips[@]}"; do
Mar 20 15:40:13 crc kubenswrapper[4730]:               for ip in ${svc_ips[${svc}]}; do
Mar 20 15:40:13 crc kubenswrapper[4730]:                 echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$?
Mar 20 15:40:13 crc kubenswrapper[4730]:               done
Mar 20 15:40:13 crc kubenswrapper[4730]:             done
Mar 20 15:40:13 crc kubenswrapper[4730]:             if [[ $rc -ne 0 ]]; then
Mar 20 15:40:13 crc kubenswrapper[4730]:               sleep 60 & wait
Mar 20 15:40:13 crc kubenswrapper[4730]:               continue
Mar 20 15:40:13 crc kubenswrapper[4730]:             fi
Mar 20 15:40:13 crc kubenswrapper[4730]:         
Mar 20 15:40:13 crc kubenswrapper[4730]:         
Mar 20 15:40:13 crc kubenswrapper[4730]:             # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior
Mar 20 15:40:13 crc kubenswrapper[4730]:             # Replace /etc/hosts with our modified version if needed
Mar 20 15:40:13 crc kubenswrapper[4730]:             cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}"
Mar 20 15:40:13 crc kubenswrapper[4730]:             # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn
Mar 20 15:40:13 crc kubenswrapper[4730]:           fi
Mar 20 15:40:13 crc kubenswrapper[4730]:           sleep 60 & wait
Mar 20 15:40:13 crc kubenswrapper[4730]:           unset svc_ips
Mar 20 15:40:13 crc kubenswrapper[4730]:         done
Mar 20 15:40:13 crc kubenswrapper[4730]:         ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-plthx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-69fnw_openshift-dns(102cb977-7291-453e-9282-20572071afee): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 15:40:13 crc kubenswrapper[4730]:  > logger="UnhandledError"
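The dns-node-resolver loop dumped above maintains `/etc/hosts` with marker-tagged entries: copy the file minus previously generated lines, append fresh entries tagged with the marker, and replace the file only when the content changed. A runnable sketch of one cycle against a temporary file (the paths and the sample service entry are stand-ins; GNU sed is assumed, as in the original script):

```shell
# Sketch of the node-resolver rewrite cycle above, run against temp files.
OPENSHIFT_MARKER="openshift-generated-node-resolver"
HOSTS_FILE="$(mktemp)"            # stand-in for /etc/hosts
TEMP_FILE="${HOSTS_FILE}.tmp"     # stand-in for /etc/hosts.tmp

# Seed the file with a normal entry and one stale generated entry.
printf '127.0.0.1 localhost\n1.2.3.4 old-svc # %s\n' "${OPENSHIFT_MARKER}" > "${HOSTS_FILE}"

# 1. Write every line except marker-tagged ones into TEMP_FILE.
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"

# 2. Append the freshly resolved entry, tagged so the next pass can drop it.
echo "10.0.0.1 image-registry image-registry.cluster.local # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}"

# 3. Replace the hosts file only if the content actually differs.
cmp -s "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}"

grep -c "${OPENSHIFT_MARKER}" "${HOSTS_FILE}"   # prints 1: stale entry gone, fresh one kept
```

The marker comment is what makes the loop idempotent: every pass can safely strip its own previous output without touching operator- or installer-managed lines.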
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.838163    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" event={"ID":"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9","Type":"ContainerStarted","Data":"f1d093488113ecc8c7f47886f76617cbd92ef51fd6c2516efe3977f71ad7a69a"}
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.838381    4730 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 15:40:13 crc kubenswrapper[4730]:         container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash
Mar 20 15:40:13 crc kubenswrapper[4730]:         set -o allexport
Mar 20 15:40:13 crc kubenswrapper[4730]:         if [[ -f /etc/kubernetes/apiserver-url.env ]]; then
Mar 20 15:40:13 crc kubenswrapper[4730]:           source /etc/kubernetes/apiserver-url.env
Mar 20 15:40:13 crc kubenswrapper[4730]:         else
Mar 20 15:40:13 crc kubenswrapper[4730]:           echo "Error: /etc/kubernetes/apiserver-url.env is missing"
Mar 20 15:40:13 crc kubenswrapper[4730]:           exit 1
Mar 20 15:40:13 crc kubenswrapper[4730]:         fi
Mar 20 15:40:13 crc kubenswrapper[4730]:         exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104
Mar 20 15:40:13 crc kubenswrapper[4730]:         ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 15:40:13 crc kubenswrapper[4730]:  > logger="UnhandledError"
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.838655    4730 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 15:40:13 crc kubenswrapper[4730]:         container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe
Mar 20 15:40:13 crc kubenswrapper[4730]:         if [[ -f "/env/_master" ]]; then
Mar 20 15:40:13 crc kubenswrapper[4730]:           set -o allexport
Mar 20 15:40:13 crc kubenswrapper[4730]:           source "/env/_master"
Mar 20 15:40:13 crc kubenswrapper[4730]:           set +o allexport
Mar 20 15:40:13 crc kubenswrapper[4730]:         fi
Mar 20 15:40:13 crc kubenswrapper[4730]:         
Mar 20 15:40:13 crc kubenswrapper[4730]:         echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver"
Mar 20 15:40:13 crc kubenswrapper[4730]:         exec /usr/bin/ovnkube-identity  --k8s-apiserver=https://api-int.crc.testing:6443 \
Mar 20 15:40:13 crc kubenswrapper[4730]:             --disable-webhook \
Mar 20 15:40:13 crc kubenswrapper[4730]:             --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \
Mar 20 15:40:13 crc kubenswrapper[4730]:             --loglevel="${LOGLEVEL}"
Mar 20 15:40:13 crc kubenswrapper[4730]:         ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 15:40:13 crc kubenswrapper[4730]:  > logger="UnhandledError"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.841261    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" event={"ID":"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0","Type":"ContainerStarted","Data":"c6e84e70aeec61f3c5e26afb92d8d59eb7be5dbcce5dc7207bc638470927d8d6"}
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.842441    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-n4w74" event={"ID":"2ee8d55f-90bd-4484-8455-933de455efea","Type":"ContainerStarted","Data":"c6cedbaeead04f0a723ee0c341b7f6751c3ad80c472b32b9905de4d6b0d54e0b"}
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.842556    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerStarted","Data":"a0784fbf7fdba6b3f1633c4eeb3bee20b81376e6456ebe1c5ae165fcca0c2e9e"}
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.842471    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d"
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.842489    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-69fnw" podUID="102cb977-7291-453e-9282-20572071afee"
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.842448    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312"
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.842583    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4qtg2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-49hht_openshift-multus(dbb015c0-3a11-48bf-a59f-22bc03ca2fb9): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.842646    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"52ab0a486f857a8f6da3f2dfd99d1bad7f101147f963e683628056e2b95a1a8b"}
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.843697    4730 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 15:40:13 crc kubenswrapper[4730]:         container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM
Mar 20 15:40:13 crc kubenswrapper[4730]:         while [ true ];
Mar 20 15:40:13 crc kubenswrapper[4730]:         do
Mar 20 15:40:13 crc kubenswrapper[4730]:           for f in $(ls /tmp/serviceca); do
Mar 20 15:40:13 crc kubenswrapper[4730]:               echo $f
Mar 20 15:40:13 crc kubenswrapper[4730]:               ca_file_path="/tmp/serviceca/${f}"
Mar 20 15:40:13 crc kubenswrapper[4730]:               f=$(echo $f | sed  -r 's/(.*)\.\./\1:/')
Mar 20 15:40:13 crc kubenswrapper[4730]:               reg_dir_path="/etc/docker/certs.d/${f}"
Mar 20 15:40:13 crc kubenswrapper[4730]:               if [ -e "${reg_dir_path}" ]; then
Mar 20 15:40:13 crc kubenswrapper[4730]:                   cp -u $ca_file_path $reg_dir_path/ca.crt
Mar 20 15:40:13 crc kubenswrapper[4730]:               else
Mar 20 15:40:13 crc kubenswrapper[4730]:                   mkdir $reg_dir_path
Mar 20 15:40:13 crc kubenswrapper[4730]:                   cp $ca_file_path $reg_dir_path/ca.crt
Mar 20 15:40:13 crc kubenswrapper[4730]:               fi
Mar 20 15:40:13 crc kubenswrapper[4730]:           done
Mar 20 15:40:13 crc kubenswrapper[4730]:           for d in $(ls /etc/docker/certs.d); do
Mar 20 15:40:13 crc kubenswrapper[4730]:               echo $d
Mar 20 15:40:13 crc kubenswrapper[4730]:               dp=$(echo $d | sed  -r 's/(.*):/\1\.\./')
Mar 20 15:40:13 crc kubenswrapper[4730]:               reg_conf_path="/tmp/serviceca/${dp}"
Mar 20 15:40:13 crc kubenswrapper[4730]:               if [ ! -e "${reg_conf_path}" ]; then
Mar 20 15:40:13 crc kubenswrapper[4730]:                   rm -rf /etc/docker/certs.d/$d
Mar 20 15:40:13 crc kubenswrapper[4730]:               fi
Mar 20 15:40:13 crc kubenswrapper[4730]:           done
Mar 20 15:40:13 crc kubenswrapper[4730]:           sleep 60 & wait ${!}
Mar 20 15:40:13 crc kubenswrapper[4730]:         done
Mar 20 15:40:13 crc kubenswrapper[4730]:         ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2fvg6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-n4w74_openshift-image-registry(2ee8d55f-90bd-4484-8455-933de455efea): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 15:40:13 crc kubenswrapper[4730]:  > logger="UnhandledError"
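The node-ca loop dumped above round-trips registry directory names through sed because files under `/tmp/serviceca` encode a registry's `host:port` with `..` in place of `:` (the colon cannot appear in the source keys). A small sketch of the two transforms, using the same sed expressions; the registry hostname is made up and GNU sed `-r` is assumed, as in the original:

```shell
# Sketch of the node-ca filename mapping above: ".." in a serviceca file
# name stands for ":" in the registry's host:port. Hostname is a placeholder.
encode() { echo "$1" | sed -r 's/(.*):/\1\.\./'; }   # d -> dp direction
decode() { echo "$1" | sed -r 's/(.*)\.\./\1:/'; }   # f  -> f  direction

encode "registry.example.com:5000"    # prints registry.example.com..5000
decode "registry.example.com..5000"   # prints registry.example.com:5000
```

The greedy `(.*)` anchors each substitution to the last `:` (or last `..`), so a port suffix survives even when the hostname itself contains dots.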
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.843735    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-49hht" podUID="dbb015c0-3a11-48bf-a59f-22bc03ca2fb9"
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.843975    4730 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 15:40:13 crc kubenswrapper[4730]:         container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash
Mar 20 15:40:13 crc kubenswrapper[4730]:         set -euo pipefail
Mar 20 15:40:13 crc kubenswrapper[4730]:         TLS_PK=/etc/pki/tls/metrics-cert/tls.key
Mar 20 15:40:13 crc kubenswrapper[4730]:         TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt
Mar 20 15:40:13 crc kubenswrapper[4730]:         # As the secret mount is optional we must wait for the files to be present.
Mar 20 15:40:13 crc kubenswrapper[4730]:         # The service is created in monitor.yaml and this is created in sdn.yaml.
Mar 20 15:40:13 crc kubenswrapper[4730]:         TS=$(date +%s)
Mar 20 15:40:13 crc kubenswrapper[4730]:         WARN_TS=$(( ${TS} + $(( 20 * 60)) ))
Mar 20 15:40:13 crc kubenswrapper[4730]:         HAS_LOGGED_INFO=0
Mar 20 15:40:13 crc kubenswrapper[4730]:         
Mar 20 15:40:13 crc kubenswrapper[4730]:         log_missing_certs(){
Mar 20 15:40:13 crc kubenswrapper[4730]:             CUR_TS=$(date +%s)
Mar 20 15:40:13 crc kubenswrapper[4730]:             if [[ "${CUR_TS}" -gt "WARN_TS"  ]]; then
Mar 20 15:40:13 crc kubenswrapper[4730]:               echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes.
Mar 20 15:40:13 crc kubenswrapper[4730]:             elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then
Mar 20 15:40:13 crc kubenswrapper[4730]:               echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes.
Mar 20 15:40:13 crc kubenswrapper[4730]:               HAS_LOGGED_INFO=1
Mar 20 15:40:13 crc kubenswrapper[4730]:             fi
Mar 20 15:40:13 crc kubenswrapper[4730]:         }
Mar 20 15:40:13 crc kubenswrapper[4730]:         while [[ ! -f "${TLS_PK}" ||  ! -f "${TLS_CERT}" ]] ; do
Mar 20 15:40:13 crc kubenswrapper[4730]:           log_missing_certs
Mar 20 15:40:13 crc kubenswrapper[4730]:           sleep 5
Mar 20 15:40:13 crc kubenswrapper[4730]:         done
Mar 20 15:40:13 crc kubenswrapper[4730]:         
Mar 20 15:40:13 crc kubenswrapper[4730]:         echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy
Mar 20 15:40:13 crc kubenswrapper[4730]:         exec /usr/bin/kube-rbac-proxy \
Mar 20 15:40:13 crc kubenswrapper[4730]:           --logtostderr \
Mar 20 15:40:13 crc kubenswrapper[4730]:           --secure-listen-address=:9108 \
Mar 20 15:40:13 crc kubenswrapper[4730]:           --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \
Mar 20 15:40:13 crc kubenswrapper[4730]:           --upstream=http://127.0.0.1:29108/ \
Mar 20 15:40:13 crc kubenswrapper[4730]:           --tls-private-key-file=${TLS_PK} \
Mar 20 15:40:13 crc kubenswrapper[4730]:           --tls-cert-file=${TLS_CERT}
Mar 20 15:40:13 crc kubenswrapper[4730]:         ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d4xpw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-p47zh_openshift-ovn-kubernetes(a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 15:40:13 crc kubenswrapper[4730]:  > logger="UnhandledError"
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.844439    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError"
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.845150    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-n4w74" podUID="2ee8d55f-90bd-4484-8455-933de455efea"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.845350    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.845971    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lzk8j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError"
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.846019    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49"
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.846324    4730 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 15:40:13 crc kubenswrapper[4730]:         container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe
Mar 20 15:40:13 crc kubenswrapper[4730]:         if [[ -f "/env/_master" ]]; then
Mar 20 15:40:13 crc kubenswrapper[4730]:           set -o allexport
Mar 20 15:40:13 crc kubenswrapper[4730]:           source "/env/_master"
Mar 20 15:40:13 crc kubenswrapper[4730]:           set +o allexport
Mar 20 15:40:13 crc kubenswrapper[4730]:         fi
Mar 20 15:40:13 crc kubenswrapper[4730]:         
Mar 20 15:40:13 crc kubenswrapper[4730]:         ovn_v4_join_subnet_opt=
Mar 20 15:40:13 crc kubenswrapper[4730]:         if [[ "" != "" ]]; then
Mar 20 15:40:13 crc kubenswrapper[4730]:           ovn_v4_join_subnet_opt="--gateway-v4-join-subnet "
Mar 20 15:40:13 crc kubenswrapper[4730]:         fi
Mar 20 15:40:13 crc kubenswrapper[4730]:         ovn_v6_join_subnet_opt=
Mar 20 15:40:13 crc kubenswrapper[4730]:         if [[ "" != "" ]]; then
Mar 20 15:40:13 crc kubenswrapper[4730]:           ovn_v6_join_subnet_opt="--gateway-v6-join-subnet "
Mar 20 15:40:13 crc kubenswrapper[4730]:         fi
Mar 20 15:40:13 crc kubenswrapper[4730]:         
Mar 20 15:40:13 crc kubenswrapper[4730]:         ovn_v4_transit_switch_subnet_opt=
Mar 20 15:40:13 crc kubenswrapper[4730]:         if [[ "" != "" ]]; then
Mar 20 15:40:13 crc kubenswrapper[4730]:           ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet "
Mar 20 15:40:13 crc kubenswrapper[4730]:         fi
Mar 20 15:40:13 crc kubenswrapper[4730]:         ovn_v6_transit_switch_subnet_opt=
Mar 20 15:40:13 crc kubenswrapper[4730]:         if [[ "" != "" ]]; then
Mar 20 15:40:13 crc kubenswrapper[4730]:           ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet "
Mar 20 15:40:13 crc kubenswrapper[4730]:         fi
Mar 20 15:40:13 crc kubenswrapper[4730]:         
Mar 20 15:40:13 crc kubenswrapper[4730]:         dns_name_resolver_enabled_flag=
Mar 20 15:40:13 crc kubenswrapper[4730]:         if [[ "false" == "true" ]]; then
Mar 20 15:40:13 crc kubenswrapper[4730]:           dns_name_resolver_enabled_flag="--enable-dns-name-resolver"
Mar 20 15:40:13 crc kubenswrapper[4730]:         fi
Mar 20 15:40:13 crc kubenswrapper[4730]:         
Mar 20 15:40:13 crc kubenswrapper[4730]:         persistent_ips_enabled_flag=
Mar 20 15:40:13 crc kubenswrapper[4730]:         if [[ "true" == "true" ]]; then
Mar 20 15:40:13 crc kubenswrapper[4730]:           persistent_ips_enabled_flag="--enable-persistent-ips"
Mar 20 15:40:13 crc kubenswrapper[4730]:         fi
Mar 20 15:40:13 crc kubenswrapper[4730]:         
Mar 20 15:40:13 crc kubenswrapper[4730]:         # This is needed so that converting clusters from GA to TP
Mar 20 15:40:13 crc kubenswrapper[4730]:         # will rollout control plane pods as well
Mar 20 15:40:13 crc kubenswrapper[4730]:         network_segmentation_enabled_flag=
Mar 20 15:40:13 crc kubenswrapper[4730]:         multi_network_enabled_flag=
Mar 20 15:40:13 crc kubenswrapper[4730]:         if [[ "true" == "true" ]]; then
Mar 20 15:40:13 crc kubenswrapper[4730]:           multi_network_enabled_flag="--enable-multi-network"
Mar 20 15:40:13 crc kubenswrapper[4730]:           network_segmentation_enabled_flag="--enable-network-segmentation"
Mar 20 15:40:13 crc kubenswrapper[4730]:         fi
Mar 20 15:40:13 crc kubenswrapper[4730]:         
Mar 20 15:40:13 crc kubenswrapper[4730]:         echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}"
Mar 20 15:40:13 crc kubenswrapper[4730]:         exec /usr/bin/ovnkube \
Mar 20 15:40:13 crc kubenswrapper[4730]:           --enable-interconnect \
Mar 20 15:40:13 crc kubenswrapper[4730]:           --init-cluster-manager "${K8S_NODE}" \
Mar 20 15:40:13 crc kubenswrapper[4730]:           --config-file=/run/ovnkube-config/ovnkube.conf \
Mar 20 15:40:13 crc kubenswrapper[4730]:           --loglevel "${OVN_KUBE_LOG_LEVEL}" \
Mar 20 15:40:13 crc kubenswrapper[4730]:           --metrics-bind-address "127.0.0.1:29108" \
Mar 20 15:40:13 crc kubenswrapper[4730]:           --metrics-enable-pprof \
Mar 20 15:40:13 crc kubenswrapper[4730]:           --metrics-enable-config-duration \
Mar 20 15:40:13 crc kubenswrapper[4730]:           ${ovn_v4_join_subnet_opt} \
Mar 20 15:40:13 crc kubenswrapper[4730]:           ${ovn_v6_join_subnet_opt} \
Mar 20 15:40:13 crc kubenswrapper[4730]:           ${ovn_v4_transit_switch_subnet_opt} \
Mar 20 15:40:13 crc kubenswrapper[4730]:           ${ovn_v6_transit_switch_subnet_opt} \
Mar 20 15:40:13 crc kubenswrapper[4730]:           ${dns_name_resolver_enabled_flag} \
Mar 20 15:40:13 crc kubenswrapper[4730]:           ${persistent_ips_enabled_flag} \
Mar 20 15:40:13 crc kubenswrapper[4730]:           ${multi_network_enabled_flag} \
Mar 20 15:40:13 crc kubenswrapper[4730]:           ${network_segmentation_enabled_flag}
Mar 20 15:40:13 crc kubenswrapper[4730]:         ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d4xpw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-p47zh_openshift-ovn-kubernetes(a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 15:40:13 crc kubenswrapper[4730]:  > logger="UnhandledError"
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.848357    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" podUID="a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0"
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.849013    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lzk8j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.849965    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.849997    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.850009    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.850025    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.850038    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:13Z","lastTransitionTime":"2026-03-20T15:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:13 crc kubenswrapper[4730]: E0320 15:40:13.850339    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.856931    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.867313    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.874868    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.884313    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.899523    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.913912    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.923469    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.929922    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.938948    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.947803    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.952592    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.952637    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.952649    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.952671    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.952684    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:13Z","lastTransitionTime":"2026-03-20T15:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.955732    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.964348    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.976576    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:13 crc kubenswrapper[4730]: I0320 15:40:13.997083    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.009709    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.020057    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.029359    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.035787    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.043409    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.051834    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.055656    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.055705    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.055722    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.055745    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.055799    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:14Z","lastTransitionTime":"2026-03-20T15:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.062346    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.075806    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.084357    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.093136    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.105153    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.113016    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.121855    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.158725    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.158770    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.158784    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.158803    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.158818    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:14Z","lastTransitionTime":"2026-03-20T15:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.207720    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.207827    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:14 crc kubenswrapper[4730]: E0320 15:40:14.207847    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:40:16.207829264 +0000 UTC m=+75.421200633 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.207871    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.207896    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:40:14 crc kubenswrapper[4730]: E0320 15:40:14.207903    4730 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.207920    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:40:14 crc kubenswrapper[4730]: E0320 15:40:14.207941    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:40:16.207930947 +0000 UTC m=+75.421302316 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 15:40:14 crc kubenswrapper[4730]: E0320 15:40:14.208031    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 15:40:14 crc kubenswrapper[4730]: E0320 15:40:14.208048    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 15:40:14 crc kubenswrapper[4730]: E0320 15:40:14.208060    4730 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:40:14 crc kubenswrapper[4730]: E0320 15:40:14.208076    4730 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 15:40:14 crc kubenswrapper[4730]: E0320 15:40:14.208094    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 15:40:14 crc kubenswrapper[4730]: E0320 15:40:14.208129    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 15:40:14 crc kubenswrapper[4730]: E0320 15:40:14.208141    4730 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:40:14 crc kubenswrapper[4730]: E0320 15:40:14.208098    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 15:40:16.208088321 +0000 UTC m=+75.421459690 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:40:14 crc kubenswrapper[4730]: E0320 15:40:14.208176    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:40:16.208157413 +0000 UTC m=+75.421528872 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 15:40:14 crc kubenswrapper[4730]: E0320 15:40:14.208191    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 15:40:16.208183284 +0000 UTC m=+75.421554773 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.260659    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.260874    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.260958    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.261057    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.261140    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:14Z","lastTransitionTime":"2026-03-20T15:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.364510    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.364547    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.364557    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.364572    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.364582    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:14Z","lastTransitionTime":"2026-03-20T15:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.409768    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs\") pod \"network-metrics-daemon-2prfn\" (UID: \"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\") " pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:14 crc kubenswrapper[4730]: E0320 15:40:14.409982    4730 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 15:40:14 crc kubenswrapper[4730]: E0320 15:40:14.410065    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs podName:db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a nodeName:}" failed. No retries permitted until 2026-03-20 15:40:16.410042628 +0000 UTC m=+75.623414087 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs") pod "network-metrics-daemon-2prfn" (UID: "db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.466561    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.466610    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.466623    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.466642    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.466651    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:14Z","lastTransitionTime":"2026-03-20T15:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.533445    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.533479    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:40:14 crc kubenswrapper[4730]: E0320 15:40:14.533795    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.533538    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.533508    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:14 crc kubenswrapper[4730]: E0320 15:40:14.534138    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:40:14 crc kubenswrapper[4730]: E0320 15:40:14.534044    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:40:14 crc kubenswrapper[4730]: E0320 15:40:14.533950    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.546901    4730 scope.go:117] "RemoveContainer" containerID="688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5"
Mar 20 15:40:14 crc kubenswrapper[4730]: E0320 15:40:14.547108    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.547392    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.569657    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.569709    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.569723    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.569741    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.569754    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:14Z","lastTransitionTime":"2026-03-20T15:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.671621    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.672178    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.672278    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.672375    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.672474    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:14Z","lastTransitionTime":"2026-03-20T15:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.774227    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.774277    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.774290    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.774303    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.774312    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:14Z","lastTransitionTime":"2026-03-20T15:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.844617    4730 scope.go:117] "RemoveContainer" containerID="688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5"
Mar 20 15:40:14 crc kubenswrapper[4730]: E0320 15:40:14.844771    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.876812    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.876849    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.876859    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.876871    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.876881    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:14Z","lastTransitionTime":"2026-03-20T15:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.979709    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.980041    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.980172    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.980313    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:14 crc kubenswrapper[4730]: I0320 15:40:14.980418    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:14Z","lastTransitionTime":"2026-03-20T15:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.083152    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.083198    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.083209    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.083225    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.083236    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:15Z","lastTransitionTime":"2026-03-20T15:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.185529    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.185621    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.185655    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.185677    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.185688    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:15Z","lastTransitionTime":"2026-03-20T15:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.287954    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.288011    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.288024    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.288050    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.288062    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:15Z","lastTransitionTime":"2026-03-20T15:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.390262    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.390483    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.390586    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.390675    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.390764    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:15Z","lastTransitionTime":"2026-03-20T15:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.493853    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.493906    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.493919    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.493939    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.493959    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:15Z","lastTransitionTime":"2026-03-20T15:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.596182    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.596231    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.596263    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.596281    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.596292    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:15Z","lastTransitionTime":"2026-03-20T15:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.698805    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.699423    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.699545    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.699645    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.699823    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:15Z","lastTransitionTime":"2026-03-20T15:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.803537    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.803868    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.803996    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.804126    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.804242    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:15Z","lastTransitionTime":"2026-03-20T15:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.907497    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.908105    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.908395    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.908619    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:15 crc kubenswrapper[4730]: I0320 15:40:15.908814    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:15Z","lastTransitionTime":"2026-03-20T15:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.011911    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.011950    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.011959    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.011974    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.011983    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:16Z","lastTransitionTime":"2026-03-20T15:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.115547    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.115603    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.115620    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.115642    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.115658    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:16Z","lastTransitionTime":"2026-03-20T15:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.219312    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.219439    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.219460    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.219545    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.219575    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:16Z","lastTransitionTime":"2026-03-20T15:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.231677    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:40:16 crc kubenswrapper[4730]: E0320 15:40:16.231886    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:40:20.231856215 +0000 UTC m=+79.445227614 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.231935    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.232008    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.232059    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.232114    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:40:16 crc kubenswrapper[4730]: E0320 15:40:16.232325    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 15:40:16 crc kubenswrapper[4730]: E0320 15:40:16.232349    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 15:40:16 crc kubenswrapper[4730]: E0320 15:40:16.232367    4730 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:40:16 crc kubenswrapper[4730]: E0320 15:40:16.232418    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 15:40:20.23240374 +0000 UTC m=+79.445775139 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:40:16 crc kubenswrapper[4730]: E0320 15:40:16.232537    4730 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 15:40:16 crc kubenswrapper[4730]: E0320 15:40:16.232612    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:40:20.232590825 +0000 UTC m=+79.445962224 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 15:40:16 crc kubenswrapper[4730]: E0320 15:40:16.232620    4730 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 15:40:16 crc kubenswrapper[4730]: E0320 15:40:16.232641    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 15:40:16 crc kubenswrapper[4730]: E0320 15:40:16.232682    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 15:40:16 crc kubenswrapper[4730]: E0320 15:40:16.232701    4730 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:40:16 crc kubenswrapper[4730]: E0320 15:40:16.232737    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:40:20.232700838 +0000 UTC m=+79.446072247 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 15:40:16 crc kubenswrapper[4730]: E0320 15:40:16.232777    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 15:40:20.23275198 +0000 UTC m=+79.446123379 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.322440    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.322526    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.322544    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.322599    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.322615    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:16Z","lastTransitionTime":"2026-03-20T15:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.425418    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.425733    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.425875    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.426034    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.426137    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:16Z","lastTransitionTime":"2026-03-20T15:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.434151    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs\") pod \"network-metrics-daemon-2prfn\" (UID: \"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\") " pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:16 crc kubenswrapper[4730]: E0320 15:40:16.434373    4730 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 15:40:16 crc kubenswrapper[4730]: E0320 15:40:16.434674    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs podName:db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a nodeName:}" failed. No retries permitted until 2026-03-20 15:40:20.434641685 +0000 UTC m=+79.648013094 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs") pod "network-metrics-daemon-2prfn" (UID: "db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.528983    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.529219    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.529296    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.529390    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.529464    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:16Z","lastTransitionTime":"2026-03-20T15:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.532529    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.532581    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:40:16 crc kubenswrapper[4730]: E0320 15:40:16.532667    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.532730    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.532953    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:16 crc kubenswrapper[4730]: E0320 15:40:16.532944    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:40:16 crc kubenswrapper[4730]: E0320 15:40:16.533064    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:40:16 crc kubenswrapper[4730]: E0320 15:40:16.534008    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.633164    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.633227    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.633279    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.633309    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.633330    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:16Z","lastTransitionTime":"2026-03-20T15:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.736665    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.736724    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.736744    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.736767    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.736786    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:16Z","lastTransitionTime":"2026-03-20T15:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.839490    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.839569    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.839584    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.839606    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.839620    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:16Z","lastTransitionTime":"2026-03-20T15:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.942653    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.942712    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.942746    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.942776    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:16 crc kubenswrapper[4730]: I0320 15:40:16.942798    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:16Z","lastTransitionTime":"2026-03-20T15:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.046294    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.046365    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.046381    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.046406    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.046424    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:17Z","lastTransitionTime":"2026-03-20T15:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.149701    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.149777    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.149802    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.149835    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.149858    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:17Z","lastTransitionTime":"2026-03-20T15:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.252961    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.253008    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.253044    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.253068    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.253080    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:17Z","lastTransitionTime":"2026-03-20T15:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.356013    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.356093    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.356116    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.356140    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.356156    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:17Z","lastTransitionTime":"2026-03-20T15:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.459718    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.459768    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.459785    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.459809    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.459830    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:17Z","lastTransitionTime":"2026-03-20T15:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.562707    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.562765    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.562783    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.562808    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.562830    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:17Z","lastTransitionTime":"2026-03-20T15:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.666316    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.666381    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.666405    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.666437    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.666461    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:17Z","lastTransitionTime":"2026-03-20T15:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.768658    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.768685    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.768694    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.768705    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.768715    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:17Z","lastTransitionTime":"2026-03-20T15:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.872346    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.872420    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.872440    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.872467    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.872486    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:17Z","lastTransitionTime":"2026-03-20T15:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.975565    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.975618    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.975649    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.975676    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:17 crc kubenswrapper[4730]: I0320 15:40:17.975694    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:17Z","lastTransitionTime":"2026-03-20T15:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.079533    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.079598    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.079621    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.079652    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.079722    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:18Z","lastTransitionTime":"2026-03-20T15:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.182480    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.182511    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.182519    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.182533    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.182543    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:18Z","lastTransitionTime":"2026-03-20T15:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.286078    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.286129    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.286141    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.286159    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.286171    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:18Z","lastTransitionTime":"2026-03-20T15:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.388887    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.388945    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.388967    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.388998    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.389023    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:18Z","lastTransitionTime":"2026-03-20T15:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.491969    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.492032    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.492050    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.492075    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.492092    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:18Z","lastTransitionTime":"2026-03-20T15:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.532963    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:18 crc kubenswrapper[4730]: E0320 15:40:18.533201    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.533483    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.533553    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:18 crc kubenswrapper[4730]: E0320 15:40:18.533755    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.533801    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:40:18 crc kubenswrapper[4730]: E0320 15:40:18.533879    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:40:18 crc kubenswrapper[4730]: E0320 15:40:18.533963    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.595213    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.595263    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.595272    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.595288    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.595297    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:18Z","lastTransitionTime":"2026-03-20T15:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.698179    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.698294    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.698316    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.698345    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.698363    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:18Z","lastTransitionTime":"2026-03-20T15:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.801910    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.801980    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.801999    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.802024    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.802041    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:18Z","lastTransitionTime":"2026-03-20T15:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.904404    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.904457    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.904475    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.904497    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:18 crc kubenswrapper[4730]: I0320 15:40:18.904514    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:18Z","lastTransitionTime":"2026-03-20T15:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.007427    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.007520    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.007549    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.007577    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.007600    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:19Z","lastTransitionTime":"2026-03-20T15:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.110155    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.110196    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.110208    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.110224    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.110236    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:19Z","lastTransitionTime":"2026-03-20T15:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.213670    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.213750    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.213774    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.213804    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.213830    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:19Z","lastTransitionTime":"2026-03-20T15:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.317743    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.317798    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.317813    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.317832    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.318032    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:19Z","lastTransitionTime":"2026-03-20T15:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.420817    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.421130    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.421282    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.421444    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.421566    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:19Z","lastTransitionTime":"2026-03-20T15:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.523970    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.524031    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.524056    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.524083    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.524108    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:19Z","lastTransitionTime":"2026-03-20T15:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.627117    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.627197    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.627236    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.627306    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.627332    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:19Z","lastTransitionTime":"2026-03-20T15:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.730211    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.730289    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.730299    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.730315    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.730326    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:19Z","lastTransitionTime":"2026-03-20T15:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.833176    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.833216    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.833230    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.833270    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.833286    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:19Z","lastTransitionTime":"2026-03-20T15:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.935718    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.935784    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.935804    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.935833    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:19 crc kubenswrapper[4730]: I0320 15:40:19.935898    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:19Z","lastTransitionTime":"2026-03-20T15:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.037875    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.037937    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.037955    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.037978    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.037995    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:20Z","lastTransitionTime":"2026-03-20T15:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.141834    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.142190    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.142424    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.142614    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.142787    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:20Z","lastTransitionTime":"2026-03-20T15:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.244854    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.244903    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.244922    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.244944    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.244962    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:20Z","lastTransitionTime":"2026-03-20T15:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.275781    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:40:20 crc kubenswrapper[4730]: E0320 15:40:20.276012    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:40:28.275973832 +0000 UTC m=+87.489345241 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.276282    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.276398    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.276459    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.276504    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:40:20 crc kubenswrapper[4730]: E0320 15:40:20.276510    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 15:40:20 crc kubenswrapper[4730]: E0320 15:40:20.276559    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 15:40:20 crc kubenswrapper[4730]: E0320 15:40:20.276581    4730 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:40:20 crc kubenswrapper[4730]: E0320 15:40:20.276655    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 15:40:28.27663012 +0000 UTC m=+87.490001529 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:40:20 crc kubenswrapper[4730]: E0320 15:40:20.276673    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 15:40:20 crc kubenswrapper[4730]: E0320 15:40:20.276698    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 15:40:20 crc kubenswrapper[4730]: E0320 15:40:20.276720    4730 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:40:20 crc kubenswrapper[4730]: E0320 15:40:20.276781    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 15:40:28.276763184 +0000 UTC m=+87.490134593 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:40:20 crc kubenswrapper[4730]: E0320 15:40:20.276841    4730 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 15:40:20 crc kubenswrapper[4730]: E0320 15:40:20.276882    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:40:28.276870197 +0000 UTC m=+87.490241606 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 15:40:20 crc kubenswrapper[4730]: E0320 15:40:20.276954    4730 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 15:40:20 crc kubenswrapper[4730]: E0320 15:40:20.276993    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:40:28.27698121 +0000 UTC m=+87.490352619 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.348279    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.348329    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.348344    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.348366    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.348382    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:20Z","lastTransitionTime":"2026-03-20T15:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.451623    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.451698    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.451720    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.451743    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.451762    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:20Z","lastTransitionTime":"2026-03-20T15:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.478675    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs\") pod \"network-metrics-daemon-2prfn\" (UID: \"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\") " pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:20 crc kubenswrapper[4730]: E0320 15:40:20.478936    4730 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 15:40:20 crc kubenswrapper[4730]: E0320 15:40:20.479049    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs podName:db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a nodeName:}" failed. No retries permitted until 2026-03-20 15:40:28.479022129 +0000 UTC m=+87.692393538 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs") pod "network-metrics-daemon-2prfn" (UID: "db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.532150    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.532275    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:40:20 crc kubenswrapper[4730]: E0320 15:40:20.532430    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.532476    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:40:20 crc kubenswrapper[4730]: E0320 15:40:20.532578    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:40:20 crc kubenswrapper[4730]: E0320 15:40:20.532684    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.532933    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:20 crc kubenswrapper[4730]: E0320 15:40:20.533400    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.555505    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.555944    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.556410    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.556702    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.556929    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:20Z","lastTransitionTime":"2026-03-20T15:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.660659    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.660960    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.661039    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.661140    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.661217    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:20Z","lastTransitionTime":"2026-03-20T15:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.764299    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.764362    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.764382    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.764412    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.764437    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:20Z","lastTransitionTime":"2026-03-20T15:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.866768    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.866806    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.866816    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.866833    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.866843    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:20Z","lastTransitionTime":"2026-03-20T15:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.969819    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.969858    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.969870    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.969886    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:20 crc kubenswrapper[4730]: I0320 15:40:20.969898    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:20Z","lastTransitionTime":"2026-03-20T15:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.072857    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.072926    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.072945    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.072970    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.072988    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:21Z","lastTransitionTime":"2026-03-20T15:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.176112    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.176199    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.176224    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.176292    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.176311    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:21Z","lastTransitionTime":"2026-03-20T15:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.279014    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.279076    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.279093    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.279117    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.279135    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:21Z","lastTransitionTime":"2026-03-20T15:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.381211    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.381269    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.381283    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.381298    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.381307    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:21Z","lastTransitionTime":"2026-03-20T15:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.483525    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.483560    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.483576    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.483594    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.483606    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:21Z","lastTransitionTime":"2026-03-20T15:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.541820    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.553408    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.565331    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68
3393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.575675    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.585739    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.585892    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.586021    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.586142    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.586354    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:21Z","lastTransitionTime":"2026-03-20T15:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.588604    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.598230    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.605010    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.616997    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.628829    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.638968    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.653182    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.663991    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.675067    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.689435    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.689499    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.689518    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.689545    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.689562    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:21Z","lastTransitionTime":"2026-03-20T15:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.689808    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.713382    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.792601    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.792628    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.792638    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.792650    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.792659    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:21Z","lastTransitionTime":"2026-03-20T15:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.894427    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.894463    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.894473    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.894489    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.894501    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:21Z","lastTransitionTime":"2026-03-20T15:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.996566    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.996607    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.996616    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.996634    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:21 crc kubenswrapper[4730]: I0320 15:40:21.996643    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:21Z","lastTransitionTime":"2026-03-20T15:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.099157    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.099478    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.099575    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.099664    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.099751    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:22Z","lastTransitionTime":"2026-03-20T15:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.202843    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.202872    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.202880    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.202894    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.202903    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:22Z","lastTransitionTime":"2026-03-20T15:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.305842    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.305873    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.305884    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.305906    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.305922    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:22Z","lastTransitionTime":"2026-03-20T15:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.408963    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.408999    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.409008    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.409022    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.409033    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:22Z","lastTransitionTime":"2026-03-20T15:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.511622    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.511682    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.511705    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.511731    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.511750    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:22Z","lastTransitionTime":"2026-03-20T15:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.533045    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.533091    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.533199    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:22 crc kubenswrapper[4730]: E0320 15:40:22.533224    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:40:22 crc kubenswrapper[4730]: E0320 15:40:22.533433    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:40:22 crc kubenswrapper[4730]: E0320 15:40:22.533545    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.533719    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:40:22 crc kubenswrapper[4730]: E0320 15:40:22.535971    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.539879    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.615242    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.615609    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.615787    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.615945    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.616116    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:22Z","lastTransitionTime":"2026-03-20T15:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.720229    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.720703    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.721032    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.721228    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.721451    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:22Z","lastTransitionTime":"2026-03-20T15:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.824396    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.824470    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.824489    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.824514    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.824530    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:22Z","lastTransitionTime":"2026-03-20T15:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.914129    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.914986    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.915147    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.915293    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.915395    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:22Z","lastTransitionTime":"2026-03-20T15:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:22 crc kubenswrapper[4730]: E0320 15:40:22.925609    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.930310    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.930370    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.930390    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.930416    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.930438    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:22Z","lastTransitionTime":"2026-03-20T15:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:22 crc kubenswrapper[4730]: E0320 15:40:22.945046    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.949221    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.949281    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.949294    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.949311    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.949323    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:22Z","lastTransitionTime":"2026-03-20T15:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:22 crc kubenswrapper[4730]: E0320 15:40:22.958714    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.962684    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.962712    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.962725    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.962747    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.962761    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:22Z","lastTransitionTime":"2026-03-20T15:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:22 crc kubenswrapper[4730]: E0320 15:40:22.974524    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.978555    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.978598    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.978609    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.978626    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.978641    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:22Z","lastTransitionTime":"2026-03-20T15:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:22 crc kubenswrapper[4730]: E0320 15:40:22.989500    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:22 crc kubenswrapper[4730]: E0320 15:40:22.989642    4730 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.991400    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.991461    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.991472    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.991489    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:22 crc kubenswrapper[4730]: I0320 15:40:22.991504    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:22Z","lastTransitionTime":"2026-03-20T15:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.094412    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.094496    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.094523    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.094551    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.094571    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:23Z","lastTransitionTime":"2026-03-20T15:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.197823    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.197884    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.197910    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.197938    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.197961    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:23Z","lastTransitionTime":"2026-03-20T15:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.300576    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.300609    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.300622    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.300639    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.300650    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:23Z","lastTransitionTime":"2026-03-20T15:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.403342    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.403419    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.403478    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.403505    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.403521    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:23Z","lastTransitionTime":"2026-03-20T15:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.505640    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.505677    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.505689    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.505705    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.505716    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:23Z","lastTransitionTime":"2026-03-20T15:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.607699    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.608009    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.608145    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.608330    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.608480    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:23Z","lastTransitionTime":"2026-03-20T15:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.711294    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.711330    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.711342    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.711357    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.711369    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:23Z","lastTransitionTime":"2026-03-20T15:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.813653    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.813693    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.813705    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.813722    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.813732    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:23Z","lastTransitionTime":"2026-03-20T15:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.916549    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.917043    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.917119    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.917309    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:23 crc kubenswrapper[4730]: I0320 15:40:23.917401    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:23Z","lastTransitionTime":"2026-03-20T15:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.020389    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.020447    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.020472    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.020505    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.020518    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:24Z","lastTransitionTime":"2026-03-20T15:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.123295    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.123624    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.123720    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.123795    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.123867    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:24Z","lastTransitionTime":"2026-03-20T15:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.226653    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.226719    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.226738    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.226762    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.226779    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:24Z","lastTransitionTime":"2026-03-20T15:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.329150    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.329211    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.329233    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.329291    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.329313    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:24Z","lastTransitionTime":"2026-03-20T15:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.432751    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.432820    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.432859    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.432893    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.432913    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:24Z","lastTransitionTime":"2026-03-20T15:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.532138    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:24 crc kubenswrapper[4730]: E0320 15:40:24.532566    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.532208    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:40:24 crc kubenswrapper[4730]: E0320 15:40:24.532879    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.532150    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:24 crc kubenswrapper[4730]: E0320 15:40:24.533238    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.532291    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:40:24 crc kubenswrapper[4730]: E0320 15:40:24.533603    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.535894    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.536056    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.536174    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.536310    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.536408    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:24Z","lastTransitionTime":"2026-03-20T15:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.639840    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.640177    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.640449    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.640692    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.640918    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:24Z","lastTransitionTime":"2026-03-20T15:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.743860    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.744168    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.744421    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.744669    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.744885    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:24Z","lastTransitionTime":"2026-03-20T15:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.847666    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.847707    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.847718    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.847733    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.847744    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:24Z","lastTransitionTime":"2026-03-20T15:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.950485    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.950532    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.950544    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.950564    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:24 crc kubenswrapper[4730]: I0320 15:40:24.950575    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:24Z","lastTransitionTime":"2026-03-20T15:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.053586    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.053628    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.053639    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.053654    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.053665    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:25Z","lastTransitionTime":"2026-03-20T15:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.156313    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.156355    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.156364    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.156380    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.156391    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:25Z","lastTransitionTime":"2026-03-20T15:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.258460    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.258498    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.258511    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.258526    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.258539    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:25Z","lastTransitionTime":"2026-03-20T15:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.361026    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.361087    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.361108    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.361132    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.361145    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:25Z","lastTransitionTime":"2026-03-20T15:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.463395    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.463467    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.463480    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.463495    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.463507    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:25Z","lastTransitionTime":"2026-03-20T15:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:25 crc kubenswrapper[4730]: E0320 15:40:25.534604    4730 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 15:40:25 crc kubenswrapper[4730]:         container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe
Mar 20 15:40:25 crc kubenswrapper[4730]:         if [[ -f "/env/_master" ]]; then
Mar 20 15:40:25 crc kubenswrapper[4730]:           set -o allexport
Mar 20 15:40:25 crc kubenswrapper[4730]:           source "/env/_master"
Mar 20 15:40:25 crc kubenswrapper[4730]:           set +o allexport
Mar 20 15:40:25 crc kubenswrapper[4730]:         fi
Mar 20 15:40:25 crc kubenswrapper[4730]:         # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled.
Mar 20 15:40:25 crc kubenswrapper[4730]:         # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791
Mar 20 15:40:25 crc kubenswrapper[4730]:         ho_enable="--enable-hybrid-overlay"
Mar 20 15:40:25 crc kubenswrapper[4730]:         echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook"
Mar 20 15:40:25 crc kubenswrapper[4730]:         # extra-allowed-user: service account `ovn-kubernetes-control-plane`
Mar 20 15:40:25 crc kubenswrapper[4730]:         # sets pod annotations in multi-homing layer3 network controller (cluster-manager)
Mar 20 15:40:25 crc kubenswrapper[4730]:         exec /usr/bin/ovnkube-identity  --k8s-apiserver=https://api-int.crc.testing:6443 \
Mar 20 15:40:25 crc kubenswrapper[4730]:             --webhook-cert-dir="/etc/webhook-cert" \
Mar 20 15:40:25 crc kubenswrapper[4730]:             --webhook-host=127.0.0.1 \
Mar 20 15:40:25 crc kubenswrapper[4730]:             --webhook-port=9743 \
Mar 20 15:40:25 crc kubenswrapper[4730]:             ${ho_enable} \
Mar 20 15:40:25 crc kubenswrapper[4730]:             --enable-interconnect \
Mar 20 15:40:25 crc kubenswrapper[4730]:             --disable-approver \
Mar 20 15:40:25 crc kubenswrapper[4730]:             --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \
Mar 20 15:40:25 crc kubenswrapper[4730]:             --wait-for-kubernetes-api=200s \
Mar 20 15:40:25 crc kubenswrapper[4730]:             --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \
Mar 20 15:40:25 crc kubenswrapper[4730]:             --loglevel="${LOGLEVEL}"
Mar 20 15:40:25 crc kubenswrapper[4730]:         ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 15:40:25 crc kubenswrapper[4730]:  > logger="UnhandledError"
Mar 20 15:40:25 crc kubenswrapper[4730]: E0320 15:40:25.534943    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4qtg2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-49hht_openshift-multus(dbb015c0-3a11-48bf-a59f-22bc03ca2fb9): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError"
Mar 20 15:40:25 crc kubenswrapper[4730]: E0320 15:40:25.535656    4730 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 15:40:25 crc kubenswrapper[4730]:         container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash
Mar 20 15:40:25 crc kubenswrapper[4730]:         set -euo pipefail
Mar 20 15:40:25 crc kubenswrapper[4730]:         TLS_PK=/etc/pki/tls/metrics-cert/tls.key
Mar 20 15:40:25 crc kubenswrapper[4730]:         TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt
Mar 20 15:40:25 crc kubenswrapper[4730]:         # As the secret mount is optional we must wait for the files to be present.
Mar 20 15:40:25 crc kubenswrapper[4730]:         # The service is created in monitor.yaml and this is created in sdn.yaml.
Mar 20 15:40:25 crc kubenswrapper[4730]:         TS=$(date +%s)
Mar 20 15:40:25 crc kubenswrapper[4730]:         WARN_TS=$(( ${TS} + $(( 20 * 60)) ))
Mar 20 15:40:25 crc kubenswrapper[4730]:         HAS_LOGGED_INFO=0
Mar 20 15:40:25 crc kubenswrapper[4730]:         
Mar 20 15:40:25 crc kubenswrapper[4730]:         log_missing_certs(){
Mar 20 15:40:25 crc kubenswrapper[4730]:             CUR_TS=$(date +%s)
Mar 20 15:40:25 crc kubenswrapper[4730]:             if [[ "${CUR_TS}" -gt "WARN_TS"  ]]; then
Mar 20 15:40:25 crc kubenswrapper[4730]:               echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes.
Mar 20 15:40:25 crc kubenswrapper[4730]:             elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then
Mar 20 15:40:25 crc kubenswrapper[4730]:               echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes.
Mar 20 15:40:25 crc kubenswrapper[4730]:               HAS_LOGGED_INFO=1
Mar 20 15:40:25 crc kubenswrapper[4730]:             fi
Mar 20 15:40:25 crc kubenswrapper[4730]:         }
Mar 20 15:40:25 crc kubenswrapper[4730]:         while [[ ! -f "${TLS_PK}" ||  ! -f "${TLS_CERT}" ]] ; do
Mar 20 15:40:25 crc kubenswrapper[4730]:           log_missing_certs
Mar 20 15:40:25 crc kubenswrapper[4730]:           sleep 5
Mar 20 15:40:25 crc kubenswrapper[4730]:         done
Mar 20 15:40:25 crc kubenswrapper[4730]:         
Mar 20 15:40:25 crc kubenswrapper[4730]:         echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy
Mar 20 15:40:25 crc kubenswrapper[4730]:         exec /usr/bin/kube-rbac-proxy \
Mar 20 15:40:25 crc kubenswrapper[4730]:           --logtostderr \
Mar 20 15:40:25 crc kubenswrapper[4730]:           --secure-listen-address=:9108 \
Mar 20 15:40:25 crc kubenswrapper[4730]:           --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \
Mar 20 15:40:25 crc kubenswrapper[4730]:           --upstream=http://127.0.0.1:29108/ \
Mar 20 15:40:25 crc kubenswrapper[4730]:           --tls-private-key-file=${TLS_PK} \
Mar 20 15:40:25 crc kubenswrapper[4730]:           --tls-cert-file=${TLS_CERT}
Mar 20 15:40:25 crc kubenswrapper[4730]:         ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d4xpw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-p47zh_openshift-ovn-kubernetes(a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 15:40:25 crc kubenswrapper[4730]:  > logger="UnhandledError"
Mar 20 15:40:25 crc kubenswrapper[4730]: E0320 15:40:25.536056    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-49hht" podUID="dbb015c0-3a11-48bf-a59f-22bc03ca2fb9"
Mar 20 15:40:25 crc kubenswrapper[4730]: E0320 15:40:25.536571    4730 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 15:40:25 crc kubenswrapper[4730]:         container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe
Mar 20 15:40:25 crc kubenswrapper[4730]:         if [[ -f "/env/_master" ]]; then
Mar 20 15:40:25 crc kubenswrapper[4730]:           set -o allexport
Mar 20 15:40:25 crc kubenswrapper[4730]:           source "/env/_master"
Mar 20 15:40:25 crc kubenswrapper[4730]:           set +o allexport
Mar 20 15:40:25 crc kubenswrapper[4730]:         fi
Mar 20 15:40:25 crc kubenswrapper[4730]:         
Mar 20 15:40:25 crc kubenswrapper[4730]:         echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver"
Mar 20 15:40:25 crc kubenswrapper[4730]:         exec /usr/bin/ovnkube-identity  --k8s-apiserver=https://api-int.crc.testing:6443 \
Mar 20 15:40:25 crc kubenswrapper[4730]:             --disable-webhook \
Mar 20 15:40:25 crc kubenswrapper[4730]:             --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \
Mar 20 15:40:25 crc kubenswrapper[4730]:             --loglevel="${LOGLEVEL}"
Mar 20 15:40:25 crc kubenswrapper[4730]:         ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 15:40:25 crc kubenswrapper[4730]:  > logger="UnhandledError"
Mar 20 15:40:25 crc kubenswrapper[4730]: E0320 15:40:25.537750    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d"
Mar 20 15:40:25 crc kubenswrapper[4730]: E0320 15:40:25.539788    4730 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 15:40:25 crc kubenswrapper[4730]:         container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe
Mar 20 15:40:25 crc kubenswrapper[4730]:         if [[ -f "/env/_master" ]]; then
Mar 20 15:40:25 crc kubenswrapper[4730]:           set -o allexport
Mar 20 15:40:25 crc kubenswrapper[4730]:           source "/env/_master"
Mar 20 15:40:25 crc kubenswrapper[4730]:           set +o allexport
Mar 20 15:40:25 crc kubenswrapper[4730]:         fi
Mar 20 15:40:25 crc kubenswrapper[4730]:         
Mar 20 15:40:25 crc kubenswrapper[4730]:         ovn_v4_join_subnet_opt=
Mar 20 15:40:25 crc kubenswrapper[4730]:         if [[ "" != "" ]]; then
Mar 20 15:40:25 crc kubenswrapper[4730]:           ovn_v4_join_subnet_opt="--gateway-v4-join-subnet "
Mar 20 15:40:25 crc kubenswrapper[4730]:         fi
Mar 20 15:40:25 crc kubenswrapper[4730]:         ovn_v6_join_subnet_opt=
Mar 20 15:40:25 crc kubenswrapper[4730]:         if [[ "" != "" ]]; then
Mar 20 15:40:25 crc kubenswrapper[4730]:           ovn_v6_join_subnet_opt="--gateway-v6-join-subnet "
Mar 20 15:40:25 crc kubenswrapper[4730]:         fi
Mar 20 15:40:25 crc kubenswrapper[4730]:         
Mar 20 15:40:25 crc kubenswrapper[4730]:         ovn_v4_transit_switch_subnet_opt=
Mar 20 15:40:25 crc kubenswrapper[4730]:         if [[ "" != "" ]]; then
Mar 20 15:40:25 crc kubenswrapper[4730]:           ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet "
Mar 20 15:40:25 crc kubenswrapper[4730]:         fi
Mar 20 15:40:25 crc kubenswrapper[4730]:         ovn_v6_transit_switch_subnet_opt=
Mar 20 15:40:25 crc kubenswrapper[4730]:         if [[ "" != "" ]]; then
Mar 20 15:40:25 crc kubenswrapper[4730]:           ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet "
Mar 20 15:40:25 crc kubenswrapper[4730]:         fi
Mar 20 15:40:25 crc kubenswrapper[4730]:         
Mar 20 15:40:25 crc kubenswrapper[4730]:         dns_name_resolver_enabled_flag=
Mar 20 15:40:25 crc kubenswrapper[4730]:         if [[ "false" == "true" ]]; then
Mar 20 15:40:25 crc kubenswrapper[4730]:           dns_name_resolver_enabled_flag="--enable-dns-name-resolver"
Mar 20 15:40:25 crc kubenswrapper[4730]:         fi
Mar 20 15:40:25 crc kubenswrapper[4730]:         
Mar 20 15:40:25 crc kubenswrapper[4730]:         persistent_ips_enabled_flag=
Mar 20 15:40:25 crc kubenswrapper[4730]:         if [[ "true" == "true" ]]; then
Mar 20 15:40:25 crc kubenswrapper[4730]:           persistent_ips_enabled_flag="--enable-persistent-ips"
Mar 20 15:40:25 crc kubenswrapper[4730]:         fi
Mar 20 15:40:25 crc kubenswrapper[4730]:         
Mar 20 15:40:25 crc kubenswrapper[4730]:         # This is needed so that converting clusters from GA to TP
Mar 20 15:40:25 crc kubenswrapper[4730]:         # will rollout control plane pods as well
Mar 20 15:40:25 crc kubenswrapper[4730]:         network_segmentation_enabled_flag=
Mar 20 15:40:25 crc kubenswrapper[4730]:         multi_network_enabled_flag=
Mar 20 15:40:25 crc kubenswrapper[4730]:         if [[ "true" == "true" ]]; then
Mar 20 15:40:25 crc kubenswrapper[4730]:           multi_network_enabled_flag="--enable-multi-network"
Mar 20 15:40:25 crc kubenswrapper[4730]:           network_segmentation_enabled_flag="--enable-network-segmentation"
Mar 20 15:40:25 crc kubenswrapper[4730]:         fi
Mar 20 15:40:25 crc kubenswrapper[4730]:         
Mar 20 15:40:25 crc kubenswrapper[4730]:         echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}"
Mar 20 15:40:25 crc kubenswrapper[4730]:         exec /usr/bin/ovnkube \
Mar 20 15:40:25 crc kubenswrapper[4730]:           --enable-interconnect \
Mar 20 15:40:25 crc kubenswrapper[4730]:           --init-cluster-manager "${K8S_NODE}" \
Mar 20 15:40:25 crc kubenswrapper[4730]:           --config-file=/run/ovnkube-config/ovnkube.conf \
Mar 20 15:40:25 crc kubenswrapper[4730]:           --loglevel "${OVN_KUBE_LOG_LEVEL}" \
Mar 20 15:40:25 crc kubenswrapper[4730]:           --metrics-bind-address "127.0.0.1:29108" \
Mar 20 15:40:25 crc kubenswrapper[4730]:           --metrics-enable-pprof \
Mar 20 15:40:25 crc kubenswrapper[4730]:           --metrics-enable-config-duration \
Mar 20 15:40:25 crc kubenswrapper[4730]:           ${ovn_v4_join_subnet_opt} \
Mar 20 15:40:25 crc kubenswrapper[4730]:           ${ovn_v6_join_subnet_opt} \
Mar 20 15:40:25 crc kubenswrapper[4730]:           ${ovn_v4_transit_switch_subnet_opt} \
Mar 20 15:40:25 crc kubenswrapper[4730]:           ${ovn_v6_transit_switch_subnet_opt} \
Mar 20 15:40:25 crc kubenswrapper[4730]:           ${dns_name_resolver_enabled_flag} \
Mar 20 15:40:25 crc kubenswrapper[4730]:           ${persistent_ips_enabled_flag} \
Mar 20 15:40:25 crc kubenswrapper[4730]:           ${multi_network_enabled_flag} \
Mar 20 15:40:25 crc kubenswrapper[4730]:           ${network_segmentation_enabled_flag}
Mar 20 15:40:25 crc kubenswrapper[4730]:         ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d4xpw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-p47zh_openshift-ovn-kubernetes(a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 15:40:25 crc kubenswrapper[4730]:  > logger="UnhandledError"
Mar 20 15:40:25 crc kubenswrapper[4730]: E0320 15:40:25.541196    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" podUID="a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.566444    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.566488    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.566510    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.566526    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.566536    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:25Z","lastTransitionTime":"2026-03-20T15:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.668613    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.668851    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.669012    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.669240    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.669435    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:25Z","lastTransitionTime":"2026-03-20T15:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.780056    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.780088    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.780099    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.780113    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.780482    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:25Z","lastTransitionTime":"2026-03-20T15:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.882422    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.882472    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.882487    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.882503    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.882513    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:25Z","lastTransitionTime":"2026-03-20T15:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.984431    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.984918    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.985006    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.985088    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:25 crc kubenswrapper[4730]: I0320 15:40:25.985166    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:25Z","lastTransitionTime":"2026-03-20T15:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.087778    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.087804    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.087816    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.087829    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.087837    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:26Z","lastTransitionTime":"2026-03-20T15:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.190822    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.191099    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.191217    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.191323    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.191436    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:26Z","lastTransitionTime":"2026-03-20T15:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.294504    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.294573    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.294594    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.294619    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.294646    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:26Z","lastTransitionTime":"2026-03-20T15:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.397241    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.397314    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.397324    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.397338    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.397347    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:26Z","lastTransitionTime":"2026-03-20T15:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.507677    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.508136    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.508293    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.508434    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.508568    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:26Z","lastTransitionTime":"2026-03-20T15:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.533441    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.533465    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.533628    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.533793    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:26 crc kubenswrapper[4730]: E0320 15:40:26.534007    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:40:26 crc kubenswrapper[4730]: E0320 15:40:26.534183    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:40:26 crc kubenswrapper[4730]: E0320 15:40:26.534286    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:40:26 crc kubenswrapper[4730]: E0320 15:40:26.534382    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:40:26 crc kubenswrapper[4730]: E0320 15:40:26.536584    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in 
pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError"
Mar 20 15:40:26 crc kubenswrapper[4730]: E0320 15:40:26.537967    4730 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 15:40:26 crc kubenswrapper[4730]:         container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT=""
Mar 20 15:40:26 crc kubenswrapper[4730]:         /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT
Mar 20 15:40:26 crc kubenswrapper[4730]:         ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vvthz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-6r2kn_openshift-multus(6f97b1f1-1fad-44ec-8253-17dd6a5eee54): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 15:40:26 crc kubenswrapper[4730]:  > logger="UnhandledError"
Mar 20 15:40:26 crc kubenswrapper[4730]: E0320 15:40:26.538272    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49"
Mar 20 15:40:26 crc kubenswrapper[4730]: E0320 15:40:26.539452    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-6r2kn" podUID="6f97b1f1-1fad-44ec-8253-17dd6a5eee54"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.612496    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.612554    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.612573    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.612599    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.612617    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:26Z","lastTransitionTime":"2026-03-20T15:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.716390    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.716438    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.716453    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.716476    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.716486    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:26Z","lastTransitionTime":"2026-03-20T15:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.820338    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.820440    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.820459    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.820810    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.821140    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:26Z","lastTransitionTime":"2026-03-20T15:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.923978    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.924042    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.924054    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.924074    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:26 crc kubenswrapper[4730]: I0320 15:40:26.924087    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:26Z","lastTransitionTime":"2026-03-20T15:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.026559    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.026626    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.026650    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.026680    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.026703    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:27Z","lastTransitionTime":"2026-03-20T15:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.129695    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.129729    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.129738    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.129752    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.129762    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:27Z","lastTransitionTime":"2026-03-20T15:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.233594    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.233833    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.234080    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.234299    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.234418    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:27Z","lastTransitionTime":"2026-03-20T15:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.337086    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.337159    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.337184    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.337217    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.337243    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:27Z","lastTransitionTime":"2026-03-20T15:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.440233    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.440319    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.440331    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.440372    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.440384    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:27Z","lastTransitionTime":"2026-03-20T15:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:27 crc kubenswrapper[4730]: E0320 15:40:27.534935    4730 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 15:40:27 crc kubenswrapper[4730]:         container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash
Mar 20 15:40:27 crc kubenswrapper[4730]:         set -o allexport
Mar 20 15:40:27 crc kubenswrapper[4730]:         if [[ -f /etc/kubernetes/apiserver-url.env ]]; then
Mar 20 15:40:27 crc kubenswrapper[4730]:           source /etc/kubernetes/apiserver-url.env
Mar 20 15:40:27 crc kubenswrapper[4730]:         else
Mar 20 15:40:27 crc kubenswrapper[4730]:           echo "Error: /etc/kubernetes/apiserver-url.env is missing"
Mar 20 15:40:27 crc kubenswrapper[4730]:           exit 1
Mar 20 15:40:27 crc kubenswrapper[4730]:         fi
Mar 20 15:40:27 crc kubenswrapper[4730]:         exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104
Mar 20 15:40:27 crc kubenswrapper[4730]:         ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN
_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{A
PIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 15:40:27 crc kubenswrapper[4730]:  > logger="UnhandledError"
Mar 20 15:40:27 crc kubenswrapper[4730]: E0320 15:40:27.535074    4730 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 15:40:27 crc kubenswrapper[4730]:         container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash
Mar 20 15:40:27 crc kubenswrapper[4730]:         set -uo pipefail
Mar 20 15:40:27 crc kubenswrapper[4730]:         
Mar 20 15:40:27 crc kubenswrapper[4730]:         trap 'jobs -p | xargs kill || true; wait; exit 0' TERM
Mar 20 15:40:27 crc kubenswrapper[4730]:         
Mar 20 15:40:27 crc kubenswrapper[4730]:         OPENSHIFT_MARKER="openshift-generated-node-resolver"
Mar 20 15:40:27 crc kubenswrapper[4730]:         HOSTS_FILE="/etc/hosts"
Mar 20 15:40:27 crc kubenswrapper[4730]:         TEMP_FILE="/etc/hosts.tmp"
Mar 20 15:40:27 crc kubenswrapper[4730]:         
Mar 20 15:40:27 crc kubenswrapper[4730]:         IFS=', ' read -r -a services <<< "${SERVICES}"
Mar 20 15:40:27 crc kubenswrapper[4730]:         
Mar 20 15:40:27 crc kubenswrapper[4730]:         # Make a temporary file with the old hosts file's attributes.
Mar 20 15:40:27 crc kubenswrapper[4730]:         if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then
Mar 20 15:40:27 crc kubenswrapper[4730]:           echo "Failed to preserve hosts file. Exiting."
Mar 20 15:40:27 crc kubenswrapper[4730]:           exit 1
Mar 20 15:40:27 crc kubenswrapper[4730]:         fi
Mar 20 15:40:27 crc kubenswrapper[4730]:         
Mar 20 15:40:27 crc kubenswrapper[4730]:         while true; do
Mar 20 15:40:27 crc kubenswrapper[4730]:           declare -A svc_ips
Mar 20 15:40:27 crc kubenswrapper[4730]:           for svc in "${services[@]}"; do
Mar 20 15:40:27 crc kubenswrapper[4730]:             # Fetch service IP from cluster dns if present. We make several tries
Mar 20 15:40:27 crc kubenswrapper[4730]:             # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones
Mar 20 15:40:27 crc kubenswrapper[4730]:             # are for deployments with Kuryr on older OpenStack (OSP13) - those do not
Mar 20 15:40:27 crc kubenswrapper[4730]:             # support UDP loadbalancers and require reaching DNS through TCP.
Mar 20 15:40:27 crc kubenswrapper[4730]:             cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"'
Mar 20 15:40:27 crc kubenswrapper[4730]:                   'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"'
Mar 20 15:40:27 crc kubenswrapper[4730]:                   'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"'
Mar 20 15:40:27 crc kubenswrapper[4730]:                   'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"')
Mar 20 15:40:27 crc kubenswrapper[4730]:             for i in ${!cmds[*]}
Mar 20 15:40:27 crc kubenswrapper[4730]:             do
Mar 20 15:40:27 crc kubenswrapper[4730]:               ips=($(eval "${cmds[i]}"))
Mar 20 15:40:27 crc kubenswrapper[4730]:               if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then
Mar 20 15:40:27 crc kubenswrapper[4730]:                 svc_ips["${svc}"]="${ips[@]}"
Mar 20 15:40:27 crc kubenswrapper[4730]:                 break
Mar 20 15:40:27 crc kubenswrapper[4730]:               fi
Mar 20 15:40:27 crc kubenswrapper[4730]:             done
Mar 20 15:40:27 crc kubenswrapper[4730]:           done
Mar 20 15:40:27 crc kubenswrapper[4730]:         
Mar 20 15:40:27 crc kubenswrapper[4730]:           # Update /etc/hosts only if we get valid service IPs
Mar 20 15:40:27 crc kubenswrapper[4730]:           # We will not update /etc/hosts when there is coredns service outage or api unavailability
Mar 20 15:40:27 crc kubenswrapper[4730]:           # Stale entries could exist in /etc/hosts if the service is deleted
Mar 20 15:40:27 crc kubenswrapper[4730]:           if [[ -n "${svc_ips[*]-}" ]]; then
Mar 20 15:40:27 crc kubenswrapper[4730]:             # Build a new hosts file from /etc/hosts with our custom entries filtered out
Mar 20 15:40:27 crc kubenswrapper[4730]:             if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then
Mar 20 15:40:27 crc kubenswrapper[4730]:               # Only continue rebuilding the hosts entries if its original content is preserved
Mar 20 15:40:27 crc kubenswrapper[4730]:               sleep 60 & wait
Mar 20 15:40:27 crc kubenswrapper[4730]:               continue
Mar 20 15:40:27 crc kubenswrapper[4730]:             fi
Mar 20 15:40:27 crc kubenswrapper[4730]:         
Mar 20 15:40:27 crc kubenswrapper[4730]:             # Append resolver entries for services
Mar 20 15:40:27 crc kubenswrapper[4730]:             rc=0
Mar 20 15:40:27 crc kubenswrapper[4730]:             for svc in "${!svc_ips[@]}"; do
Mar 20 15:40:27 crc kubenswrapper[4730]:               for ip in ${svc_ips[${svc}]}; do
Mar 20 15:40:27 crc kubenswrapper[4730]:                 echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$?
Mar 20 15:40:27 crc kubenswrapper[4730]:               done
Mar 20 15:40:27 crc kubenswrapper[4730]:             done
Mar 20 15:40:27 crc kubenswrapper[4730]:             if [[ $rc -ne 0 ]]; then
Mar 20 15:40:27 crc kubenswrapper[4730]:               sleep 60 & wait
Mar 20 15:40:27 crc kubenswrapper[4730]:               continue
Mar 20 15:40:27 crc kubenswrapper[4730]:             fi
Mar 20 15:40:27 crc kubenswrapper[4730]:         
Mar 20 15:40:27 crc kubenswrapper[4730]:         
Mar 20 15:40:27 crc kubenswrapper[4730]:             # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior
Mar 20 15:40:27 crc kubenswrapper[4730]:             # Replace /etc/hosts with our modified version if needed
Mar 20 15:40:27 crc kubenswrapper[4730]:             cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}"
Mar 20 15:40:27 crc kubenswrapper[4730]:             # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn
Mar 20 15:40:27 crc kubenswrapper[4730]:           fi
Mar 20 15:40:27 crc kubenswrapper[4730]:           sleep 60 & wait
Mar 20 15:40:27 crc kubenswrapper[4730]:           unset svc_ips
Mar 20 15:40:27 crc kubenswrapper[4730]:         done
Mar 20 15:40:27 crc kubenswrapper[4730]:         ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-plthx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-69fnw_openshift-dns(102cb977-7291-453e-9282-20572071afee): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 15:40:27 crc kubenswrapper[4730]:  > logger="UnhandledError"
Mar 20 15:40:27 crc kubenswrapper[4730]: E0320 15:40:27.535459    4730 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 15:40:27 crc kubenswrapper[4730]:         container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM
Mar 20 15:40:27 crc kubenswrapper[4730]:         while [ true ];
Mar 20 15:40:27 crc kubenswrapper[4730]:         do
Mar 20 15:40:27 crc kubenswrapper[4730]:           for f in $(ls /tmp/serviceca); do
Mar 20 15:40:27 crc kubenswrapper[4730]:               echo $f
Mar 20 15:40:27 crc kubenswrapper[4730]:               ca_file_path="/tmp/serviceca/${f}"
Mar 20 15:40:27 crc kubenswrapper[4730]:               f=$(echo $f | sed  -r 's/(.*)\.\./\1:/')
Mar 20 15:40:27 crc kubenswrapper[4730]:               reg_dir_path="/etc/docker/certs.d/${f}"
Mar 20 15:40:27 crc kubenswrapper[4730]:               if [ -e "${reg_dir_path}" ]; then
Mar 20 15:40:27 crc kubenswrapper[4730]:                   cp -u $ca_file_path $reg_dir_path/ca.crt
Mar 20 15:40:27 crc kubenswrapper[4730]:               else
Mar 20 15:40:27 crc kubenswrapper[4730]:                   mkdir $reg_dir_path
Mar 20 15:40:27 crc kubenswrapper[4730]:                   cp $ca_file_path $reg_dir_path/ca.crt
Mar 20 15:40:27 crc kubenswrapper[4730]:               fi
Mar 20 15:40:27 crc kubenswrapper[4730]:           done
Mar 20 15:40:27 crc kubenswrapper[4730]:           for d in $(ls /etc/docker/certs.d); do
Mar 20 15:40:27 crc kubenswrapper[4730]:               echo $d
Mar 20 15:40:27 crc kubenswrapper[4730]:               dp=$(echo $d | sed  -r 's/(.*):/\1\.\./')
Mar 20 15:40:27 crc kubenswrapper[4730]:               reg_conf_path="/tmp/serviceca/${dp}"
Mar 20 15:40:27 crc kubenswrapper[4730]:               if [ ! -e "${reg_conf_path}" ]; then
Mar 20 15:40:27 crc kubenswrapper[4730]:                   rm -rf /etc/docker/certs.d/$d
Mar 20 15:40:27 crc kubenswrapper[4730]:               fi
Mar 20 15:40:27 crc kubenswrapper[4730]:           done
Mar 20 15:40:27 crc kubenswrapper[4730]:           sleep 60 & wait ${!}
Mar 20 15:40:27 crc kubenswrapper[4730]:         done
Mar 20 15:40:27 crc kubenswrapper[4730]:         ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2fvg6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-n4w74_openshift-image-registry(2ee8d55f-90bd-4484-8455-933de455efea): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 15:40:27 crc kubenswrapper[4730]:  > logger="UnhandledError"
Mar 20 15:40:27 crc kubenswrapper[4730]: E0320 15:40:27.536467    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-69fnw" podUID="102cb977-7291-453e-9282-20572071afee"
Mar 20 15:40:27 crc kubenswrapper[4730]: E0320 15:40:27.536523    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-n4w74" podUID="2ee8d55f-90bd-4484-8455-933de455efea"
Mar 20 15:40:27 crc kubenswrapper[4730]: E0320 15:40:27.536502    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.541844    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.541886    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.541898    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.541914    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.541927    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:27Z","lastTransitionTime":"2026-03-20T15:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.644431    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.644468    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.644477    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.644492    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.644502    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:27Z","lastTransitionTime":"2026-03-20T15:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.747418    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.747467    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.747483    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.747505    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.747521    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:27Z","lastTransitionTime":"2026-03-20T15:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.850830    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.850874    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.850890    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.850910    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.850925    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:27Z","lastTransitionTime":"2026-03-20T15:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.953548    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.953622    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.953642    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.953665    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:27 crc kubenswrapper[4730]: I0320 15:40:27.953682    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:27Z","lastTransitionTime":"2026-03-20T15:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.056632    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.057222    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.057310    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.057374    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.057445    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:28Z","lastTransitionTime":"2026-03-20T15:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.154106    4730 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.159898    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.159952    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.159966    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.159984    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.159996    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:28Z","lastTransitionTime":"2026-03-20T15:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.261964    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.262309    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.262322    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.262335    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.262343    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:28Z","lastTransitionTime":"2026-03-20T15:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.364950    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.365000    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.365018    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.365044    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.365062    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:28Z","lastTransitionTime":"2026-03-20T15:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.368432    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.368555    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.368613    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.368659    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.368725    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:28 crc kubenswrapper[4730]: E0320 15:40:28.368842    4730 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 15:40:28 crc kubenswrapper[4730]: E0320 15:40:28.368916    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:40:44.368894895 +0000 UTC m=+103.582266304 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 15:40:28 crc kubenswrapper[4730]: E0320 15:40:28.369494    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:40:44.36946794 +0000 UTC m=+103.582839359 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:40:28 crc kubenswrapper[4730]: E0320 15:40:28.369639    4730 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 15:40:28 crc kubenswrapper[4730]: E0320 15:40:28.369724    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:40:44.369702207 +0000 UTC m=+103.583073616 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 15:40:28 crc kubenswrapper[4730]: E0320 15:40:28.369862    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 15:40:28 crc kubenswrapper[4730]: E0320 15:40:28.369909    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 15:40:28 crc kubenswrapper[4730]: E0320 15:40:28.369936    4730 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:40:28 crc kubenswrapper[4730]: E0320 15:40:28.369997    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 15:40:44.369980634 +0000 UTC m=+103.583352043 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:40:28 crc kubenswrapper[4730]: E0320 15:40:28.370120    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 15:40:28 crc kubenswrapper[4730]: E0320 15:40:28.370156    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 15:40:28 crc kubenswrapper[4730]: E0320 15:40:28.370177    4730 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:40:28 crc kubenswrapper[4730]: E0320 15:40:28.370235    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 15:40:44.370215851 +0000 UTC m=+103.583587270 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.467861    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.468336    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.468548    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.468723    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.468896    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:28Z","lastTransitionTime":"2026-03-20T15:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.532639    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.532876    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.532945    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:28 crc kubenswrapper[4730]: E0320 15:40:28.533123    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.533263    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:28 crc kubenswrapper[4730]: E0320 15:40:28.534597    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lzk8j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError"
Mar 20 15:40:28 crc kubenswrapper[4730]: E0320 15:40:28.534705    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:40:28 crc kubenswrapper[4730]: E0320 15:40:28.534734    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:40:28 crc kubenswrapper[4730]: E0320 15:40:28.535054    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.535086    4730 scope.go:117] "RemoveContainer" containerID="688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5"
Mar 20 15:40:28 crc kubenswrapper[4730]: E0320 15:40:28.535445    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 15:40:28 crc kubenswrapper[4730]: E0320 15:40:28.536743    4730 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 15:40:28 crc kubenswrapper[4730]:         init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig
Mar 20 15:40:28 crc kubenswrapper[4730]:         apiVersion: v1
Mar 20 15:40:28 crc kubenswrapper[4730]:         clusters:
Mar 20 15:40:28 crc kubenswrapper[4730]:           - cluster:
Mar 20 15:40:28 crc kubenswrapper[4730]:               certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt
Mar 20 15:40:28 crc kubenswrapper[4730]:               server: https://api-int.crc.testing:6443
Mar 20 15:40:28 crc kubenswrapper[4730]:             name: default-cluster
Mar 20 15:40:28 crc kubenswrapper[4730]:         contexts:
Mar 20 15:40:28 crc kubenswrapper[4730]:           - context:
Mar 20 15:40:28 crc kubenswrapper[4730]:               cluster: default-cluster
Mar 20 15:40:28 crc kubenswrapper[4730]:               namespace: default
Mar 20 15:40:28 crc kubenswrapper[4730]:               user: default-auth
Mar 20 15:40:28 crc kubenswrapper[4730]:             name: default-context
Mar 20 15:40:28 crc kubenswrapper[4730]:         current-context: default-context
Mar 20 15:40:28 crc kubenswrapper[4730]:         kind: Config
Mar 20 15:40:28 crc kubenswrapper[4730]:         preferences: {}
Mar 20 15:40:28 crc kubenswrapper[4730]:         users:
Mar 20 15:40:28 crc kubenswrapper[4730]:           - name: default-auth
Mar 20 15:40:28 crc kubenswrapper[4730]:             user:
Mar 20 15:40:28 crc kubenswrapper[4730]:               client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem
Mar 20 15:40:28 crc kubenswrapper[4730]:               client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem
Mar 20 15:40:28 crc kubenswrapper[4730]:         EOF
Mar 20 15:40:28 crc kubenswrapper[4730]:         ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mz64b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-qj97f_openshift-ovn-kubernetes(c4b4e0e8-af33-491e-b1d1-31079d90c656): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 15:40:28 crc kubenswrapper[4730]:  > logger="UnhandledError"
Mar 20 15:40:28 crc kubenswrapper[4730]: E0320 15:40:28.537647    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lzk8j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError"
Mar 20 15:40:28 crc kubenswrapper[4730]: E0320 15:40:28.538781    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 15:40:28 crc kubenswrapper[4730]: E0320 15:40:28.538792    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.570845    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs\") pod \"network-metrics-daemon-2prfn\" (UID: \"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\") " pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:28 crc kubenswrapper[4730]: E0320 15:40:28.571530    4730 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.571946    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.571977    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.571986    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:28 crc kubenswrapper[4730]: E0320 15:40:28.571952    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs podName:db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a nodeName:}" failed. No retries permitted until 2026-03-20 15:40:44.57187206 +0000 UTC m=+103.785243489 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs") pod "network-metrics-daemon-2prfn" (UID: "db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.572001    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.572012    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:28Z","lastTransitionTime":"2026-03-20T15:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.674114    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.674159    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.674176    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.674201    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.674220    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:28Z","lastTransitionTime":"2026-03-20T15:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.776911    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.776960    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.776977    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.776998    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.777015    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:28Z","lastTransitionTime":"2026-03-20T15:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.878414    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.878443    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.878452    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.878464    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.878472    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:28Z","lastTransitionTime":"2026-03-20T15:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.980235    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.980280    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.980288    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.980301    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:28 crc kubenswrapper[4730]: I0320 15:40:28.980309    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:28Z","lastTransitionTime":"2026-03-20T15:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.082831    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.082857    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.082866    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.082879    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.082887    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:29Z","lastTransitionTime":"2026-03-20T15:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.185670    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.185711    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.185722    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.185738    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.185750    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:29Z","lastTransitionTime":"2026-03-20T15:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.288207    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.288291    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.288311    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.288337    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.288355    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:29Z","lastTransitionTime":"2026-03-20T15:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.391533    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.391588    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.391598    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.391615    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.391627    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:29Z","lastTransitionTime":"2026-03-20T15:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.495529    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.495633    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.495663    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.495701    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.495726    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:29Z","lastTransitionTime":"2026-03-20T15:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.598583    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.598635    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.598648    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.598671    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.598686    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:29Z","lastTransitionTime":"2026-03-20T15:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.701432    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.701489    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.701501    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.701520    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.701532    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:29Z","lastTransitionTime":"2026-03-20T15:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.804591    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.804731    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.804749    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.804773    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.804791    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:29Z","lastTransitionTime":"2026-03-20T15:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.908576    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.908865    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.908932    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.909010    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:29 crc kubenswrapper[4730]: I0320 15:40:29.909076    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:29Z","lastTransitionTime":"2026-03-20T15:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.012466    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.012496    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.012526    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.012541    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.012553    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:30Z","lastTransitionTime":"2026-03-20T15:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.115658    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.115694    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.115705    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.116414    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.116626    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:30Z","lastTransitionTime":"2026-03-20T15:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.218818    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.219079    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.219295    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.219522    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.219732    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:30Z","lastTransitionTime":"2026-03-20T15:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.322197    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.322518    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.322669    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.322774    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.322846    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:30Z","lastTransitionTime":"2026-03-20T15:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.426224    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.426291    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.426305    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.426326    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.426337    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:30Z","lastTransitionTime":"2026-03-20T15:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.528978    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.529022    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.529031    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.529047    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.529057    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:30Z","lastTransitionTime":"2026-03-20T15:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.532676    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:40:30 crc kubenswrapper[4730]: E0320 15:40:30.532773    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.532863    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.532912    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:30 crc kubenswrapper[4730]: E0320 15:40:30.532971    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:40:30 crc kubenswrapper[4730]: E0320 15:40:30.533763    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.534012    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:40:30 crc kubenswrapper[4730]: E0320 15:40:30.534447    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.632765    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.632838    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.632856    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.632875    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.632903    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:30Z","lastTransitionTime":"2026-03-20T15:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.736289    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.736346    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.736363    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.736384    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.736401    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:30Z","lastTransitionTime":"2026-03-20T15:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.839814    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.839866    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.839881    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.839905    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.839924    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:30Z","lastTransitionTime":"2026-03-20T15:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.942626    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.942690    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.942710    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.942761    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:30 crc kubenswrapper[4730]: I0320 15:40:30.942780    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:30Z","lastTransitionTime":"2026-03-20T15:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.046280    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.046626    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.046763    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.046910    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.047045    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:31Z","lastTransitionTime":"2026-03-20T15:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.149647    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.149706    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.149716    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.149728    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.149736    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:31Z","lastTransitionTime":"2026-03-20T15:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.252690    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.252738    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.252747    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.252761    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.252771    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:31Z","lastTransitionTime":"2026-03-20T15:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.354970    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.355003    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.355013    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.355027    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.355053    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:31Z","lastTransitionTime":"2026-03-20T15:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.457310    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.457354    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.457366    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.457383    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.457393    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:31Z","lastTransitionTime":"2026-03-20T15:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.546833    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.557417    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.560455    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.560705    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.560859    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.561004    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.561130    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:31Z","lastTransitionTime":"2026-03-20T15:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.568436    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.583095    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.592350    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.606195    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.617180    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.632903    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.645500    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.655148    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.662694    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.662743    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.662755    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.662774    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.662787    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:31Z","lastTransitionTime":"2026-03-20T15:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.666206    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.676773    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.684165    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.694091    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.704376    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.722130    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.765418    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.765462    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.765474    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.765490    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.765502    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:31Z","lastTransitionTime":"2026-03-20T15:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.868330    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.868390    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.868410    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.868436    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.868453    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:31Z","lastTransitionTime":"2026-03-20T15:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.970674    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.970712    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.970723    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.970741    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:31 crc kubenswrapper[4730]: I0320 15:40:31.970753    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:31Z","lastTransitionTime":"2026-03-20T15:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.072699    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.072758    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.072772    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.072789    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.072800    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:32Z","lastTransitionTime":"2026-03-20T15:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.175183    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.175237    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.175281    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.175301    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.175314    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:32Z","lastTransitionTime":"2026-03-20T15:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.277634    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.277684    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.277697    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.277717    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.277729    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:32Z","lastTransitionTime":"2026-03-20T15:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.380297    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.380344    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.380356    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.380373    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.380385    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:32Z","lastTransitionTime":"2026-03-20T15:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.483146    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.483186    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.483195    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.483209    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.483218    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:32Z","lastTransitionTime":"2026-03-20T15:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.532969    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.533079    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.533110    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.533158    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:32 crc kubenswrapper[4730]: E0320 15:40:32.533232    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:40:32 crc kubenswrapper[4730]: E0320 15:40:32.533451    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:40:32 crc kubenswrapper[4730]: E0320 15:40:32.533811    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:40:32 crc kubenswrapper[4730]: E0320 15:40:32.533719    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.586103    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.586187    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.586215    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.586286    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.586306    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:32Z","lastTransitionTime":"2026-03-20T15:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.688417    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.688487    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.688509    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.688573    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.688599    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:32Z","lastTransitionTime":"2026-03-20T15:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.790869    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.790971    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.791012    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.791040    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.791059    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:32Z","lastTransitionTime":"2026-03-20T15:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.894874    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.894934    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.894951    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.894976    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.894994    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:32Z","lastTransitionTime":"2026-03-20T15:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.997898    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.997949    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.997967    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.997992    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:32 crc kubenswrapper[4730]: I0320 15:40:32.998022    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:32Z","lastTransitionTime":"2026-03-20T15:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.101119    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.101179    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.101198    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.101222    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.101239    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:33Z","lastTransitionTime":"2026-03-20T15:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.134702    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.134767    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.134790    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.134818    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.134840    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:33Z","lastTransitionTime":"2026-03-20T15:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:33 crc kubenswrapper[4730]: E0320 15:40:33.151187    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.155967    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.156050    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.156077    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.156108    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.156131    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:33Z","lastTransitionTime":"2026-03-20T15:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:33 crc kubenswrapper[4730]: E0320 15:40:33.172043    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.176617    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.176671    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.176691    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.176714    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.176731    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:33Z","lastTransitionTime":"2026-03-20T15:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:33 crc kubenswrapper[4730]: E0320 15:40:33.194231    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.199316    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.199358    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.199372    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.199388    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.199400    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:33Z","lastTransitionTime":"2026-03-20T15:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:33 crc kubenswrapper[4730]: E0320 15:40:33.213387    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.218770    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.218825    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.218842    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.218867    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.218884    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:33Z","lastTransitionTime":"2026-03-20T15:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:33 crc kubenswrapper[4730]: E0320 15:40:33.234337    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:33 crc kubenswrapper[4730]: E0320 15:40:33.234527    4730 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.236471    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.236549    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.236562    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.236580    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.236591    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:33Z","lastTransitionTime":"2026-03-20T15:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.339415    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.339481    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.339499    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.339524    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.339540    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:33Z","lastTransitionTime":"2026-03-20T15:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.442476    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.442537    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.442554    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.442577    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.442595    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:33Z","lastTransitionTime":"2026-03-20T15:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.545036    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.545119    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.545139    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.545169    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.545192    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:33Z","lastTransitionTime":"2026-03-20T15:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.647887    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.647923    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.647934    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.647951    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.647963    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:33Z","lastTransitionTime":"2026-03-20T15:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.750495    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.750545    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.750555    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.750574    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.750585    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:33Z","lastTransitionTime":"2026-03-20T15:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.854623    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.854864    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.854932    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.855001    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.855070    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:33Z","lastTransitionTime":"2026-03-20T15:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.957795    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.958032    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.958098    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.958170    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:33 crc kubenswrapper[4730]: I0320 15:40:33.958230    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:33Z","lastTransitionTime":"2026-03-20T15:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.060591    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.060644    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.060663    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.060691    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.060713    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:34Z","lastTransitionTime":"2026-03-20T15:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.163784    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.163857    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.163871    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.163897    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.163913    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:34Z","lastTransitionTime":"2026-03-20T15:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.266876    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.266922    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.266942    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.266963    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.266978    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:34Z","lastTransitionTime":"2026-03-20T15:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.368907    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.368971    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.368990    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.369024    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.369041    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:34Z","lastTransitionTime":"2026-03-20T15:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.471929    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.472001    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.472012    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.472038    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.472053    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:34Z","lastTransitionTime":"2026-03-20T15:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.532526    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.532580    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.532534    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:40:34 crc kubenswrapper[4730]: E0320 15:40:34.532662    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.532526    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:34 crc kubenswrapper[4730]: E0320 15:40:34.532811    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:40:34 crc kubenswrapper[4730]: E0320 15:40:34.532906    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:40:34 crc kubenswrapper[4730]: E0320 15:40:34.532995    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.574316    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.574347    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.574357    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.574378    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.574392    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:34Z","lastTransitionTime":"2026-03-20T15:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.588583    4730 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.678883    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.679056    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.679075    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.679100    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.679118    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:34Z","lastTransitionTime":"2026-03-20T15:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.781324    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.781675    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.781864    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.782051    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.782204    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:34Z","lastTransitionTime":"2026-03-20T15:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.885348    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.885420    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.885442    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.885468    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.885485    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:34Z","lastTransitionTime":"2026-03-20T15:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.988981    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.989049    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.989087    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.989120    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:34 crc kubenswrapper[4730]: I0320 15:40:34.989141    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:34Z","lastTransitionTime":"2026-03-20T15:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.092446    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.092492    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.092503    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.092521    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.092532    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:35Z","lastTransitionTime":"2026-03-20T15:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.195195    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.195238    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.195266    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.195282    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.195293    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:35Z","lastTransitionTime":"2026-03-20T15:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.298198    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.298283    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.298305    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.298346    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.298382    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:35Z","lastTransitionTime":"2026-03-20T15:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.401689    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.401718    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.401726    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.401739    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.401748    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:35Z","lastTransitionTime":"2026-03-20T15:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.504837    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.504918    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.504935    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.504960    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.504978    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:35Z","lastTransitionTime":"2026-03-20T15:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.607820    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.607856    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.607868    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.607885    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.607896    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:35Z","lastTransitionTime":"2026-03-20T15:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.678676    4730 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.711464    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.711515    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.711540    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.711566    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.711580    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:35Z","lastTransitionTime":"2026-03-20T15:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.814542    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.814642    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.814667    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.814756    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.814834    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:35Z","lastTransitionTime":"2026-03-20T15:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.917310    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.917342    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.917355    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.917372    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:35 crc kubenswrapper[4730]: I0320 15:40:35.917384    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:35Z","lastTransitionTime":"2026-03-20T15:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.019338    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.019393    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.019410    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.019436    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.019453    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:36Z","lastTransitionTime":"2026-03-20T15:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.121958    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.122011    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.122032    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.122066    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.122100    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:36Z","lastTransitionTime":"2026-03-20T15:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.224402    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.224454    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.224468    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.224487    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.224500    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:36Z","lastTransitionTime":"2026-03-20T15:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.327385    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.327435    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.327448    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.327467    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.327479    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:36Z","lastTransitionTime":"2026-03-20T15:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.431085    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.431168    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.431190    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.431225    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.431276    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:36Z","lastTransitionTime":"2026-03-20T15:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.532172    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.532264    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.532880    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:36 crc kubenswrapper[4730]: E0320 15:40:36.532959    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.533013    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:40:36 crc kubenswrapper[4730]: E0320 15:40:36.533087    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:40:36 crc kubenswrapper[4730]: E0320 15:40:36.533153    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:40:36 crc kubenswrapper[4730]: E0320 15:40:36.533221    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.535452    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.535477    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.535487    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.535523    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.535534    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:36Z","lastTransitionTime":"2026-03-20T15:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.638466    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.638501    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.638510    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.638524    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.638534    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:36Z","lastTransitionTime":"2026-03-20T15:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.741204    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.741831    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.741847    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.741867    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.741880    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:36Z","lastTransitionTime":"2026-03-20T15:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.844227    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.844299    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.844311    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.844326    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.844350    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:36Z","lastTransitionTime":"2026-03-20T15:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.907106    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28"}
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.907164    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8"}
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.909150    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" event={"ID":"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9","Type":"ContainerStarted","Data":"458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85"}
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.916697    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.927320    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.935052    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.941327    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.946324    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.946350    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.946359    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.946371    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.946381    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:36Z","lastTransitionTime":"2026-03-20T15:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.948103    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.964777    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.974469    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.982175    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.990161    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:36 crc kubenswrapper[4730]: I0320 15:40:36.998146    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.007331    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.015463    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.022744    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.039261    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.048987    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.049017    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.049026    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.049039    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.049049    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:37Z","lastTransitionTime":"2026-03-20T15:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.049275    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68
3393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.055305    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.061840    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.072481    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.082455    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68
3393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.089211    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.096170    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.103203    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.108521    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.118521    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.124121    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.132372    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.139808    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.149760    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.151425    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.151450    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.151459    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.151472    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.151482    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:37Z","lastTransitionTime":"2026-03-20T15:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.158001    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.169113    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.178906    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.198537    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:37Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.253501    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.253535    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.253547    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.253563    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.253575    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:37Z","lastTransitionTime":"2026-03-20T15:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.355746    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.356317    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.356340    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.356356    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.356369    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:37Z","lastTransitionTime":"2026-03-20T15:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.458802    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.458829    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.458838    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.458850    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.458858    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:37Z","lastTransitionTime":"2026-03-20T15:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.560537    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.560578    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.560589    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.560605    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.560625    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:37Z","lastTransitionTime":"2026-03-20T15:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.663463    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.663506    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.663518    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.663537    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.663549    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:37Z","lastTransitionTime":"2026-03-20T15:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.766169    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.766201    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.766213    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.766227    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.766237    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:37Z","lastTransitionTime":"2026-03-20T15:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.870457    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.870508    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.870522    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.870564    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.870582    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:37Z","lastTransitionTime":"2026-03-20T15:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.913224    4730 generic.go:334] "Generic (PLEG): container finished" podID="dbb015c0-3a11-48bf-a59f-22bc03ca2fb9" containerID="458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85" exitCode=0
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.913330    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" event={"ID":"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9","Type":"ContainerDied","Data":"458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85"}
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.930380    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68
3393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:37Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.953578    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:37Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.968056    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:37Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.973576    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.973637    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.973655    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.973682    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.973753    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:37Z","lastTransitionTime":"2026-03-20T15:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.980137    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:37Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:37 crc kubenswrapper[4730]: I0320 15:40:37.991331    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:37Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.004212    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:38Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.013488    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:38Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.027234    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:38Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.042591    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:38Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.056585    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:38Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.066313    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:38Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.076630    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:38Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.078231    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.078280    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.078291    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.078305    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.078314    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:38Z","lastTransitionTime":"2026-03-20T15:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.086633    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:38Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.102097    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:38Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.111437    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:38Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.119560    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:38Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.180804    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.181082    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.181158    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.181230    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.181320    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:38Z","lastTransitionTime":"2026-03-20T15:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.284334    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.284370    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.284382    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.284398    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.284410    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:38Z","lastTransitionTime":"2026-03-20T15:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.387454    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.387507    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.387532    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.387555    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.387571    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:38Z","lastTransitionTime":"2026-03-20T15:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.491085    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.491151    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.491168    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.491197    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.491215    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:38Z","lastTransitionTime":"2026-03-20T15:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.533089    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.533129    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.533161    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.533181    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:40:38 crc kubenswrapper[4730]: E0320 15:40:38.533387    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:40:38 crc kubenswrapper[4730]: E0320 15:40:38.533913    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:40:38 crc kubenswrapper[4730]: E0320 15:40:38.533993    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:40:38 crc kubenswrapper[4730]: E0320 15:40:38.534046    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.593436    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.593486    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.593524    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.593546    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.593562    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:38Z","lastTransitionTime":"2026-03-20T15:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.696707    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.696751    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.696764    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.696783    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.696796    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:38Z","lastTransitionTime":"2026-03-20T15:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.798958    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.799201    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.799313    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.799407    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.799490    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:38Z","lastTransitionTime":"2026-03-20T15:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.901792    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.902049    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.902143    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.902228    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.902339    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:38Z","lastTransitionTime":"2026-03-20T15:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.919624    4730 generic.go:334] "Generic (PLEG): container finished" podID="dbb015c0-3a11-48bf-a59f-22bc03ca2fb9" containerID="d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0" exitCode=0
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.919673    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" event={"ID":"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9","Type":"ContainerDied","Data":"d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0"}
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.960705    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:38Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.976150    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:38Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:38 crc kubenswrapper[4730]: I0320 15:40:38.993093    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:38Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.006115    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.006381    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.006600    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.006845    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.007093    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:39Z","lastTransitionTime":"2026-03-20T15:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.007830    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:39Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.031276    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:39Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.058567    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:39Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.080048    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:39Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.099238    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:39Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.111204    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.111425    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.111769    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.112006    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.112180    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:39Z","lastTransitionTime":"2026-03-20T15:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.113058    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:39Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.126079    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68
3393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:39Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.136138    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:39Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.145971    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:39Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.159052    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:39Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.167641    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:39Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.177145    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:39Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.188052    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:39Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.217929    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.217975    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.217985    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.218019    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.218031    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:39Z","lastTransitionTime":"2026-03-20T15:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.321096    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.321134    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.321144    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.321158    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.321169    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:39Z","lastTransitionTime":"2026-03-20T15:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.423695    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.423734    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.423744    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.423759    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.423770    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:39Z","lastTransitionTime":"2026-03-20T15:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.526400    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.526455    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.526465    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.526477    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.526487    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:39Z","lastTransitionTime":"2026-03-20T15:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.630341    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.630384    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.630396    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.630410    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.630423    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:39Z","lastTransitionTime":"2026-03-20T15:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.732718    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.732774    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.732791    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.732812    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.732827    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:39Z","lastTransitionTime":"2026-03-20T15:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.834783    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.834817    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.834827    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.834841    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.834852    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:39Z","lastTransitionTime":"2026-03-20T15:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.926121    4730 generic.go:334] "Generic (PLEG): container finished" podID="dbb015c0-3a11-48bf-a59f-22bc03ca2fb9" containerID="bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34" exitCode=0
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.926197    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" event={"ID":"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9","Type":"ContainerDied","Data":"bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34"}
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.931536    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" event={"ID":"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0","Type":"ContainerStarted","Data":"c1a8117d479604ca09fc23e043f373aa3acbffb209d9fbfae95f2eb907424712"}
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.931598    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" event={"ID":"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0","Type":"ContainerStarted","Data":"1242122dff682493181de927aefeb2ac4e5274621d865ee89ccec2e60e86469b"}
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.938934    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.938992    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.939015    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.939043    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.939071    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:39Z","lastTransitionTime":"2026-03-20T15:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.950721    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:39Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.967309    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:39Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:39 crc kubenswrapper[4730]: I0320 15:40:39.996354    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:39Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.010828    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:40Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.022235    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:40Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.034656    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:40Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.041305    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.041345    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.041358    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.041375    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.041389    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:40Z","lastTransitionTime":"2026-03-20T15:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.050644    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:40Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.063174    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68
3393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:40Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.073778    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:40Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.084901    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:40Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.094603    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:40Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.105798    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:40Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.118013    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:40Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.129220    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:40Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.139419    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:40Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.144942    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.144979    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.144990    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.145006    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.145017    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:40Z","lastTransitionTime":"2026-03-20T15:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.150994    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:40Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.161340    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:40Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.178279    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:40Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.190267    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:40Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.200702    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:40Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.211688    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:40Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.222853    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1242122dff682493181de927aefeb2ac4e5274621d865ee89ccec2e60e86469b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8117d479604ca09fc23e043f373aa3acbffb209d9fbfae95f2eb907424712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:40Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.234181    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:40Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.244135    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:40Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.247023    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.247047    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.247057    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.247071    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.247079    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:40Z","lastTransitionTime":"2026-03-20T15:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.255541    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:40Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.268812    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:40Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.281240    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68
3393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:40Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.293404    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:40Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.304773    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:40Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.315518    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:40Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.325870    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:40Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.335000    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:40Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.349802    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.349837    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.349850    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.349866    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.349877    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:40Z","lastTransitionTime":"2026-03-20T15:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.452905    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.452951    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.452963    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.452980    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.452992    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:40Z","lastTransitionTime":"2026-03-20T15:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.532570    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.532760    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:40:40 crc kubenswrapper[4730]: E0320 15:40:40.532757    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:40:40 crc kubenswrapper[4730]: E0320 15:40:40.532954    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.533055    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.533344    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:40 crc kubenswrapper[4730]: E0320 15:40:40.533483    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:40:40 crc kubenswrapper[4730]: E0320 15:40:40.533495    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.549274    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"]
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.555535    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.555580    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.555594    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.555616    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.555628    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:40Z","lastTransitionTime":"2026-03-20T15:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.659312    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.659355    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.659366    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.659380    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.659389    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:40Z","lastTransitionTime":"2026-03-20T15:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.761314    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.761358    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.761369    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.761385    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.761398    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:40Z","lastTransitionTime":"2026-03-20T15:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.863914    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.863963    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.863976    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.863993    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.864004    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:40Z","lastTransitionTime":"2026-03-20T15:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.935065    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6r2kn" event={"ID":"6f97b1f1-1fad-44ec-8253-17dd6a5eee54","Type":"ContainerStarted","Data":"f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6"}
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.936159    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-n4w74" event={"ID":"2ee8d55f-90bd-4484-8455-933de455efea","Type":"ContainerStarted","Data":"c2070438cbc1bdd8e3c865f5aa62537649598e0f51effc9936b9cb1a630b3651"}
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.938749    4730 generic.go:334] "Generic (PLEG): container finished" podID="dbb015c0-3a11-48bf-a59f-22bc03ca2fb9" containerID="535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134" exitCode=0
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.938804    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" event={"ID":"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9","Type":"ContainerDied","Data":"535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134"}
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.955485    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:40Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.966329    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.966357    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.966366    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.966378    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.966387    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:40Z","lastTransitionTime":"2026-03-20T15:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.970006    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:40Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:40 crc kubenswrapper[4730]: I0320 15:40:40.988118    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:40Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.007960    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.018181    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.036036    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708c4e4c-a9ad-466a-a3d0-a99ae3399934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://631f7e03c5476d53c6d6a3a3bf6a1d45ac40443386f072fc7c01129e9eaf7619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d029581dfd248d1298c93e18a5352686b5e89fb730ede58a29400c62deb2bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a81d62ce9d72579cc4dd4778d22cf442919e08c7d83209fde12edc8a338684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b365c72e5e05c29295ba289a5bf795353c5221607dadd2ab8b4f642272d99fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfacdb41d11bf0b03bede308c267faddfd0241d99a4cbda2603c6539bc3a433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.051115    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68
3393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.066509    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1242122dff682493181de927aefeb2ac4e5274621d865ee89ccec2e60e86469b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8117d479604ca09fc23e043f373aa3acbffb209d9fbfae95f2eb907424712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.069405    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.069443    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.069455    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.069472    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.069484    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:41Z","lastTransitionTime":"2026-03-20T15:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.080086    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.090605    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.101365    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.125101    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.135983    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.145661    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.158335    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.168562    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\
\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.172000    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.172046    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.172057    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.172075    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.172087    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:41Z","lastTransitionTime":"2026-03-20T15:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.178813    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.191660    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.202747    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.218947    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.232781    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.244071    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2070438cbc1bdd8e3c865f5aa62537649598e0f51effc9936b9cb1a630b3651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.253718    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.264287    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.273805    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.273840    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.273848    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.273861    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.273870    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:41Z","lastTransitionTime":"2026-03-20T15:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.278122    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.294561    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708c4e4c-a9ad-466a-a3d0-a99ae3399934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://631f7e03c5476d53c6d6a3a3bf6a1d45ac40443386f072fc7c01129e9eaf7619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"nam
e\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d029581dfd248d1298c93e18a5352686b5e89fb730ede58a29400c62deb2bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a81d62ce9d72579cc4dd4778d22cf442919e08c7d83209fde12edc8a338684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b365c72e5e05c29295ba289a5bf795353c5221607dadd2ab8b4f642272d99fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019
bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfacdb41d11bf0b03bede308c267faddfd0241d99a4cbda2603c6539bc3a433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.308089    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68
3393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.317629    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1242122dff682493181de927aefeb2ac4e5274621d865ee89ccec2e60e86469b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8117d479604ca09fc23e043f373aa3acbffb209d9fbfae95f2eb907424712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.328466    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.340449    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\
\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.350718    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.359983    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.370492    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.375753    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.375798    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.375806    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.375820    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.375830    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:41Z","lastTransitionTime":"2026-03-20T15:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.381882    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.479373    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.479408    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.479419    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.479436    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.479447    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:41Z","lastTransitionTime":"2026-03-20T15:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.533219    4730 scope.go:117] "RemoveContainer" containerID="688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.548085    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.567635    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.582232    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.584523    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.584717    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.584746    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.584820    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.584848    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:41Z","lastTransitionTime":"2026-03-20T15:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
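The "Node became not ready" condition that setters.go logs above is a plain JSON object, so the reason for the NotReady transition can be pulled out mechanically when triaging a log like this. The sketch below is illustrative only (standard-library Python, not kubelet code); the `line` string is copied verbatim from the entry above.

```python
import json

# Condition payload exactly as logged by setters.go:603 above.
line = ('{"type":"Ready","status":"False",'
        '"lastHeartbeatTime":"2026-03-20T15:40:41Z",'
        '"lastTransitionTime":"2026-03-20T15:40:41Z",'
        '"reason":"KubeletNotReady",'
        '"message":"container runtime network not ready: NetworkReady=false '
        'reason:NetworkPluginNotReady message:Network plugin returns error: '
        'no CNI configuration file in /etc/kubernetes/cni/net.d/. '
        'Has your network provider started?"}')

cond = json.loads(line)
# The node is NotReady because the CNI config directory is empty.
print(cond["reason"])                                   # KubeletNotReady
print("no CNI configuration file" in cond["message"])   # True
```

This confirms the Ready=False condition is driven by the missing CNI configuration, independent of the webhook certificate errors interleaved in the same log.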
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.599892    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.612660    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.631994    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.648312    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
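Every status patch in this stretch of the log fails for the same reason: the kubelet cannot call the pod.network-node-identity.openshift.io webhook because its serving certificate expired on 2025-08-24T17:21:41Z, while the node clock reads 2026-03-20T15:40:41Z. The x509 validity check that Go's TLS stack performs here can be reproduced from the two timestamps embedded in the error string; the sketch below is a standard-library Python illustration for log triage, not the kubelet's own code.

```python
import re
from datetime import datetime, timezone

# Error text copied verbatim from the webhook failures above.
err = ("tls: failed to verify certificate: x509: certificate has expired or is "
       "not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z")

# Pull out the two RFC 3339 timestamps: current time, then the cert's NotAfter.
current, not_after = (
    datetime.strptime(t, "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)
    for t in re.findall(r"\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}Z", err)
)

# The same comparison x509 verification makes: the handshake fails
# because the clock is past the certificate's NotAfter bound.
print(current > not_after)          # True
print((current - not_after).days)   # 207: how long past expiry the node is
```

The ~207-day gap explains why the errors repeat on every patch attempt: nothing in the retry loop can succeed until the certificate is rotated or the node clock is corrected.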
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.670324    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.685810    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.689665    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.689712    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.689732    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.689759    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.689779    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:41Z","lastTransitionTime":"2026-03-20T15:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.696385    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2070438cbc1bdd8e3c865f5aa62537649598e0f51effc9936b9cb1a630b3651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.718772    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708c4e4c-a9ad-466a-a3d0-a99ae3399934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://631f7e03c5476d53c6d6a3a3bf6a1d45ac40443386f072fc7c01129e9eaf7619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"nam
e\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d029581dfd248d1298c93e18a5352686b5e89fb730ede58a29400c62deb2bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a81d62ce9d72579cc4dd4778d22cf442919e08c7d83209fde12edc8a338684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b365c72e5e05c29295ba289a5bf795353c5221607dadd2ab8b4f642272d99fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019
bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfacdb41d11bf0b03bede308c267faddfd0241d99a4cbda2603c6539bc3a433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.731966    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e68
3393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.743338    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1242122dff682493181de927aefeb2ac4e5274621d865ee89ccec2e60e86469b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8117d479604ca09fc23e043f373aa3acbffb209d9fbfae95f2eb907424712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.754938    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.765902    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.776089    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.788756    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.796267    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.796310    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.796320    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.796334    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.796344    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:41Z","lastTransitionTime":"2026-03-20T15:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.898446    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.898483    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.898491    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.898503    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.898512    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:41Z","lastTransitionTime":"2026-03-20T15:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.943373    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.944837    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad"}
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.945285    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.946159    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-69fnw" event={"ID":"102cb977-7291-453e-9282-20572071afee","Type":"ContainerStarted","Data":"35fa36f52355ad1fd24bd332fc5b0a5ff35ac2be71dd16b53449d9f47e48e8ad"}
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.947902    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"055b0251e049abda93d63985b27520b79b2596443df06427dd990fc974b52a07"}
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.954073    4730 generic.go:334] "Generic (PLEG): container finished" podID="dbb015c0-3a11-48bf-a59f-22bc03ca2fb9" containerID="5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a" exitCode=0
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.954137    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" event={"ID":"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9","Type":"ContainerDied","Data":"5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a"}
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.956275    4730 generic.go:334] "Generic (PLEG): container finished" podID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerID="b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e" exitCode=0
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.956298    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" event={"ID":"c4b4e0e8-af33-491e-b1d1-31079d90c656","Type":"ContainerDied","Data":"b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e"}
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.961919    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1242122dff682493181de927aefeb2ac4e5274621d865ee89ccec2e60e86469b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8117d479604ca09fc23e043f373aa3acbffb209d9fbfae95f2eb907424712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.986065    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:41 crc kubenswrapper[4730]: I0320 15:40:41.998903    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.000319    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.000349    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.000357    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.000369    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.000377    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:42Z","lastTransitionTime":"2026-03-20T15:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.008616    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:42Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.022317    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:42Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.039836    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708c4e4c-a9ad-466a-a3d0-a99ae3399934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://631f7e03c5476d53c6d6a3a3bf6a1d45ac40443386f072fc7c01129e9eaf7619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"nam
e\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d029581dfd248d1298c93e18a5352686b5e89fb730ede58a29400c62deb2bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a81d62ce9d72579cc4dd4778d22cf442919e08c7d83209fde12edc8a338684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b365c72e5e05c29295ba289a5bf795353c5221607dadd2ab8b4f642272d99fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019
bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfacdb41d11bf0b03bede308c267faddfd0241d99a4cbda2603c6539bc3a433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:42Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.053099    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:42Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.066380    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:42Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.077370    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:42Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.088827    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\
\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:42Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.098295    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:42Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.108577    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.108623    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.108633    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.108646    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.108653    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:42Z","lastTransitionTime":"2026-03-20T15:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.139015    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:42Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.149548    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:42Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.178371    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:42Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.189551    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:42Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.200806    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:42Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.209332    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2070438cbc1bdd8e3c865f5aa62537649598e0f51effc9936b9cb1a630b3651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:42Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.213329    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.213397    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.213416    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.213441    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.213457    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:42Z","lastTransitionTime":"2026-03-20T15:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.220885    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055b0251e049abda93d63985b27520b79b2596443df06427dd990fc974b52a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:42Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.232440    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:42Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.243504    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\
\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:42Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.252758    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:42Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.272372    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:42Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.289661    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:42Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.307008    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:42Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.315033    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.315066    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.315075    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.315089    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.315098    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:42Z","lastTransitionTime":"2026-03-20T15:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.322343    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:42Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.332988    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:42Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.354516    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2070438cbc1bdd8e3c865f5aa62537649598e0f51effc9936b9cb1a630b3651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:42Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.398136    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1242122dff682493181de927aefeb2ac4e5274621d865ee89ccec2e60e86469b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8117d479604ca09fc23e043f373aa3acbffb209d9fbfae95f2eb907424712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:42Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.417129    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.417189    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.417199    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.417218    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.417229    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:42Z","lastTransitionTime":"2026-03-20T15:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.437993    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:42Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.475053    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:42Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.515186    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35fa36f52355ad1fd24bd332fc5b0a5ff35ac2be71dd16b53449d9f47e48e8ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:42Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.519826    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.519847    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.519854    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.519867    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.519876    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:42Z","lastTransitionTime":"2026-03-20T15:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.532420    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.532424    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.532433    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.532509    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:42 crc kubenswrapper[4730]: E0320 15:40:42.533481    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:40:42 crc kubenswrapper[4730]: E0320 15:40:42.534082    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:40:42 crc kubenswrapper[4730]: E0320 15:40:42.534157    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:40:42 crc kubenswrapper[4730]: E0320 15:40:42.534213    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.557737    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:42Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.602410    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708c4e4c-a9ad-466a-a3d0-a99ae3399934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://631f7e03c5476d53c6d6a3a3bf6a1d45ac40443386f072fc7c01129e9eaf7619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"nam
e\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d029581dfd248d1298c93e18a5352686b5e89fb730ede58a29400c62deb2bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a81d62ce9d72579cc4dd4778d22cf442919e08c7d83209fde12edc8a338684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b365c72e5e05c29295ba289a5bf795353c5221607dadd2ab8b4f642272d99fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019
bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfacdb41d11bf0b03bede308c267faddfd0241d99a4cbda2603c6539bc3a433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:42Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.622740    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.622769    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.622780    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.622795    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.622804    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:42Z","lastTransitionTime":"2026-03-20T15:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.637888    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:42Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.724379    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.724414    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.724424    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.724439    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.724449    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:42Z","lastTransitionTime":"2026-03-20T15:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.826370    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.826408    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.826421    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.826438    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.826450    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:42Z","lastTransitionTime":"2026-03-20T15:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.929151    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.929200    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.929211    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.929228    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.929239    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:42Z","lastTransitionTime":"2026-03-20T15:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.960470    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"5f8a35dfa17acf7f7051d13b20cab55fe91645c8fa2773fed67baddae164b586"}
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.966482    4730 generic.go:334] "Generic (PLEG): container finished" podID="dbb015c0-3a11-48bf-a59f-22bc03ca2fb9" containerID="fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd" exitCode=0
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.966569    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" event={"ID":"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9","Type":"ContainerDied","Data":"fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd"}
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.974309    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" event={"ID":"c4b4e0e8-af33-491e-b1d1-31079d90c656","Type":"ContainerStarted","Data":"d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01"}
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.974483    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" event={"ID":"c4b4e0e8-af33-491e-b1d1-31079d90c656","Type":"ContainerStarted","Data":"462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c"}
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.974565    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" event={"ID":"c4b4e0e8-af33-491e-b1d1-31079d90c656","Type":"ContainerStarted","Data":"006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d"}
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.974652    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" event={"ID":"c4b4e0e8-af33-491e-b1d1-31079d90c656","Type":"ContainerStarted","Data":"e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91"}
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.974725    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" event={"ID":"c4b4e0e8-af33-491e-b1d1-31079d90c656","Type":"ContainerStarted","Data":"b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db"}
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.974809    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" event={"ID":"c4b4e0e8-af33-491e-b1d1-31079d90c656","Type":"ContainerStarted","Data":"31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c"}
Mar 20 15:40:42 crc kubenswrapper[4730]: I0320 15:40:42.995882    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:42Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.009167    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8a35dfa17acf7f7051d13b20cab55fe91645c8fa2773fed67baddae164b586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.022757    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.032574    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.032615    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.032625    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.032641    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.032655    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:43Z","lastTransitionTime":"2026-03-20T15:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.042709    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.053397    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2070438cbc1bdd8e3c865f5aa62537649598e0f51effc9936b9cb1a630b3651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.064486    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.077355    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.090894    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35fa36f52355ad1fd24bd332fc5b0a5ff35ac2be71dd16b53449d9f47e48e8ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.108451    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.128045    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708c4e4c-a9ad-466a-a3d0-a99ae3399934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://631f7e03c5476d53c6d6a3a3bf6a1d45ac40443386f072fc7c01129e9eaf7619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"nam
e\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d029581dfd248d1298c93e18a5352686b5e89fb730ede58a29400c62deb2bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a81d62ce9d72579cc4dd4778d22cf442919e08c7d83209fde12edc8a338684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b365c72e5e05c29295ba289a5bf795353c5221607dadd2ab8b4f642272d99fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019
bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfacdb41d11bf0b03bede308c267faddfd0241d99a4cbda2603c6539bc3a433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.135368    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.135479    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.135537    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.135602    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.135683    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:43Z","lastTransitionTime":"2026-03-20T15:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.139986    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.148661    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1242122dff682493181de927aefeb2ac4e5274621d865ee89ccec2e60e86469b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8117d479604ca09fc23e043f373aa3acbffb209d9fbfae95f2eb907424712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.160636    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.198888    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\
\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.234621    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.238257    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.238276    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.238284    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.238296    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.238304    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:43Z","lastTransitionTime":"2026-03-20T15:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.275215    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.295718    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.295754    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.295767    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.295783    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.295795    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:43Z","lastTransitionTime":"2026-03-20T15:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:43 crc kubenswrapper[4730]: E0320 15:40:43.310710    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.314670    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.314718    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.314729    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.314748    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.314762    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:43Z","lastTransitionTime":"2026-03-20T15:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.317137    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055b0251e049abda93d63985b27520b79b2596443df06427dd990fc974b52a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: E0320 15:40:43.333666    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.339643    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.339702    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.339718    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.339744    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.339758    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:43Z","lastTransitionTime":"2026-03-20T15:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:43 crc kubenswrapper[4730]: E0320 15:40:43.363612    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.368895    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.368966    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.368983    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.369002    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.369015    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:43Z","lastTransitionTime":"2026-03-20T15:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.378920    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: E0320 15:40:43.384868    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.389838    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.389873    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.389885    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.389902    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.389914    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:43Z","lastTransitionTime":"2026-03-20T15:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.402374    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35fa36f52355ad1fd24bd332fc5b0a5ff35ac2be71dd16b53449d9f47e48e8ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: E0320 15:40:43.404456    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: E0320 15:40:43.404560    4730 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.406176    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.406294    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.406370    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.406444    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.406501    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:43Z","lastTransitionTime":"2026-03-20T15:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.439447    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.482089    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708c4e4c-a9ad-466a-a3d0-a99ae3399934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://631f7e03c5476d53c6d6a3a3bf6a1d45ac40443386f072fc7c01129e9eaf7619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"nam
e\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d029581dfd248d1298c93e18a5352686b5e89fb730ede58a29400c62deb2bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a81d62ce9d72579cc4dd4778d22cf442919e08c7d83209fde12edc8a338684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b365c72e5e05c29295ba289a5bf795353c5221607dadd2ab8b4f642272d99fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019
bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfacdb41d11bf0b03bede308c267faddfd0241d99a4cbda2603c6539bc3a433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.509174    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.509400    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.509492    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.509613    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.509700    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:43Z","lastTransitionTime":"2026-03-20T15:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.518085    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.555642    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1242122dff682493181de927aefeb2ac4e5274621d865ee89ccec2e60e86469b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8117d479604ca09fc23e043f373aa3acbffb209d9fbfae95f2eb907424712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.598207    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.611390    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.611420    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.611430    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.611443    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.611453    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:43Z","lastTransitionTime":"2026-03-20T15:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.639282    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.675434    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.713071    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.713107    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.713117    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.713132    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.713143    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:43Z","lastTransitionTime":"2026-03-20T15:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.716352    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.756762    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055b0251e049abda93d63985b27520b79b2596443df06427dd990fc974b52a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.796302    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.815420    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.815462    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.815471    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.815485    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.815493    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:43Z","lastTransitionTime":"2026-03-20T15:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.837053    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8a35dfa17acf7f7051d13b20cab55fe91645c8fa2773fed67baddae164b586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.876890    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.918136    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.918204    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.918217    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.918236    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.918270    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:43Z","lastTransitionTime":"2026-03-20T15:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.929422    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.964299    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.978482    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerStarted","Data":"f8e20a286d6affcba7ffa950ef5386e7f439c9a02381cb8b7d3bc51ad9c4f343"}
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.978522    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerStarted","Data":"cd661cda796a2bd61d1446bee672c4471d60370245caf7cb54faf54dfa9c58a0"}
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.982440    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" event={"ID":"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9","Type":"ContainerStarted","Data":"9c34c742c6da6c6d35f815901234a0c12c9628d22fa83e511bddc78eae4373cc"}
Mar 20 15:40:43 crc kubenswrapper[4730]: I0320 15:40:43.997136    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2070438cbc1bdd8e3c865f5aa62537649598e0f51effc9936b9cb1a630b3651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:43Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.020542    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.020582    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.020598    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.020616    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.020627    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:44Z","lastTransitionTime":"2026-03-20T15:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.038509    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8a35dfa17acf7f7051d13b20cab55fe91645c8fa2773fed67baddae164b586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:44Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.078784    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:44Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.122854    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.122896    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.122908    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.122923    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.122934    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:44Z","lastTransitionTime":"2026-03-20T15:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.124911    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:44Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.162230    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2070438cbc1bdd8e3c865f5aa62537649598e0f51effc9936b9cb1a630b3651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:44Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.202477    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:44Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.225115    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.225148    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.225156    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.225169    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.225179    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:44Z","lastTransitionTime":"2026-03-20T15:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.242229    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:44Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.278845    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1242122dff682493181de927aefeb2ac4e5274621d865ee89ccec2e60e86469b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8117d479604ca09fc23e043f373aa3acbffb209d9fbfae95f2eb907424712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:44Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.322568    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:44Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.327527    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.327575    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.327587    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.327603    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.327908    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:44Z","lastTransitionTime":"2026-03-20T15:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.366409    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e20a286d6affcba7ffa950ef5386e7f439c9a02381cb8b7d3bc51ad9c4f343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd661cda796a2bd61d1446bee672c4471d60370245caf7cb54faf54dfa9c58a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:44Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.395167    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35fa36f52355ad1fd24bd332fc5b0a5ff35ac2be71dd16b53449d9f47e48e8ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:44Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.436624    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.436668    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.436676    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.436690    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.436699    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:44Z","lastTransitionTime":"2026-03-20T15:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.441164    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c34c742c6da6c6d35f815901234a0c12c9628d22fa83e511bddc78eae4373cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c
6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\
\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:44Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.462769    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.462848    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.462873    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.462891    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:40:44 crc kubenswrapper[4730]: E0320 15:40:44.462924    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:41:16.462904696 +0000 UTC m=+135.676276065 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.462953    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:40:44 crc kubenswrapper[4730]: E0320 15:40:44.462988    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 15:40:44 crc kubenswrapper[4730]: E0320 15:40:44.463004    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 15:40:44 crc kubenswrapper[4730]: E0320 15:40:44.463016    4730 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:40:44 crc kubenswrapper[4730]: E0320 15:40:44.463039    4730 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 15:40:44 crc kubenswrapper[4730]: E0320 15:40:44.463049    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 15:40:44 crc kubenswrapper[4730]: E0320 15:40:44.463137    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 15:40:44 crc kubenswrapper[4730]: E0320 15:40:44.463132    4730 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 15:40:44 crc kubenswrapper[4730]: E0320 15:40:44.463152    4730 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:40:44 crc kubenswrapper[4730]: E0320 15:40:44.463060    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 15:41:16.46304966 +0000 UTC m=+135.676421019 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:40:44 crc kubenswrapper[4730]: E0320 15:40:44.463290    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:41:16.463275616 +0000 UTC m=+135.676646985 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 15:40:44 crc kubenswrapper[4730]: E0320 15:40:44.463309    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:41:16.463298427 +0000 UTC m=+135.676669796 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 15:40:44 crc kubenswrapper[4730]: E0320 15:40:44.463325    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 15:41:16.463318348 +0000 UTC m=+135.676689717 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.491089    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708c4e4c-a9ad-466a-a3d0-a99ae3399934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://631f7e03c5476d53c6d6a3a3bf6a1d45ac40443386f072fc7c01129e9eaf7619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"nam
e\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d029581dfd248d1298c93e18a5352686b5e89fb730ede58a29400c62deb2bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a81d62ce9d72579cc4dd4778d22cf442919e08c7d83209fde12edc8a338684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b365c72e5e05c29295ba289a5bf795353c5221607dadd2ab8b4f642272d99fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019
bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfacdb41d11bf0b03bede308c267faddfd0241d99a4cbda2603c6539bc3a433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:44Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.514131    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:44Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.532882    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:44 crc kubenswrapper[4730]: E0320 15:40:44.532994    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.533051    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:40:44 crc kubenswrapper[4730]: E0320 15:40:44.533186    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.533221    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:40:44 crc kubenswrapper[4730]: E0320 15:40:44.533288    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.533589    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:44 crc kubenswrapper[4730]: E0320 15:40:44.533780    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.538202    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.538230    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.538243    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.538278    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.538289    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:44Z","lastTransitionTime":"2026-03-20T15:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.555718    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055b0251e049abda93d63985b27520b79b2596443df06427dd990fc974b52a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:44Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.595795    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:44Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.636410    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:44Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.641325    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.641366    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.641374    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.641389    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.641398    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:44Z","lastTransitionTime":"2026-03-20T15:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.664919    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs\") pod \"network-metrics-daemon-2prfn\" (UID: \"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\") " pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:44 crc kubenswrapper[4730]: E0320 15:40:44.665082    4730 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 15:40:44 crc kubenswrapper[4730]: E0320 15:40:44.665168    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs podName:db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a nodeName:}" failed. No retries permitted until 2026-03-20 15:41:16.665148421 +0000 UTC m=+135.878519790 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs") pod "network-metrics-daemon-2prfn" (UID: "db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.674495    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:44Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.743154    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.743209    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.743221    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.743243    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.743300    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:44Z","lastTransitionTime":"2026-03-20T15:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.845143    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.845183    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.845194    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.845211    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.845226    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:44Z","lastTransitionTime":"2026-03-20T15:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.947782    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.947849    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.947867    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.947897    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:44 crc kubenswrapper[4730]: I0320 15:40:44.947925    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:44Z","lastTransitionTime":"2026-03-20T15:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.053756    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.054122    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.054139    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.054156    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.054173    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:45Z","lastTransitionTime":"2026-03-20T15:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.158362    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.158403    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.158416    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.158434    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.158446    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:45Z","lastTransitionTime":"2026-03-20T15:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.261458    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.261510    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.261522    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.261537    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.261546    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:45Z","lastTransitionTime":"2026-03-20T15:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.364781    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.364857    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.364875    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.364903    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.364923    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:45Z","lastTransitionTime":"2026-03-20T15:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.467613    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.467691    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.467708    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.467732    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.467749    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:45Z","lastTransitionTime":"2026-03-20T15:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.571087    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.571164    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.571183    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.571207    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.571225    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:45Z","lastTransitionTime":"2026-03-20T15:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.673889    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.673937    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.673954    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.673976    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.673995    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:45Z","lastTransitionTime":"2026-03-20T15:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.777367    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.777881    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.777915    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.777945    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.777966    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:45Z","lastTransitionTime":"2026-03-20T15:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.879400    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.879458    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.879471    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.879485    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.879495    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:45Z","lastTransitionTime":"2026-03-20T15:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.981604    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.981639    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.981648    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.981662    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.981672    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:45Z","lastTransitionTime":"2026-03-20T15:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:45 crc kubenswrapper[4730]: I0320 15:40:45.995631    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" event={"ID":"c4b4e0e8-af33-491e-b1d1-31079d90c656","Type":"ContainerStarted","Data":"43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f"}
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.084380    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.084423    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.084435    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.084451    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.084463    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:46Z","lastTransitionTime":"2026-03-20T15:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.187178    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.187239    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.187331    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.187359    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.187380    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:46Z","lastTransitionTime":"2026-03-20T15:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.289693    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.289734    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.289744    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.289760    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.289771    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:46Z","lastTransitionTime":"2026-03-20T15:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.393521    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.393620    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.393655    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.393685    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.393709    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:46Z","lastTransitionTime":"2026-03-20T15:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.497166    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.497219    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.497232    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.497270    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.497284    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:46Z","lastTransitionTime":"2026-03-20T15:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.533145    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.533183    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.533224    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.533302    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:40:46 crc kubenswrapper[4730]: E0320 15:40:46.533421    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:40:46 crc kubenswrapper[4730]: E0320 15:40:46.533656    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:40:46 crc kubenswrapper[4730]: E0320 15:40:46.533901    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:40:46 crc kubenswrapper[4730]: E0320 15:40:46.533987    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.600153    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.600197    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.600208    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.600223    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.600232    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:46Z","lastTransitionTime":"2026-03-20T15:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.702728    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.702787    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.702806    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.702829    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.702848    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:46Z","lastTransitionTime":"2026-03-20T15:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.805545    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.805586    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.805601    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.805620    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.805635    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:46Z","lastTransitionTime":"2026-03-20T15:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.908739    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.908791    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.908808    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.908831    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:46 crc kubenswrapper[4730]: I0320 15:40:46.908852    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:46Z","lastTransitionTime":"2026-03-20T15:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.011914    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.011984    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.012007    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.012035    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.012057    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:47Z","lastTransitionTime":"2026-03-20T15:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.114382    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.114434    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.114450    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.114471    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.114489    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:47Z","lastTransitionTime":"2026-03-20T15:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.217639    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.217908    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.217920    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.217934    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.217946    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:47Z","lastTransitionTime":"2026-03-20T15:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.319799    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.319839    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.319851    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.319867    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.319878    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:47Z","lastTransitionTime":"2026-03-20T15:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.422523    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.422558    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.422572    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.422589    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.422601    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:47Z","lastTransitionTime":"2026-03-20T15:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.524761    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.524808    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.524828    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.524850    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.524866    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:47Z","lastTransitionTime":"2026-03-20T15:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.627263    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.627332    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.627347    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.627367    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.627379    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:47Z","lastTransitionTime":"2026-03-20T15:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.730747    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.730819    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.730863    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.730901    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.730924    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:47Z","lastTransitionTime":"2026-03-20T15:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.834145    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.834197    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.834211    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.834229    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.834264    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:47Z","lastTransitionTime":"2026-03-20T15:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.936504    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.936546    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.936556    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.936572    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:47 crc kubenswrapper[4730]: I0320 15:40:47.936581    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:47Z","lastTransitionTime":"2026-03-20T15:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.014110    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" event={"ID":"c4b4e0e8-af33-491e-b1d1-31079d90c656","Type":"ContainerStarted","Data":"1c7fc7c3b6cf0ebabd03bf607d7da5f1221244499e74f0f13a94a8015113c518"}
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.014709    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.029517    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.040131    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.040181    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.040200    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.040223    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.040240    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:48Z","lastTransitionTime":"2026-03-20T15:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.043686    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2070438cbc1bdd8e3c865f5aa62537649598e0f51effc9936b9cb1a630b3651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.051585    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.058479    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1242122dff682493181de927aefeb2ac4e5274621d865ee89ccec2e60e86469b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8117d479604ca09fc23e043f373aa3acbffb209d9fbfae95f2eb907424712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.076492    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.091274    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e20a286d6affcba7ffa950ef5386e7f439c9a02381cb8b7d3bc51ad9c4f343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd661cda796a2bd61d1446bee672c4471d60370245caf7cb54faf54dfa9c58a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.103867    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35fa36f52355ad1fd24bd332fc5b0a5ff35ac2be71dd16b53449d9f47e48e8ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.119119    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c34c742c6da6c6d35f815901234a0c12c9628d22fa83e511bddc78eae4373cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.137664    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708c4e4c-a9ad-466a-a3d0-a99ae3399934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://631f7e03c5476d53c6d6a3a3bf6a1d45ac40443386f072fc7c01129e9eaf7619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d029581dfd248d1298c93e18a5352686b5e89fb730ede58a29400c62deb2bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a81d62ce9d72579cc4dd4778d22cf442919e08c7d83209fde12edc8a338684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b365c72e5e05c29295ba289a5bf795353c5221607dadd2ab8b4f642272d99fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfacdb41d11bf0b03bede308c267faddfd0241d99a4cbda2603c6539bc3a433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.142084    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.142114    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.142123    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.142138    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.142148    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:48Z","lastTransitionTime":"2026-03-20T15:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.156222    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.170452    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055b0251e049abda93d63985b27520b79b2596443df06427dd990fc974b52a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.186477    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.208624    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.224197    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.236429    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.244914    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.244983    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.244997    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.245016    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.245029    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:48Z","lastTransitionTime":"2026-03-20T15:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.251144    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.269749    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7fc7c3b6cf0ebabd03bf607d7da5f1221244499e74f0f13a94a8015113c518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.287498    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8a35dfa17acf7f7051d13b20cab55fe91645c8fa2773fed67baddae164b586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.300818    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.311109    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2070438cbc1bdd8e3c865f5aa62537649598e0f51effc9936b9cb1a630b3651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.331363    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708c4e4c-a9ad-466a-a3d0-a99ae3399934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://631f7e03c5476d53c6d6a3a3bf6a1d45ac40443386f072fc7c01129e9eaf7619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"nam
e\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d029581dfd248d1298c93e18a5352686b5e89fb730ede58a29400c62deb2bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a81d62ce9d72579cc4dd4778d22cf442919e08c7d83209fde12edc8a338684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b365c72e5e05c29295ba289a5bf795353c5221607dadd2ab8b4f642272d99fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019
bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfacdb41d11bf0b03bede308c267faddfd0241d99a4cbda2603c6539bc3a433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.348826    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.351274    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.351323    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.351339    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.351363    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.351380    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:48Z","lastTransitionTime":"2026-03-20T15:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.391177    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1242122dff682493181de927aefeb2ac4e5274621d865ee89ccec2e60e86469b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8117d479604ca09fc23e043f373aa3acbffb209d9fbfae95f2eb907424712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.411538    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.423214    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e20a286d6affcba7ffa950ef5386e7f439c9a02381cb8b7d3bc51ad9c4f343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd661cda796a2bd61d1446bee672c4471d60370245caf7cb54faf54dfa9c58a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.432766    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35fa36f52355ad1fd24bd332fc5b0a5ff35ac2be71dd16b53449d9f47e48e8ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.444424    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c34c742c6da6c6d35f815901234a0c12c9628d22fa83e511bddc78eae4373cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c
6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\
\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.453814    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.453853    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.453861    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.453873    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.453883    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:48Z","lastTransitionTime":"2026-03-20T15:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.454291    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.464907    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055b0251e049abda93d63985b27520b79b2596443df06427dd990fc974b52a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.477490    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.490286    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\
\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.501225    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.513565    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8a35dfa17acf7f7051d13b20cab55fe91645c8fa2773fed67baddae164b586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.526055    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.532604    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.532647    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:48 crc kubenswrapper[4730]: E0320 15:40:48.532731    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:40:48 crc kubenswrapper[4730]: E0320 15:40:48.532869    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.533007    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:40:48 crc kubenswrapper[4730]: E0320 15:40:48.533095    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.533325    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:40:48 crc kubenswrapper[4730]: E0320 15:40:48.533429    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.546357    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7fc7c3b6cf0ebabd03bf607d7da5f1221244499e74f0f13a94a8015113c518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:48Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.555950    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.555999    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.556016    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.556042    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.556064    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:48Z","lastTransitionTime":"2026-03-20T15:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.658798    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.658866    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.658883    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.658956    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.658979    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:48Z","lastTransitionTime":"2026-03-20T15:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.762069    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.762147    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.762168    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.762196    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.762214    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:48Z","lastTransitionTime":"2026-03-20T15:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.865508    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.865991    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.866010    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.866038    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.866057    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:48Z","lastTransitionTime":"2026-03-20T15:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.969174    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.969219    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.969232    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.969269    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:48 crc kubenswrapper[4730]: I0320 15:40:48.969284    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:48Z","lastTransitionTime":"2026-03-20T15:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.017910    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.018566    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.041659    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.059139    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:49Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.070426    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2070438cbc1bdd8e3c865f5aa62537649598e0f51effc9936b9cb1a630b3651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:49Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.076208    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.076268    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.076282    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.076301    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.076656    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:49Z","lastTransitionTime":"2026-03-20T15:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.084513    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e20a286d6affcba7ffa950ef5386e7f439c9a02381cb8b7d3bc51ad9c4f343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd661cda796a2bd61d1446bee672c4471d60370245caf7cb54faf54dfa9c58a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:49Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.094986    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35fa36f52355ad1fd24bd332fc5b0a5ff35ac2be71dd16b53449d9f47e48e8ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:49Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.112793    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c34c742c6da6c6d35f815901234a0c12c9628d22fa83e511bddc78eae4373cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:49Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.135146    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708c4e4c-a9ad-466a-a3d0-a99ae3399934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://631f7e03c5476d53c6d6a3a3bf6a1d45ac40443386f072fc7c01129e9eaf7619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d029581dfd248d1298c93e18a5352686b5e89fb730ede58a29400c62deb2bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a81d62ce9d72579cc4dd4778d22cf442919e08c7d83209fde12edc8a338684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b365c72e5e05c29295ba289a5bf795353c5221607dadd2ab8b4f642272d99fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfacdb41d11bf0b03bede308c267faddfd0241d99a4cbda2603c6539bc3a433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:49Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.151947    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:49Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.165205    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1242122dff682493181de927aefeb2ac4e5274621d865ee89ccec2e60e86469b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8117d479604ca09fc23e043f373aa3acbffb209d9fbfae95f2eb907424712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:49Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.177727    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:49Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.179681    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.179710    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.179719    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.179737    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.179746    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:49Z","lastTransitionTime":"2026-03-20T15:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.189026    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\
\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:49Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.199873    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:49Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.214514    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:49Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.229997    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055b0251e049abda93d63985b27520b79b2596443df06427dd990fc974b52a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:49Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.243962    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:49Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.261177    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8a35dfa17acf7f7051d13b20cab55fe91645c8fa2773fed67baddae164b586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:49Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.279016    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:49Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.282778    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.282841    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.282858    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.282880    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.282895    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:49Z","lastTransitionTime":"2026-03-20T15:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.297181    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7fc7c3b6cf0ebabd03bf607d7da5f1221244499e74f0f13a94a8015113c518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:49Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.386322    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.386728    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.386854    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.386935    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.387018    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:49Z","lastTransitionTime":"2026-03-20T15:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.490869    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.491140    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.491231    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.491311    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.491376    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:49Z","lastTransitionTime":"2026-03-20T15:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.594373    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.594536    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.594631    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.594721    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.594811    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:49Z","lastTransitionTime":"2026-03-20T15:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.696993    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.697059    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.697074    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.697097    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.697142    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:49Z","lastTransitionTime":"2026-03-20T15:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.799593    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.799665    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.799678    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.799694    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.799717    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:49Z","lastTransitionTime":"2026-03-20T15:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.903158    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.903225    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.903245    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.903311    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:49 crc kubenswrapper[4730]: I0320 15:40:49.903331    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:49Z","lastTransitionTime":"2026-03-20T15:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.005616    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.005846    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.005925    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.006005    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.006128    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:50Z","lastTransitionTime":"2026-03-20T15:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.022157    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj97f_c4b4e0e8-af33-491e-b1d1-31079d90c656/ovnkube-controller/0.log"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.025232    4730 generic.go:334] "Generic (PLEG): container finished" podID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerID="1c7fc7c3b6cf0ebabd03bf607d7da5f1221244499e74f0f13a94a8015113c518" exitCode=1
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.025273    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" event={"ID":"c4b4e0e8-af33-491e-b1d1-31079d90c656","Type":"ContainerDied","Data":"1c7fc7c3b6cf0ebabd03bf607d7da5f1221244499e74f0f13a94a8015113c518"}
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.025820    4730 scope.go:117] "RemoveContainer" containerID="1c7fc7c3b6cf0ebabd03bf607d7da5f1221244499e74f0f13a94a8015113c518"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.039234    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c34c742c6da6c6d35f815901234a0c12c9628d22fa83e511bddc78eae4373cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c
6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\
\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:50Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.065168    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708c4e4c-a9ad-466a-a3d0-a99ae3399934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://631f7e03c5476d53c6d6a3a3bf6a1d45ac40443386f072fc7c01129e9eaf7619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"nam
e\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d029581dfd248d1298c93e18a5352686b5e89fb730ede58a29400c62deb2bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a81d62ce9d72579cc4dd4778d22cf442919e08c7d83209fde12edc8a338684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b365c72e5e05c29295ba289a5bf795353c5221607dadd2ab8b4f642272d99fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019
bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfacdb41d11bf0b03bede308c267faddfd0241d99a4cbda2603c6539bc3a433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:50Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.076463    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:50Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.089228    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1242122dff682493181de927aefeb2ac4e5274621d865ee89ccec2e60e86469b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8117d479604ca09fc23e043f373aa3acbffb209d9fbfae95f2eb907424712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:50Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.101303    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:50Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.108285    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.108328    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.108342    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.108407    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.108434    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:50Z","lastTransitionTime":"2026-03-20T15:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.112518    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e20a286d6affcba7ffa950ef5386e7f439c9a02381cb8b7d3bc51ad9c4f343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd661cda796a2bd61d1446bee672c4471d60370245caf7cb54faf54dfa9c58a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:50Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.123302    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35fa36f52355ad1fd24bd332fc5b0a5ff35ac2be71dd16b53449d9f47e48e8ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:50Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.133328    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:50Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.149197    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055b0251e049abda93d63985b27520b79b2596443df06427dd990fc974b52a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:50Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.160629    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:50Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.174839    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:50Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.185509    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:50Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.196731    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8a35dfa17acf7f7051d13b20cab55fe91645c8fa2773fed67baddae164b586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:50Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.207848    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:50Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.211655    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.211693    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.211705    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.211722    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.211733    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:50Z","lastTransitionTime":"2026-03-20T15:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.224483    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7fc7c3b6cf0ebabd03bf607d7da5f1221244499e74f0f13a94a8015113c518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c7fc7c3b6cf0ebabd03bf607d7da5f1221244499e74f0f13a94a8015113c518\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:40:49Z\\\",\\\"message\\\":\\\"ice/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 15:40:49.565874    6588 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 15:40:49.565925    6588 handler.go:190] Sending *v1.Node 
event handler 2 for removal\\\\nI0320 15:40:49.565931    6588 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 15:40:49.565977    6588 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 15:40:49.565987    6588 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 15:40:49.565980    6588 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 15:40:49.566007    6588 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 15:40:49.566013    6588 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 15:40:49.566012    6588 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 15:40:49.566033    6588 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 15:40:49.566034    6588 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 15:40:49.566047    6588 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 15:40:49.566043    6588 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 15:40:49.566092    6588 factory.go:656] Stopping watch factory\\\\nI0320 15:40:49.566129    6588 ovnkube.go:599] Stopped ovnkube\\\\nI0320 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:50Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.236425    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:50Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.244622    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2070438cbc1bdd8e3c865f5aa62537649598e0f51effc9936b9cb1a630b3651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:50Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.313950    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.314000    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.314017    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.314038    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.314055    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:50Z","lastTransitionTime":"2026-03-20T15:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.417932    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.417973    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.417983    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.417998    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.418007    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:50Z","lastTransitionTime":"2026-03-20T15:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.520402    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.520445    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.520459    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.520476    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.520489    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:50Z","lastTransitionTime":"2026-03-20T15:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.532746    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.532774    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.532838    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.532913    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:40:50 crc kubenswrapper[4730]: E0320 15:40:50.532908    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:40:50 crc kubenswrapper[4730]: E0320 15:40:50.533050    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:40:50 crc kubenswrapper[4730]: E0320 15:40:50.533121    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:40:50 crc kubenswrapper[4730]: E0320 15:40:50.533186    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.623114    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.623156    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.623164    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.623179    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.623189    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:50Z","lastTransitionTime":"2026-03-20T15:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.725789    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.725823    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.725831    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.725844    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.725853    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:50Z","lastTransitionTime":"2026-03-20T15:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.827869    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.827912    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.827921    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.827936    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.827950    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:50Z","lastTransitionTime":"2026-03-20T15:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.929769    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.929803    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.929811    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.929824    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:50 crc kubenswrapper[4730]: I0320 15:40:50.929832    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:50Z","lastTransitionTime":"2026-03-20T15:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.028704    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj97f_c4b4e0e8-af33-491e-b1d1-31079d90c656/ovnkube-controller/1.log"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.029408    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj97f_c4b4e0e8-af33-491e-b1d1-31079d90c656/ovnkube-controller/0.log"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.032414    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.032448    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.032460    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.032493    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.032503    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:51Z","lastTransitionTime":"2026-03-20T15:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.033127    4730 generic.go:334] "Generic (PLEG): container finished" podID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerID="7695af707a9a2e3b80ba56e6a4b97e35b5df9886333ea9cbb7f3852243b61ad7" exitCode=1
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.033159    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" event={"ID":"c4b4e0e8-af33-491e-b1d1-31079d90c656","Type":"ContainerDied","Data":"7695af707a9a2e3b80ba56e6a4b97e35b5df9886333ea9cbb7f3852243b61ad7"}
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.033188    4730 scope.go:117] "RemoveContainer" containerID="1c7fc7c3b6cf0ebabd03bf607d7da5f1221244499e74f0f13a94a8015113c518"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.033774    4730 scope.go:117] "RemoveContainer" containerID="7695af707a9a2e3b80ba56e6a4b97e35b5df9886333ea9cbb7f3852243b61ad7"
Mar 20 15:40:51 crc kubenswrapper[4730]: E0320 15:40:51.034011    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-qj97f_openshift-ovn-kubernetes(c4b4e0e8-af33-491e-b1d1-31079d90c656)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.052281    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8a35dfa17acf7f7051d13b20cab55fe91645c8fa2773fed67baddae164b586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.068854    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.087074    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7695af707a9a2e3b80ba56e6a4b97e35b5df9886333ea9cbb7f3852243b61ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c7fc7c3b6cf0ebabd03bf607d7da5f1221244499e74f0f13a94a8015113c518\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:40:49Z\\\",\\\"message\\\":\\\"ice/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 15:40:49.565874    6588 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 15:40:49.565925    6588 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 15:40:49.565931    6588 handler.go:190] Sending *v1.Node event handler 7 for 
removal\\\\nI0320 15:40:49.565977    6588 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 15:40:49.565987    6588 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 15:40:49.565980    6588 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 15:40:49.566007    6588 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 15:40:49.566013    6588 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 15:40:49.566012    6588 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 15:40:49.566033    6588 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 15:40:49.566034    6588 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 15:40:49.566047    6588 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 15:40:49.566043    6588 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 15:40:49.566092    6588 factory.go:656] Stopping watch factory\\\\nI0320 15:40:49.566129    6588 ovnkube.go:599] Stopped ovnkube\\\\nI0320 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7695af707a9a2e3b80ba56e6a4b97e35b5df9886333ea9cbb7f3852243b61ad7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:40:50Z\\\",\\\"message\\\":\\\"0.980093    6718 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 15:40:50.980099    6718 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 15:40:50.980105    6718 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 15:40:50.980374    6718 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:40:50.980766    6718 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:40:50.981084    6718 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 15:40:50.981851    6718 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:40:50.982080    6718 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 15:40:50.982550    6718 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 15:40:50.982574    6718 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 15:40:50.982620    6718 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 15:40:50.982656    6718 factory.go:656] Stopping watch factory\\\\nI0320 15:40:50.982676    6718 ovnkube.go:599] Stopped ovnkube\\\\nI0320 15\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-b
in\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.102700    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.111306    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2070438cbc1bdd8e3c865f5aa62537649598e0f51effc9936b9cb1a630b3651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.128925    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c34c742c6da6c6d35f815901234a0c12c9628d22fa83e511bddc78eae4373cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c
6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\
\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.134674    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.134722    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.134736    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.134753    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.134783    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:51Z","lastTransitionTime":"2026-03-20T15:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.148039    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708c4e4c-a9ad-466a-a3d0-a99ae3399934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://631f7e03c5476d53c6d6a3a3bf6a1d45ac40443386f072fc7c01129e9eaf7619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"nam
e\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d029581dfd248d1298c93e18a5352686b5e89fb730ede58a29400c62deb2bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a81d62ce9d72579cc4dd4778d22cf442919e08c7d83209fde12edc8a338684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b365c72e5e05c29295ba289a5bf795353c5221607dadd2ab8b4f642272d99fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019
bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfacdb41d11bf0b03bede308c267faddfd0241d99a4cbda2603c6539bc3a433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.163878    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.174164    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1242122dff682493181de927aefeb2ac4e5274621d865ee89ccec2e60e86469b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8117d479604ca09fc23e043f373aa3acbffb209d9fbfae95f2eb907424712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.189969    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.203008    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e20a286d6affcba7ffa950ef5386e7f439c9a02381cb8b7d3bc51ad9c4f343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd661cda796a2bd61d1446bee672c4471d60370245caf7cb54faf54dfa9c58a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.216718    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35fa36f52355ad1fd24bd332fc5b0a5ff35ac2be71dd16b53449d9f47e48e8ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.228117    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.238059    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.238135    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.238159    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.238190    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.238216    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:51Z","lastTransitionTime":"2026-03-20T15:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.240850    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055b0251e049abda93d63985b27520b79b2596443df06427dd990fc974b52a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.256382    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.269923    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\
\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.282462    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.340814    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.340850    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.340860    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.340874    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.340884    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:51Z","lastTransitionTime":"2026-03-20T15:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.443097    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.443141    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.443153    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.443169    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.443182    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:51Z","lastTransitionTime":"2026-03-20T15:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.548746    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.549044    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.549164    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.549303    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.549403    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:51Z","lastTransitionTime":"2026-03-20T15:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.555273    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.571949    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1242122dff682493181de927aefeb2ac4e5274621d865ee89ccec2e60e86469b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8117d479604ca09fc23e043f373aa3acbffb209d9fbfae95f2eb907424712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.594165    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.604983    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e20a286d6affcba7ffa950ef5386e7f439c9a02381cb8b7d3bc51ad9c4f343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd661cda796a2bd61d1446bee672c4471d60370245caf7cb54faf54dfa9c58a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.614451    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35fa36f52355ad1fd24bd332fc5b0a5ff35ac2be71dd16b53449d9f47e48e8ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.629026    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c34c742c6da6c6d35f815901234a0c12c9628d22fa83e511bddc78eae4373cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c
6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\
\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.647805    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708c4e4c-a9ad-466a-a3d0-a99ae3399934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://631f7e03c5476d53c6d6a3a3bf6a1d45ac40443386f072fc7c01129e9eaf7619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"nam
e\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d029581dfd248d1298c93e18a5352686b5e89fb730ede58a29400c62deb2bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a81d62ce9d72579cc4dd4778d22cf442919e08c7d83209fde12edc8a338684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b365c72e5e05c29295ba289a5bf795353c5221607dadd2ab8b4f642272d99fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019
bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfacdb41d11bf0b03bede308c267faddfd0241d99a4cbda2603c6539bc3a433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.651681    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.651710    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.651720    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.651735    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.651746    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:51Z","lastTransitionTime":"2026-03-20T15:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.660804    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.670933    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055b0251e049abda93d63985b27520b79b2596443df06427dd990fc974b52a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.682746    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.695120    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\
\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.706976    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.719543    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8a35dfa17acf7f7051d13b20cab55fe91645c8fa2773fed67baddae164b586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.736855    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.754283    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.754342    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.754359    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.754379    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.754393    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:51Z","lastTransitionTime":"2026-03-20T15:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.767307    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7695af707a9a2e3b80ba56e6a4b97e35b5df9886333ea9cbb7f3852243b61ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c7fc7c3b6cf0ebabd03bf607d7da5f1221244499e74f0f13a94a8015113c518\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:40:49Z\\\",\\\"message\\\":\\\"ice/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 15:40:49.565874    6588 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 15:40:49.565925    6588 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 15:40:49.565931    6588 handler.go:190] Sending *v1.Node event handler 7 for 
removal\\\\nI0320 15:40:49.565977    6588 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 15:40:49.565987    6588 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 15:40:49.565980    6588 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 15:40:49.566007    6588 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 15:40:49.566013    6588 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 15:40:49.566012    6588 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 15:40:49.566033    6588 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 15:40:49.566034    6588 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 15:40:49.566047    6588 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 15:40:49.566043    6588 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 15:40:49.566092    6588 factory.go:656] Stopping watch factory\\\\nI0320 15:40:49.566129    6588 ovnkube.go:599] Stopped ovnkube\\\\nI0320 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7695af707a9a2e3b80ba56e6a4b97e35b5df9886333ea9cbb7f3852243b61ad7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:40:50Z\\\",\\\"message\\\":\\\"0.980093    6718 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 15:40:50.980099    6718 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 15:40:50.980105    6718 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 15:40:50.980374    6718 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:40:50.980766    6718 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:40:50.981084    6718 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 15:40:50.981851    6718 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:40:50.982080    6718 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 15:40:50.982550    6718 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 15:40:50.982574    6718 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 15:40:50.982620    6718 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 15:40:50.982656    6718 factory.go:656] Stopping watch factory\\\\nI0320 15:40:50.982676    6718 ovnkube.go:599] Stopped ovnkube\\\\nI0320 15\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-b
in\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.786174    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2070438cbc1bdd8e3c865f5aa62537649598e0f51effc9936b9cb1a630b3651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.799957    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.856357    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.856413    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.856433    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.856455    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.856472    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:51Z","lastTransitionTime":"2026-03-20T15:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.958524    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.958559    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.958570    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.958584    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:51 crc kubenswrapper[4730]: I0320 15:40:51.958595    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:51Z","lastTransitionTime":"2026-03-20T15:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.038097    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj97f_c4b4e0e8-af33-491e-b1d1-31079d90c656/ovnkube-controller/1.log"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.041045    4730 scope.go:117] "RemoveContainer" containerID="7695af707a9a2e3b80ba56e6a4b97e35b5df9886333ea9cbb7f3852243b61ad7"
Mar 20 15:40:52 crc kubenswrapper[4730]: E0320 15:40:52.041196    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-qj97f_openshift-ovn-kubernetes(c4b4e0e8-af33-491e-b1d1-31079d90c656)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.052040    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:52Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.060666    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.060700    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.060711    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.060726    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.060739    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:52Z","lastTransitionTime":"2026-03-20T15:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.062272    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2070438cbc1bdd8e3c865f5aa62537649598e0f51effc9936b9cb1a630b3651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:52Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.075660    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c34c742c6da6c6d35f815901234a0c12c9628d22fa83e511bddc78eae4373cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c
6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\
\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:52Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.093405    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708c4e4c-a9ad-466a-a3d0-a99ae3399934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://631f7e03c5476d53c6d6a3a3bf6a1d45ac40443386f072fc7c01129e9eaf7619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"nam
e\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d029581dfd248d1298c93e18a5352686b5e89fb730ede58a29400c62deb2bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a81d62ce9d72579cc4dd4778d22cf442919e08c7d83209fde12edc8a338684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b365c72e5e05c29295ba289a5bf795353c5221607dadd2ab8b4f642272d99fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019
bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfacdb41d11bf0b03bede308c267faddfd0241d99a4cbda2603c6539bc3a433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:52Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.112319    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:52Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.123848    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1242122dff682493181de927aefeb2ac4e5274621d865ee89ccec2e60e86469b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8117d479604ca09fc23e043f373aa3acbffb209d9fbfae95f2eb907424712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:52Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.134484    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:52Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.147086    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e20a286d6affcba7ffa950ef5386e7f439c9a02381cb8b7d3bc51ad9c4f343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd661cda796a2bd61d1446bee672c4471d60370245caf7cb54faf54dfa9c58a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:52Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.155240    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35fa36f52355ad1fd24bd332fc5b0a5ff35ac2be71dd16b53449d9f47e48e8ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:52Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.163656    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:52Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.165414    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.165467    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.165481    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.165522    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.165534    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:52Z","lastTransitionTime":"2026-03-20T15:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.173327    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055b0251e049abda93d63985b27520b79b2596443df06427dd990fc974b52a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:52Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.183268    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:52Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.193513    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\
\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:52Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.202304    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:52Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.214753    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8a35dfa17acf7f7051d13b20cab55fe91645c8fa2773fed67baddae164b586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:52Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.225994    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:52Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.242776    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7695af707a9a2e3b80ba56e6a4b97e35b5df9886333ea9cbb7f3852243b61ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7695af707a9a2e3b80ba56e6a4b97e35b5df9886333ea9cbb7f3852243b61ad7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:40:50Z\\\",\\\"message\\\":\\\"0.980093    6718 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 15:40:50.980099    6718 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 15:40:50.980105    6718 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 15:40:50.980374    6718 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:40:50.980766    6718 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:40:50.981084    6718 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 15:40:50.981851    6718 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:40:50.982080    6718 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 15:40:50.982550    6718 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 15:40:50.982574    6718 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 15:40:50.982620    6718 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 15:40:50.982656    6718 factory.go:656] Stopping watch factory\\\\nI0320 15:40:50.982676    6718 ovnkube.go:599] Stopped ovnkube\\\\nI0320 15\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qj97f_openshift-ovn-kubernetes(c4b4e0e8-af33-491e-b1d1-31079d90c656)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063
074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:52Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.267770    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.267804    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.267835    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.267852    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.267864    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:52Z","lastTransitionTime":"2026-03-20T15:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.370728    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.370802    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.370814    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.370884    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.370895    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:52Z","lastTransitionTime":"2026-03-20T15:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.475038    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.475139    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.475168    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.475201    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.475225    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:52Z","lastTransitionTime":"2026-03-20T15:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.532361    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.532386    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:40:52 crc kubenswrapper[4730]: E0320 15:40:52.532527    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.532364    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.532596    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:40:52 crc kubenswrapper[4730]: E0320 15:40:52.532760    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:40:52 crc kubenswrapper[4730]: E0320 15:40:52.532956    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:40:52 crc kubenswrapper[4730]: E0320 15:40:52.533073    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.578504    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.578673    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.578701    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.578733    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.578757    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:52Z","lastTransitionTime":"2026-03-20T15:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.681971    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.682037    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.682063    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.682096    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.682119    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:52Z","lastTransitionTime":"2026-03-20T15:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.785161    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.785223    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.785234    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.785287    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.785303    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:52Z","lastTransitionTime":"2026-03-20T15:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.872232    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.888124    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.888204    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.888231    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.888326    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.888356    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:52Z","lastTransitionTime":"2026-03-20T15:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.893225    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8a35dfa17acf7f7051d13b20cab55fe91645c8fa2773fed67baddae164b586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:52Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.909851    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:52Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.941728    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7695af707a9a2e3b80ba56e6a4b97e35b5df9886333ea9cbb7f3852243b61ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7695af707a9a2e3b80ba56e6a4b97e35b5df9886333ea9cbb7f3852243b61ad7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:40:50Z\\\",\\\"message\\\":\\\"0.980093    6718 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 15:40:50.980099    6718 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 15:40:50.980105    6718 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 15:40:50.980374    6718 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:40:50.980766    6718 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:40:50.981084    6718 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 15:40:50.981851    6718 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:40:50.982080    6718 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 15:40:50.982550    6718 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 15:40:50.982574    6718 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 15:40:50.982620    6718 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 15:40:50.982656    6718 factory.go:656] Stopping watch factory\\\\nI0320 15:40:50.982676    6718 ovnkube.go:599] Stopped ovnkube\\\\nI0320 15\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qj97f_openshift-ovn-kubernetes(c4b4e0e8-af33-491e-b1d1-31079d90c656)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063
074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:52Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.959430    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:52Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.972263    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2070438cbc1bdd8e3c865f5aa62537649598e0f51effc9936b9cb1a630b3651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:52Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.987772    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c34c742c6da6c6d35f815901234a0c12c9628d22fa83e511bddc78eae4373cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c
6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\
\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:52Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.991391    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.991453    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.991471    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.991496    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:52 crc kubenswrapper[4730]: I0320 15:40:52.991513    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:52Z","lastTransitionTime":"2026-03-20T15:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.010276    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708c4e4c-a9ad-466a-a3d0-a99ae3399934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://631f7e03c5476d53c6d6a3a3bf6a1d45ac40443386f072fc7c01129e9eaf7619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"nam
e\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d029581dfd248d1298c93e18a5352686b5e89fb730ede58a29400c62deb2bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a81d62ce9d72579cc4dd4778d22cf442919e08c7d83209fde12edc8a338684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b365c72e5e05c29295ba289a5bf795353c5221607dadd2ab8b4f642272d99fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019
bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfacdb41d11bf0b03bede308c267faddfd0241d99a4cbda2603c6539bc3a433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:53Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.029995    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\
\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:53Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.044016    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1242122dff682493181de927aefeb2ac4e5274621d865ee89ccec2e60e86469b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8117d479604ca09fc23e043f373aa3acbffb209d9fbfae95f2eb907424712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:53Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.055629    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:53Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.069929    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e20a286d6affcba7ffa950ef5386e7f439c9a02381cb8b7d3bc51ad9c4f343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd661cda796a2bd61d1446bee672c4471d60370245caf7cb54faf54dfa9c58a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:53Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.083350    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35fa36f52355ad1fd24bd332fc5b0a5ff35ac2be71dd16b53449d9f47e48e8ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:53Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.094614    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.094656    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.094668    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.094686    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.094697    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:53Z","lastTransitionTime":"2026-03-20T15:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.096319    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:53Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.111566    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055b0251e049abda93d63985b27520b79b2596443df06427dd990fc974b52a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:53Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.128277    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:53Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.149781    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\
\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:53Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.164067    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:53Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.197769    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.197823    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.197838    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.197857    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.197868    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:53Z","lastTransitionTime":"2026-03-20T15:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.300945    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.301006    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.301021    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.301044    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.301062    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:53Z","lastTransitionTime":"2026-03-20T15:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.403495    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.403578    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.403601    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.403629    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.403649    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:53Z","lastTransitionTime":"2026-03-20T15:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.507010    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.507097    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.507121    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.507150    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.507171    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:53Z","lastTransitionTime":"2026-03-20T15:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.610237    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.610346    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.610370    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.610406    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.610427    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:53Z","lastTransitionTime":"2026-03-20T15:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.713195    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.713313    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.713332    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.713365    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.713390    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:53Z","lastTransitionTime":"2026-03-20T15:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.757771    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.757846    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.757871    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.757909    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.757935    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:53Z","lastTransitionTime":"2026-03-20T15:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:53 crc kubenswrapper[4730]: E0320 15:40:53.780319    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:53Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.785146    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.785185    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.785194    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.785210    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.785221    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:53Z","lastTransitionTime":"2026-03-20T15:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:53 crc kubenswrapper[4730]: E0320 15:40:53.802347    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:53Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.808488    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.808544    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.808565    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.808593    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.808611    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:53Z","lastTransitionTime":"2026-03-20T15:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:53 crc kubenswrapper[4730]: E0320 15:40:53.829184    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:53Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.835934    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.835991    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.836009    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.836035    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.836057    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:53Z","lastTransitionTime":"2026-03-20T15:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:53 crc kubenswrapper[4730]: E0320 15:40:53.857205    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:53Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.862772    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.862825    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.862843    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.862870    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.862890    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:53Z","lastTransitionTime":"2026-03-20T15:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:53 crc kubenswrapper[4730]: E0320 15:40:53.879841    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:40:53Z is after 2025-08-24T17:21:41Z"
Mar 20 15:40:53 crc kubenswrapper[4730]: E0320 15:40:53.879965    4730 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.881733    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.881775    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.881790    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.881809    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.881822    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:53Z","lastTransitionTime":"2026-03-20T15:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.986218    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.986333    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.986353    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.986379    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:53 crc kubenswrapper[4730]: I0320 15:40:53.986400    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:53Z","lastTransitionTime":"2026-03-20T15:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.089572    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.089634    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.089653    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.089681    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.089705    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:54Z","lastTransitionTime":"2026-03-20T15:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.191944    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.191999    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.192010    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.192029    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.192040    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:54Z","lastTransitionTime":"2026-03-20T15:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.294113    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.294168    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.294183    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.294202    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.294219    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:54Z","lastTransitionTime":"2026-03-20T15:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.397372    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.397418    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.397465    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.397486    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.397499    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:54Z","lastTransitionTime":"2026-03-20T15:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.499886    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.499922    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.499931    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.499945    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.499953    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:54Z","lastTransitionTime":"2026-03-20T15:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.532386    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:40:54 crc kubenswrapper[4730]: E0320 15:40:54.532498    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.532670    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.532691    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:40:54 crc kubenswrapper[4730]: E0320 15:40:54.532740    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:40:54 crc kubenswrapper[4730]: E0320 15:40:54.532903    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.533057    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:54 crc kubenswrapper[4730]: E0320 15:40:54.533276    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.603416    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.603496    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.603512    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.603533    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.603548    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:54Z","lastTransitionTime":"2026-03-20T15:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.707635    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.707727    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.707747    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.707770    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.707788    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:54Z","lastTransitionTime":"2026-03-20T15:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.811637    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.811703    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.811721    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.811746    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.811764    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:54Z","lastTransitionTime":"2026-03-20T15:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.915228    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.915311    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.915328    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.915352    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:54 crc kubenswrapper[4730]: I0320 15:40:54.915370    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:54Z","lastTransitionTime":"2026-03-20T15:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.018994    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.019058    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.019095    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.019130    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.019153    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:55Z","lastTransitionTime":"2026-03-20T15:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.123183    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.123291    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.123318    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.123349    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.123370    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:55Z","lastTransitionTime":"2026-03-20T15:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.226607    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.226664    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.226682    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.226705    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.226723    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:55Z","lastTransitionTime":"2026-03-20T15:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.329053    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.329125    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.329144    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.329174    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.329196    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:55Z","lastTransitionTime":"2026-03-20T15:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.432748    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.432811    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.432829    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.432853    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.432871    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:55Z","lastTransitionTime":"2026-03-20T15:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.534960    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.535007    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.535027    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.535045    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.535058    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:55Z","lastTransitionTime":"2026-03-20T15:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.637591    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.637635    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.637646    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.637662    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.637675    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:55Z","lastTransitionTime":"2026-03-20T15:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.739811    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.739864    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.739879    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.739899    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.739913    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:55Z","lastTransitionTime":"2026-03-20T15:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.842483    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.842536    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.842549    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.842566    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.842579    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:55Z","lastTransitionTime":"2026-03-20T15:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.945091    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.945126    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.945135    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.945149    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:55 crc kubenswrapper[4730]: I0320 15:40:55.945158    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:55Z","lastTransitionTime":"2026-03-20T15:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.047035    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.047072    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.047089    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.047110    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.047121    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:56Z","lastTransitionTime":"2026-03-20T15:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.149899    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.149949    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.149961    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.149979    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.149988    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:56Z","lastTransitionTime":"2026-03-20T15:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.252639    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.252700    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.252717    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.252744    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.252764    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:56Z","lastTransitionTime":"2026-03-20T15:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.355467    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.355501    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.355509    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.355523    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.355532    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:56Z","lastTransitionTime":"2026-03-20T15:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.458688    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.458741    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.458759    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.458783    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.458799    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:56Z","lastTransitionTime":"2026-03-20T15:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.532831    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.532925    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:40:56 crc kubenswrapper[4730]: E0320 15:40:56.533067    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.533112    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.532873    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:56 crc kubenswrapper[4730]: E0320 15:40:56.533231    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:40:56 crc kubenswrapper[4730]: E0320 15:40:56.533388    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:40:56 crc kubenswrapper[4730]: E0320 15:40:56.533554    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.561381    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.561451    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.561470    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.561496    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.561516    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:56Z","lastTransitionTime":"2026-03-20T15:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.664781    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.664850    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.664868    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.664920    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.664941    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:56Z","lastTransitionTime":"2026-03-20T15:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.768165    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.768225    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.768267    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.768292    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.768305    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:56Z","lastTransitionTime":"2026-03-20T15:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.871331    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.871383    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.871398    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.871417    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.871429    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:56Z","lastTransitionTime":"2026-03-20T15:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.973956    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.974012    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.974029    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.974056    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:56 crc kubenswrapper[4730]: I0320 15:40:56.974076    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:56Z","lastTransitionTime":"2026-03-20T15:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.076284    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.076573    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.076643    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.076705    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.076766    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:57Z","lastTransitionTime":"2026-03-20T15:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.179982    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.180051    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.180074    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.180114    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.180134    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:57Z","lastTransitionTime":"2026-03-20T15:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.283661    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.283728    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.283747    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.283776    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.283795    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:57Z","lastTransitionTime":"2026-03-20T15:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.386795    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.386854    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.386873    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.386898    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.386916    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:57Z","lastTransitionTime":"2026-03-20T15:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.490145    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.490194    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.490206    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.490223    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.490235    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:57Z","lastTransitionTime":"2026-03-20T15:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.592398    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.592462    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.592480    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.592546    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.592569    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:57Z","lastTransitionTime":"2026-03-20T15:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.695595    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.696057    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.696070    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.696090    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.696103    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:57Z","lastTransitionTime":"2026-03-20T15:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.798977    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.799103    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.799131    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.799160    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.799182    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:57Z","lastTransitionTime":"2026-03-20T15:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.902239    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.902362    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.902384    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.902410    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:57 crc kubenswrapper[4730]: I0320 15:40:57.902428    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:57Z","lastTransitionTime":"2026-03-20T15:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.005690    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.005765    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.005785    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.005811    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.005832    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:58Z","lastTransitionTime":"2026-03-20T15:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.109022    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.109148    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.109183    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.109216    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.109240    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:58Z","lastTransitionTime":"2026-03-20T15:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.211912    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.211977    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.211999    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.212031    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.212052    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:58Z","lastTransitionTime":"2026-03-20T15:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.314875    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.314940    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.314957    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.314982    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.315016    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:58Z","lastTransitionTime":"2026-03-20T15:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.418711    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.419030    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.419140    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.419237    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.419374    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:58Z","lastTransitionTime":"2026-03-20T15:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.522946    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.522991    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.523007    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.523030    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.523050    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:58Z","lastTransitionTime":"2026-03-20T15:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.532773    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.532785    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.532842    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.532886    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:40:58 crc kubenswrapper[4730]: E0320 15:40:58.533621    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:40:58 crc kubenswrapper[4730]: E0320 15:40:58.533744    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:40:58 crc kubenswrapper[4730]: E0320 15:40:58.533911    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:40:58 crc kubenswrapper[4730]: E0320 15:40:58.533827    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.625393    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.625663    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.625745    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.625819    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.625885    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:58Z","lastTransitionTime":"2026-03-20T15:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.728713    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.728759    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.728773    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.728791    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.728802    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:58Z","lastTransitionTime":"2026-03-20T15:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.831914    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.831994    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.832015    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.832040    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.832058    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:58Z","lastTransitionTime":"2026-03-20T15:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.934577    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.934625    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.934634    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.934646    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:58 crc kubenswrapper[4730]: I0320 15:40:58.934655    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:58Z","lastTransitionTime":"2026-03-20T15:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.037671    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.038071    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.038214    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.038410    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.038552    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:59Z","lastTransitionTime":"2026-03-20T15:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.141689    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.141731    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.141742    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.141757    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.141768    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:59Z","lastTransitionTime":"2026-03-20T15:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.244431    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.244506    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.244532    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.244558    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.244575    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:59Z","lastTransitionTime":"2026-03-20T15:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.347908    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.347959    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.347978    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.348001    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.348018    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:59Z","lastTransitionTime":"2026-03-20T15:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.451910    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.451982    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.452003    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.452029    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.452050    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:59Z","lastTransitionTime":"2026-03-20T15:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.554102    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.554145    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.554157    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.554175    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.554187    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:59Z","lastTransitionTime":"2026-03-20T15:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.657752    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.657814    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.657827    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.657847    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.657859    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:59Z","lastTransitionTime":"2026-03-20T15:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.761558    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.761644    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.761671    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.761711    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.761742    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:59Z","lastTransitionTime":"2026-03-20T15:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.865672    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.866134    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.866321    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.866496    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.866668    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:59Z","lastTransitionTime":"2026-03-20T15:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.970795    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.970879    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.970903    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.970945    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:40:59 crc kubenswrapper[4730]: I0320 15:40:59.970967    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:40:59Z","lastTransitionTime":"2026-03-20T15:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.074498    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.074567    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.074594    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.074626    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.074652    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:00Z","lastTransitionTime":"2026-03-20T15:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.177999    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.178050    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.178065    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.178086    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.178101    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:00Z","lastTransitionTime":"2026-03-20T15:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.280973    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.281046    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.281061    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.281084    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.281103    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:00Z","lastTransitionTime":"2026-03-20T15:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.383478    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.383780    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.383854    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.383937    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.384010    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:00Z","lastTransitionTime":"2026-03-20T15:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.487812    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.487875    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.487888    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.487908    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.487923    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:00Z","lastTransitionTime":"2026-03-20T15:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.533111    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.533202    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.533290    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:41:00 crc kubenswrapper[4730]: E0320 15:41:00.533496    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:41:00 crc kubenswrapper[4730]: E0320 15:41:00.533733    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.533920    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:41:00 crc kubenswrapper[4730]: E0320 15:41:00.534004    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:41:00 crc kubenswrapper[4730]: E0320 15:41:00.534390    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.591511    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.591571    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.591588    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.591613    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.591635    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:00Z","lastTransitionTime":"2026-03-20T15:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.694418    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.694468    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.694512    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.694535    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.694551    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:00Z","lastTransitionTime":"2026-03-20T15:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.797543    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.797611    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.797634    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.797662    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.797686    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:00Z","lastTransitionTime":"2026-03-20T15:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.900492    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.900814    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.900961    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.901103    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:00 crc kubenswrapper[4730]: I0320 15:41:00.901307    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:00Z","lastTransitionTime":"2026-03-20T15:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.005413    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.005487    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.005504    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.005528    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.005548    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:01Z","lastTransitionTime":"2026-03-20T15:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.108058    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.108135    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.108158    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.108185    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.108206    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:01Z","lastTransitionTime":"2026-03-20T15:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.212383    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.212510    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.212539    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.212608    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.212634    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:01Z","lastTransitionTime":"2026-03-20T15:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.316483    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.316557    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.316585    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.316616    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.316639    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:01Z","lastTransitionTime":"2026-03-20T15:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.419854    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.419926    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.419949    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.419982    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.420005    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:01Z","lastTransitionTime":"2026-03-20T15:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:01 crc kubenswrapper[4730]: E0320 15:41:01.520897    4730 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.558428    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:01Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.574307    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2070438cbc1bdd8e3c865f5aa62537649598e0f51effc9936b9cb1a630b3651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:01Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.597413    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c34c742c6da6c6d35f815901234a0c12c9628d22fa83e511bddc78eae4373cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c
6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\
\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:01Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:01 crc kubenswrapper[4730]: E0320 15:41:01.629226    4730 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.632547    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708c4e4c-a9ad-466a-a3d0-a99ae3399934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://631f7e03c5476d53c6d6a3a3bf6a1d45ac40443386f072fc7c01129e9eaf7619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"nam
e\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d029581dfd248d1298c93e18a5352686b5e89fb730ede58a29400c62deb2bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a81d62ce9d72579cc4dd4778d22cf442919e08c7d83209fde12edc8a338684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b365c72e5e05c29295ba289a5bf795353c5221607dadd2ab8b4f642272d99fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019
bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfacdb41d11bf0b03bede308c267faddfd0241d99a4cbda2603c6539bc3a433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:01Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.657531    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\
\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:01Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.671540    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1242122dff682493181de927aefeb2ac4e5274621d865ee89ccec2e60e86469b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8117d479604ca09fc23e043f373aa3acbffb209d9fbfae95f2eb907424712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:01Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.688549    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:01Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.700810    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e20a286d6affcba7ffa950ef5386e7f439c9a02381cb8b7d3bc51ad9c4f343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd661cda796a2bd61d1446bee672c4471d60370245caf7cb54faf54dfa9c58a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:01Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.713212    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35fa36f52355ad1fd24bd332fc5b0a5ff35ac2be71dd16b53449d9f47e48e8ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:01Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.725965    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:01Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.740463    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055b0251e049abda93d63985b27520b79b2596443df06427dd990fc974b52a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:01Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.758828    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:01Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.772880    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\
\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:01Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.786739    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:01Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.806996    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8a35dfa17acf7f7051d13b20cab55fe91645c8fa2773fed67baddae164b586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:01Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.824810    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:01Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:01 crc kubenswrapper[4730]: I0320 15:41:01.846648    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7695af707a9a2e3b80ba56e6a4b97e35b5df9886333ea9cbb7f3852243b61ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7695af707a9a2e3b80ba56e6a4b97e35b5df9886333ea9cbb7f3852243b61ad7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:40:50Z\\\",\\\"message\\\":\\\"0.980093    6718 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 15:40:50.980099    6718 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 15:40:50.980105    6718 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 15:40:50.980374    6718 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:40:50.980766    6718 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:40:50.981084    6718 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 15:40:50.981851    6718 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:40:50.982080    6718 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 15:40:50.982550    6718 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 15:40:50.982574    6718 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 15:40:50.982620    6718 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 15:40:50.982656    6718 factory.go:656] Stopping watch factory\\\\nI0320 15:40:50.982676    6718 ovnkube.go:599] Stopped ovnkube\\\\nI0320 15\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qj97f_openshift-ovn-kubernetes(c4b4e0e8-af33-491e-b1d1-31079d90c656)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063
074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:01Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:02 crc kubenswrapper[4730]: I0320 15:41:02.532405    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:41:02 crc kubenswrapper[4730]: I0320 15:41:02.532488    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:41:02 crc kubenswrapper[4730]: I0320 15:41:02.532493    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:41:02 crc kubenswrapper[4730]: I0320 15:41:02.532447    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:41:02 crc kubenswrapper[4730]: E0320 15:41:02.533195    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:41:02 crc kubenswrapper[4730]: E0320 15:41:02.533535    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:41:02 crc kubenswrapper[4730]: E0320 15:41:02.533703    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:41:02 crc kubenswrapper[4730]: E0320 15:41:02.533847    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:41:03 crc kubenswrapper[4730]: I0320 15:41:03.534087    4730 scope.go:117] "RemoveContainer" containerID="7695af707a9a2e3b80ba56e6a4b97e35b5df9886333ea9cbb7f3852243b61ad7"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.088755    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.089039    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.089049    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.089061    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.089070    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:04Z","lastTransitionTime":"2026-03-20T15:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.090423    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj97f_c4b4e0e8-af33-491e-b1d1-31079d90c656/ovnkube-controller/1.log"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.093103    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" event={"ID":"c4b4e0e8-af33-491e-b1d1-31079d90c656","Type":"ContainerStarted","Data":"03c5491305bb4c7cdbc1498dbb8cc5fe445c665c58d977511f19cb13251f7ef0"}
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.093555    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:41:04 crc kubenswrapper[4730]: E0320 15:41:04.103155    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:04Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.106458    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:04Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.106928    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.106956    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.106965    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.106983    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.106994    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:04Z","lastTransitionTime":"2026-03-20T15:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:04 crc kubenswrapper[4730]: E0320 15:41:04.118596    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:04Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.119127    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2070438cbc1bdd8e3c865f5aa62537649598e0f51effc9936b9cb1a630b3651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:04Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.122978    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.123013    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.123028    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.123044    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.123056    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:04Z","lastTransitionTime":"2026-03-20T15:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:04 crc kubenswrapper[4730]: E0320 15:41:04.137824    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:04Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.140446    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708c4e4c-a9ad-466a-a3d0-a99ae3399934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://631f7e03c5476d53c6d6a3a3bf6a1d45ac40443386f072fc7c01129e9eaf7619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d029581dfd248d1298c93e18a5352686b5e89fb730ede58a29400c62deb2bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a81d62ce9d72579cc4dd4778d22cf442919e08c7d83209fde12edc8a338684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b365c72e5e05c29295ba289a5bf795353c5221607dadd2ab8b4f642272d99fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfacdb41d11bf0b03bede308c267faddfd0241d99a4cbda2603c6539bc3a433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:04Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.141650    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.141694    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.141712    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.141734    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.141751    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:04Z","lastTransitionTime":"2026-03-20T15:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:04 crc kubenswrapper[4730]: E0320 15:41:04.154045    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:04Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.160449    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\
\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:04Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.161763    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.161795    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.161805    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.161818    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.161826    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:04Z","lastTransitionTime":"2026-03-20T15:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.182012    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1242122dff682493181de927aefeb2ac4e5274621d865ee89ccec2e60e86469b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8117d479604ca09fc23e043f373aa3acbffb209d9fbfae95f2eb907424712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:04Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:04 crc kubenswrapper[4730]: E0320 15:41:04.183527    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:04Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:04 crc kubenswrapper[4730]: E0320 15:41:04.183693    4730 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.196593    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:04Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.209661    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e20a286d6affcba7ffa950ef5386e7f439c9a02381cb8b7d3bc51ad9c4f343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd661cda796a2bd61d1446bee672c4471d60370245caf7cb54faf54dfa9c58a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:04Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.219600    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35fa36f52355ad1fd24bd332fc5b0a5ff35ac2be71dd16b53449d9f47e48e8ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:04Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.236356    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c34c742c6da6c6d35f815901234a0c12c9628d22fa83e511bddc78eae4373cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c
6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\
\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:04Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.248377    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:04Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.262663    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055b0251e049abda93d63985b27520b79b2596443df06427dd990fc974b52a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:04Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.273293    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:04Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.288037    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\
\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:04Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.297206    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:04Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.309722    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8a35dfa17acf7f7051d13b20cab55fe91645c8fa2773fed67baddae164b586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:04Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.321058    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:04Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.350184    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c5491305bb4c7cdbc1498dbb8cc5fe445c665c58d977511f19cb13251f7ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7695af707a9a2e3b80ba56e6a4b97e35b5df9886333ea9cbb7f3852243b61ad7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:40:50Z\\\",\\\"message\\\":\\\"0.980093    6718 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 15:40:50.980099    6718 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 15:40:50.980105    6718 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 15:40:50.980374    6718 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:40:50.980766    6718 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:40:50.981084    6718 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 15:40:50.981851    6718 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:40:50.982080    6718 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 15:40:50.982550    6718 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 15:40:50.982574    6718 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 15:40:50.982620    6718 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 15:40:50.982656    6718 factory.go:656] Stopping watch factory\\\\nI0320 15:40:50.982676    6718 ovnkube.go:599] Stopped ovnkube\\\\nI0320 
15\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:41:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:04Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.532474    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.532572    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.532598    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:41:04 crc kubenswrapper[4730]: I0320 15:41:04.532486    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:41:04 crc kubenswrapper[4730]: E0320 15:41:04.532885    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:41:04 crc kubenswrapper[4730]: E0320 15:41:04.532984    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:41:04 crc kubenswrapper[4730]: E0320 15:41:04.533071    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:41:04 crc kubenswrapper[4730]: E0320 15:41:04.533113    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:41:05 crc kubenswrapper[4730]: I0320 15:41:05.099087    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj97f_c4b4e0e8-af33-491e-b1d1-31079d90c656/ovnkube-controller/2.log"
Mar 20 15:41:05 crc kubenswrapper[4730]: I0320 15:41:05.100003    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj97f_c4b4e0e8-af33-491e-b1d1-31079d90c656/ovnkube-controller/1.log"
Mar 20 15:41:05 crc kubenswrapper[4730]: I0320 15:41:05.102955    4730 generic.go:334] "Generic (PLEG): container finished" podID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerID="03c5491305bb4c7cdbc1498dbb8cc5fe445c665c58d977511f19cb13251f7ef0" exitCode=1
Mar 20 15:41:05 crc kubenswrapper[4730]: I0320 15:41:05.103021    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" event={"ID":"c4b4e0e8-af33-491e-b1d1-31079d90c656","Type":"ContainerDied","Data":"03c5491305bb4c7cdbc1498dbb8cc5fe445c665c58d977511f19cb13251f7ef0"}
Mar 20 15:41:05 crc kubenswrapper[4730]: I0320 15:41:05.103082    4730 scope.go:117] "RemoveContainer" containerID="7695af707a9a2e3b80ba56e6a4b97e35b5df9886333ea9cbb7f3852243b61ad7"
Mar 20 15:41:05 crc kubenswrapper[4730]: I0320 15:41:05.104392    4730 scope.go:117] "RemoveContainer" containerID="03c5491305bb4c7cdbc1498dbb8cc5fe445c665c58d977511f19cb13251f7ef0"
Mar 20 15:41:05 crc kubenswrapper[4730]: E0320 15:41:05.104697    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qj97f_openshift-ovn-kubernetes(c4b4e0e8-af33-491e-b1d1-31079d90c656)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656"
Mar 20 15:41:05 crc kubenswrapper[4730]: I0320 15:41:05.124655    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:05Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:05 crc kubenswrapper[4730]: I0320 15:41:05.137494    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2070438cbc1bdd8e3c865f5aa62537649598e0f51effc9936b9cb1a630b3651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:05Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:05 crc kubenswrapper[4730]: I0320 15:41:05.152419    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1242122dff682493181de927aefeb2ac4e5274621d865ee89ccec2e60e86469b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8117d479604ca09fc23e043f373aa3acbffb209d9fbfae95f2eb907424712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:05Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:05 crc kubenswrapper[4730]: I0320 15:41:05.168739    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:05Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:05 crc kubenswrapper[4730]: I0320 15:41:05.180804    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e20a286d6affcba7ffa950ef5386e7f439c9a02381cb8b7d3bc51ad9c4f343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd661cda796a2bd61d1446bee672c4471d60370245caf7cb54faf54dfa9c58a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:05Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:05 crc kubenswrapper[4730]: I0320 15:41:05.191547    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35fa36f52355ad1fd24bd332fc5b0a5ff35ac2be71dd16b53449d9f47e48e8ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:05Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:05 crc kubenswrapper[4730]: I0320 15:41:05.212222    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c34c742c6da6c6d35f815901234a0c12c9628d22fa83e511bddc78eae4373cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c
6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\
\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:05Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:05 crc kubenswrapper[4730]: I0320 15:41:05.240870    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708c4e4c-a9ad-466a-a3d0-a99ae3399934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://631f7e03c5476d53c6d6a3a3bf6a1d45ac40443386f072fc7c01129e9eaf7619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"nam
e\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d029581dfd248d1298c93e18a5352686b5e89fb730ede58a29400c62deb2bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a81d62ce9d72579cc4dd4778d22cf442919e08c7d83209fde12edc8a338684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b365c72e5e05c29295ba289a5bf795353c5221607dadd2ab8b4f642272d99fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019
bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfacdb41d11bf0b03bede308c267faddfd0241d99a4cbda2603c6539bc3a433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:05Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:05 crc kubenswrapper[4730]: I0320 15:41:05.257920    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\
\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:05Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:05 crc kubenswrapper[4730]: I0320 15:41:05.272069    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055b0251e049abda93d63985b27520b79b2596443df06427dd990fc974b52a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:05Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:05 crc kubenswrapper[4730]: I0320 15:41:05.283528    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:05Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:05 crc kubenswrapper[4730]: I0320 15:41:05.300046    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\
\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:05Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:05 crc kubenswrapper[4730]: I0320 15:41:05.312382    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:05Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:05 crc kubenswrapper[4730]: I0320 15:41:05.324280    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:05Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:05 crc kubenswrapper[4730]: I0320 15:41:05.337289    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:05Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:05 crc kubenswrapper[4730]: I0320 15:41:05.356796    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c5491305bb4c7cdbc1498dbb8cc5fe445c665c58d977511f19cb13251f7ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7695af707a9a2e3b80ba56e6a4b97e35b5df9886333ea9cbb7f3852243b61ad7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:40:50Z\\\",\\\"message\\\":\\\"0.980093    6718 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 15:40:50.980099    6718 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 15:40:50.980105    6718 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 15:40:50.980374    6718 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:40:50.980766    6718 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:40:50.981084    6718 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 15:40:50.981851    6718 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 15:40:50.982080    6718 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 15:40:50.982550    6718 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 15:40:50.982574    6718 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 15:40:50.982620    6718 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 15:40:50.982656    6718 factory.go:656] Stopping watch factory\\\\nI0320 15:40:50.982676    6718 ovnkube.go:599] Stopped ovnkube\\\\nI0320 15\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c5491305bb4c7cdbc1498dbb8cc5fe445c665c58d977511f19cb13251f7ef0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"message\\\":\\\" LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}\\\\nI0320 15:41:04.407421    6925 services_controller.go:360] Finished syncing service dns-default on namespace openshift-dns for network=default : 4.127302ms\\\\nI0320 15:41:04.407620    6925 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with 
[]\\\\nI0320 15:41:04.407658    6925 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0320 15:41:04.407675    6925 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0320 15:41:04.407733    6925 factory.go:1336] Added *v1.Node event handler 7\\\\nI0320 15:41:04.407751    6925 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0320 15:41:04.408006    6925 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 15:41:04.408084    6925 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 15:41:04.408106    6925 ovnkube.go:599] Stopped ovnkube\\\\nI0320 15:41:04.408148    6925 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 15:41:04.408227    6925 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:41:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-b
in-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\
"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:05Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:05 crc kubenswrapper[4730]: I0320 15:41:05.368988    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8a35dfa17acf7f7051d13b20cab55fe91645c8fa2773fed67baddae164b586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:05Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:06 crc kubenswrapper[4730]: I0320 15:41:06.108788    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj97f_c4b4e0e8-af33-491e-b1d1-31079d90c656/ovnkube-controller/2.log"
Mar 20 15:41:06 crc kubenswrapper[4730]: I0320 15:41:06.112784    4730 scope.go:117] "RemoveContainer" containerID="03c5491305bb4c7cdbc1498dbb8cc5fe445c665c58d977511f19cb13251f7ef0"
Mar 20 15:41:06 crc kubenswrapper[4730]: E0320 15:41:06.112951    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qj97f_openshift-ovn-kubernetes(c4b4e0e8-af33-491e-b1d1-31079d90c656)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656"
Mar 20 15:41:06 crc kubenswrapper[4730]: I0320 15:41:06.127318    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8a35dfa17acf7f7051d13b20cab55fe91645c8fa2773fed67baddae164b586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:06Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:06 crc kubenswrapper[4730]: I0320 15:41:06.139730    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:06Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:06 crc kubenswrapper[4730]: I0320 15:41:06.164153    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c5491305bb4c7cdbc1498dbb8cc5fe445c665c58d977511f19cb13251f7ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c5491305bb4c7cdbc1498dbb8cc5fe445c665c58d977511f19cb13251f7ef0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"message\\\":\\\" LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}\\\\nI0320 15:41:04.407421    6925 services_controller.go:360] Finished syncing service dns-default on namespace openshift-dns for network=default : 4.127302ms\\\\nI0320 15:41:04.407620    
6925 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0320 15:41:04.407658    6925 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0320 15:41:04.407675    6925 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0320 15:41:04.407733    6925 factory.go:1336] Added *v1.Node event handler 7\\\\nI0320 15:41:04.407751    6925 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0320 15:41:04.408006    6925 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 15:41:04.408084    6925 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 15:41:04.408106    6925 ovnkube.go:599] Stopped ovnkube\\\\nI0320 15:41:04.408148    6925 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 15:41:04.408227    6925 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:41:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qj97f_openshift-ovn-kubernetes(c4b4e0e8-af33-491e-b1d1-31079d90c656)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063
074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:06Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:06 crc kubenswrapper[4730]: I0320 15:41:06.180496    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:06Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:06 crc kubenswrapper[4730]: I0320 15:41:06.194744    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2070438cbc1bdd8e3c865f5aa62537649598e0f51effc9936b9cb1a630b3651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:06Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:06 crc kubenswrapper[4730]: I0320 15:41:06.217598    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708c4e4c-a9ad-466a-a3d0-a99ae3399934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://631f7e03c5476d53c6d6a3a3bf6a1d45ac40443386f072fc7c01129e9eaf7619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"nam
e\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d029581dfd248d1298c93e18a5352686b5e89fb730ede58a29400c62deb2bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a81d62ce9d72579cc4dd4778d22cf442919e08c7d83209fde12edc8a338684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b365c72e5e05c29295ba289a5bf795353c5221607dadd2ab8b4f642272d99fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019
bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfacdb41d11bf0b03bede308c267faddfd0241d99a4cbda2603c6539bc3a433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:06Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:06 crc kubenswrapper[4730]: I0320 15:41:06.234374    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\
\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:06Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:06 crc kubenswrapper[4730]: I0320 15:41:06.248172    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1242122dff682493181de927aefeb2ac4e5274621d865ee89ccec2e60e86469b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8117d479604ca09fc23e043f373aa3acbffb209d9fbfae95f2eb907424712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:06Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:06 crc kubenswrapper[4730]: I0320 15:41:06.261361    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:06Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:06 crc kubenswrapper[4730]: I0320 15:41:06.273780    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e20a286d6affcba7ffa950ef5386e7f439c9a02381cb8b7d3bc51ad9c4f343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd661cda796a2bd61d1446bee672c4471d60370245caf7cb54faf54dfa9c58a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:06Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:06 crc kubenswrapper[4730]: I0320 15:41:06.285771    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35fa36f52355ad1fd24bd332fc5b0a5ff35ac2be71dd16b53449d9f47e48e8ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:06Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:06 crc kubenswrapper[4730]: I0320 15:41:06.302926    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c34c742c6da6c6d35f815901234a0c12c9628d22fa83e511bddc78eae4373cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c
6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\
\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:06Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:06 crc kubenswrapper[4730]: I0320 15:41:06.321100    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:06Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:06 crc kubenswrapper[4730]: I0320 15:41:06.337033    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055b0251e049abda93d63985b27520b79b2596443df06427dd990fc974b52a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:06Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:06 crc kubenswrapper[4730]: I0320 15:41:06.355191    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:06Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:06 crc kubenswrapper[4730]: I0320 15:41:06.372620    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\
\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:06Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:06 crc kubenswrapper[4730]: I0320 15:41:06.390721    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:06Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:06 crc kubenswrapper[4730]: I0320 15:41:06.532698    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:41:06 crc kubenswrapper[4730]: I0320 15:41:06.532844    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:41:06 crc kubenswrapper[4730]: E0320 15:41:06.532942    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:41:06 crc kubenswrapper[4730]: I0320 15:41:06.532967    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:41:06 crc kubenswrapper[4730]: E0320 15:41:06.533112    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:41:06 crc kubenswrapper[4730]: E0320 15:41:06.533291    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:41:06 crc kubenswrapper[4730]: I0320 15:41:06.533465    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:41:06 crc kubenswrapper[4730]: E0320 15:41:06.533601    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:41:06 crc kubenswrapper[4730]: I0320 15:41:06.552541    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"]
Mar 20 15:41:06 crc kubenswrapper[4730]: E0320 15:41:06.630619    4730 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 15:41:08 crc kubenswrapper[4730]: I0320 15:41:08.532288    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:41:08 crc kubenswrapper[4730]: I0320 15:41:08.532384    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:41:08 crc kubenswrapper[4730]: E0320 15:41:08.532414    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:41:08 crc kubenswrapper[4730]: I0320 15:41:08.532572    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:41:08 crc kubenswrapper[4730]: E0320 15:41:08.532571    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:41:08 crc kubenswrapper[4730]: E0320 15:41:08.532617    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:41:08 crc kubenswrapper[4730]: I0320 15:41:08.532991    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:41:08 crc kubenswrapper[4730]: E0320 15:41:08.533358    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:41:10 crc kubenswrapper[4730]: I0320 15:41:10.533029    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:41:10 crc kubenswrapper[4730]: I0320 15:41:10.533080    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:41:10 crc kubenswrapper[4730]: E0320 15:41:10.534281    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:41:10 crc kubenswrapper[4730]: E0320 15:41:10.534481    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:41:10 crc kubenswrapper[4730]: I0320 15:41:10.533134    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:41:10 crc kubenswrapper[4730]: I0320 15:41:10.533230    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:41:10 crc kubenswrapper[4730]: E0320 15:41:10.534848    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:41:10 crc kubenswrapper[4730]: E0320 15:41:10.534945    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:41:11 crc kubenswrapper[4730]: I0320 15:41:11.558514    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:11Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:11 crc kubenswrapper[4730]: I0320 15:41:11.591219    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c5491305bb4c7cdbc1498dbb8cc5fe445c665c58d977511f19cb13251f7ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c5491305bb4c7cdbc1498dbb8cc5fe445c665c58d977511f19cb13251f7ef0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"message\\\":\\\" LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}\\\\nI0320 15:41:04.407421    6925 services_controller.go:360] Finished syncing service dns-default on namespace openshift-dns for network=default : 4.127302ms\\\\nI0320 15:41:04.407620    
6925 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0320 15:41:04.407658    6925 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0320 15:41:04.407675    6925 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0320 15:41:04.407733    6925 factory.go:1336] Added *v1.Node event handler 7\\\\nI0320 15:41:04.407751    6925 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0320 15:41:04.408006    6925 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 15:41:04.408084    6925 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 15:41:04.408106    6925 ovnkube.go:599] Stopped ovnkube\\\\nI0320 15:41:04.408148    6925 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 15:41:04.408227    6925 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:41:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qj97f_openshift-ovn-kubernetes(c4b4e0e8-af33-491e-b1d1-31079d90c656)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063
074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:11Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:11 crc kubenswrapper[4730]: I0320 15:41:11.611907    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eed0424-45fd-4b1e-8b59-d041af7fb08f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620070760ce503ee2102ce0880913637feb032124892ce1a1e2060939f38e050\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b34522460ebd4556ce4291e5c5132788387cf45b0be3b9535af9262948b71ac\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 15:39:03.602145       1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 15:39:03.606021       1 observer_polling.go:159] Starting file observer\\\\nI0320 15:39:03.640071       1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 15:39:03.644835       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 15:39:27.437935       1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 15:39:27.438079       1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:27Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aee2dcf43ecf6df4a1615aa6e468921053ccb529d3c6dbc2c2ad641e264e606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99093fe46696a888b221d24d1b42226d0ff16bab6b3fb2a718d055cf97066a69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://899dbd6715433cfe5141851019e164daea952552c26706648245fd6319168685\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:11Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:11 crc kubenswrapper[4730]: E0320 15:41:11.632741    4730 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 15:41:11 crc kubenswrapper[4730]: I0320 15:41:11.637524    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8a35dfa17acf7f7051d13b20cab55fe91645c8fa2773fed67baddae164b586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:11Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:11 crc kubenswrapper[4730]: I0320 15:41:11.657438    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:11Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:11 crc kubenswrapper[4730]: I0320 15:41:11.674819    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2070438cbc1bdd8e3c865f5aa62537649598e0f51effc9936b9cb1a630b3651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:11Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:11 crc kubenswrapper[4730]: I0320 15:41:11.688727    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1242122dff682493181de927aefeb2ac4e5274621d865ee89ccec2e60e86469b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8117d479604ca09fc23e043f373aa3acbffb209d9fbfae95f2eb907424712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:11Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:11 crc kubenswrapper[4730]: I0320 15:41:11.704104    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:11Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:11 crc kubenswrapper[4730]: I0320 15:41:11.718124    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e20a286d6affcba7ffa950ef5386e7f439c9a02381cb8b7d3bc51ad9c4f343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd661cda796a2bd61d1446bee672c4471d60370245caf7cb54faf54dfa9c58a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:11Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:11 crc kubenswrapper[4730]: I0320 15:41:11.731433    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35fa36f52355ad1fd24bd332fc5b0a5ff35ac2be71dd16b53449d9f47e48e8ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:11Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:11 crc kubenswrapper[4730]: I0320 15:41:11.749296    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c34c742c6da6c6d35f815901234a0c12c9628d22fa83e511bddc78eae4373cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c
6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\
\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:11Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:11 crc kubenswrapper[4730]: I0320 15:41:11.772092    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708c4e4c-a9ad-466a-a3d0-a99ae3399934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://631f7e03c5476d53c6d6a3a3bf6a1d45ac40443386f072fc7c01129e9eaf7619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"nam
e\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d029581dfd248d1298c93e18a5352686b5e89fb730ede58a29400c62deb2bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a81d62ce9d72579cc4dd4778d22cf442919e08c7d83209fde12edc8a338684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b365c72e5e05c29295ba289a5bf795353c5221607dadd2ab8b4f642272d99fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019
bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfacdb41d11bf0b03bede308c267faddfd0241d99a4cbda2603c6539bc3a433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:11Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:11 crc kubenswrapper[4730]: I0320 15:41:11.786527    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\
\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:11Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:11 crc kubenswrapper[4730]: I0320 15:41:11.797820    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055b0251e049abda93d63985b27520b79b2596443df06427dd990fc974b52a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:11Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:11 crc kubenswrapper[4730]: I0320 15:41:11.808292    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:11Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:11 crc kubenswrapper[4730]: I0320 15:41:11.820449    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\
\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:11Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:11 crc kubenswrapper[4730]: I0320 15:41:11.834212    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:11Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:11 crc kubenswrapper[4730]: I0320 15:41:11.846098    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:11Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:12 crc kubenswrapper[4730]: I0320 15:41:12.532703    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:41:12 crc kubenswrapper[4730]: I0320 15:41:12.532761    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:41:12 crc kubenswrapper[4730]: E0320 15:41:12.533438    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:41:12 crc kubenswrapper[4730]: I0320 15:41:12.532836    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:41:12 crc kubenswrapper[4730]: E0320 15:41:12.533557    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:41:12 crc kubenswrapper[4730]: I0320 15:41:12.532820    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:41:12 crc kubenswrapper[4730]: E0320 15:41:12.533208    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:41:12 crc kubenswrapper[4730]: E0320 15:41:12.533660    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:41:14 crc kubenswrapper[4730]: I0320 15:41:14.219218    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:14 crc kubenswrapper[4730]: I0320 15:41:14.219304    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:14 crc kubenswrapper[4730]: I0320 15:41:14.219325    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:14 crc kubenswrapper[4730]: I0320 15:41:14.219348    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:14 crc kubenswrapper[4730]: I0320 15:41:14.219365    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:14Z","lastTransitionTime":"2026-03-20T15:41:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:14 crc kubenswrapper[4730]: E0320 15:41:14.240672    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:14Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:14 crc kubenswrapper[4730]: I0320 15:41:14.246050    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:14 crc kubenswrapper[4730]: I0320 15:41:14.246096    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:14 crc kubenswrapper[4730]: I0320 15:41:14.246112    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:14 crc kubenswrapper[4730]: I0320 15:41:14.246135    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:14 crc kubenswrapper[4730]: I0320 15:41:14.246149    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:14Z","lastTransitionTime":"2026-03-20T15:41:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:14 crc kubenswrapper[4730]: E0320 15:41:14.265788    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:14Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:14 crc kubenswrapper[4730]: I0320 15:41:14.272771    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:14 crc kubenswrapper[4730]: I0320 15:41:14.272823    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:14 crc kubenswrapper[4730]: I0320 15:41:14.272841    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:14 crc kubenswrapper[4730]: I0320 15:41:14.272864    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:14 crc kubenswrapper[4730]: I0320 15:41:14.272880    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:14Z","lastTransitionTime":"2026-03-20T15:41:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:14 crc kubenswrapper[4730]: E0320 15:41:14.292891    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:14Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:14 crc kubenswrapper[4730]: I0320 15:41:14.298550    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:14 crc kubenswrapper[4730]: I0320 15:41:14.298650    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:14 crc kubenswrapper[4730]: I0320 15:41:14.298671    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:14 crc kubenswrapper[4730]: I0320 15:41:14.298697    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:14 crc kubenswrapper[4730]: I0320 15:41:14.298715    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:14Z","lastTransitionTime":"2026-03-20T15:41:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:14 crc kubenswrapper[4730]: E0320 15:41:14.316983    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:14Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:14 crc kubenswrapper[4730]: I0320 15:41:14.321668    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:14 crc kubenswrapper[4730]: I0320 15:41:14.321725    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:14 crc kubenswrapper[4730]: I0320 15:41:14.321746    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:14 crc kubenswrapper[4730]: I0320 15:41:14.321772    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:14 crc kubenswrapper[4730]: I0320 15:41:14.321791    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:14Z","lastTransitionTime":"2026-03-20T15:41:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:14 crc kubenswrapper[4730]: E0320 15:41:14.344810    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:14Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:14 crc kubenswrapper[4730]: E0320 15:41:14.345042    4730 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 20 15:41:14 crc kubenswrapper[4730]: I0320 15:41:14.532497    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:41:14 crc kubenswrapper[4730]: I0320 15:41:14.532592    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:41:14 crc kubenswrapper[4730]: I0320 15:41:14.532513    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:41:14 crc kubenswrapper[4730]: I0320 15:41:14.532638    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:41:14 crc kubenswrapper[4730]: E0320 15:41:14.532719    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:41:14 crc kubenswrapper[4730]: E0320 15:41:14.532868    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:41:14 crc kubenswrapper[4730]: E0320 15:41:14.532946    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:41:14 crc kubenswrapper[4730]: E0320 15:41:14.533090    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:41:15 crc kubenswrapper[4730]: I0320 15:41:15.546601    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Mar 20 15:41:16 crc kubenswrapper[4730]: I0320 15:41:16.532124    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:41:16 crc kubenswrapper[4730]: I0320 15:41:16.532179    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:41:16 crc kubenswrapper[4730]: I0320 15:41:16.532229    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:41:16 crc kubenswrapper[4730]: I0320 15:41:16.532124    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:41:16 crc kubenswrapper[4730]: E0320 15:41:16.532383    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:41:16 crc kubenswrapper[4730]: E0320 15:41:16.532527    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:41:16 crc kubenswrapper[4730]: E0320 15:41:16.532652    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:41:16 crc kubenswrapper[4730]: E0320 15:41:16.532789    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:41:16 crc kubenswrapper[4730]: I0320 15:41:16.538814    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:41:16 crc kubenswrapper[4730]: I0320 15:41:16.538972    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:41:16 crc kubenswrapper[4730]: E0320 15:41:16.539015    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:20.538983294 +0000 UTC m=+199.752354703 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:41:16 crc kubenswrapper[4730]: I0320 15:41:16.539131    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:41:16 crc kubenswrapper[4730]: E0320 15:41:16.539146    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 15:41:16 crc kubenswrapper[4730]: E0320 15:41:16.539239    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 15:41:16 crc kubenswrapper[4730]: I0320 15:41:16.539210    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:41:16 crc kubenswrapper[4730]: E0320 15:41:16.539318    4730 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:41:16 crc kubenswrapper[4730]: E0320 15:41:16.539369    4730 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 15:41:16 crc kubenswrapper[4730]: E0320 15:41:16.539195    4730 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 15:41:16 crc kubenswrapper[4730]: I0320 15:41:16.539389    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:41:16 crc kubenswrapper[4730]: E0320 15:41:16.539445    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:42:20.539417447 +0000 UTC m=+199.752788846 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 15:41:16 crc kubenswrapper[4730]: E0320 15:41:16.539479    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:42:20.539460808 +0000 UTC m=+199.752832267 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 15:41:16 crc kubenswrapper[4730]: E0320 15:41:16.539521    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 15:42:20.53950924 +0000 UTC m=+199.752880649 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:41:16 crc kubenswrapper[4730]: E0320 15:41:16.539521    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 15:41:16 crc kubenswrapper[4730]: E0320 15:41:16.539583    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 15:41:16 crc kubenswrapper[4730]: E0320 15:41:16.539597    4730 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:41:16 crc kubenswrapper[4730]: E0320 15:41:16.539658    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 15:42:20.539646254 +0000 UTC m=+199.753017663 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:41:16 crc kubenswrapper[4730]: E0320 15:41:16.634648    4730 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 15:41:16 crc kubenswrapper[4730]: I0320 15:41:16.741211    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs\") pod \"network-metrics-daemon-2prfn\" (UID: \"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\") " pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:41:16 crc kubenswrapper[4730]: E0320 15:41:16.741415    4730 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 15:41:16 crc kubenswrapper[4730]: E0320 15:41:16.741509    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs podName:db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a nodeName:}" failed. No retries permitted until 2026-03-20 15:42:20.741485562 +0000 UTC m=+199.954856961 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs") pod "network-metrics-daemon-2prfn" (UID: "db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 15:41:18 crc kubenswrapper[4730]: I0320 15:41:18.532955    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:41:18 crc kubenswrapper[4730]: I0320 15:41:18.533053    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:41:18 crc kubenswrapper[4730]: E0320 15:41:18.533083    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:41:18 crc kubenswrapper[4730]: I0320 15:41:18.533117    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:41:18 crc kubenswrapper[4730]: I0320 15:41:18.533165    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:41:18 crc kubenswrapper[4730]: E0320 15:41:18.533296    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:41:18 crc kubenswrapper[4730]: E0320 15:41:18.533441    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:41:18 crc kubenswrapper[4730]: E0320 15:41:18.533474    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:41:19 crc kubenswrapper[4730]: I0320 15:41:19.533833    4730 scope.go:117] "RemoveContainer" containerID="03c5491305bb4c7cdbc1498dbb8cc5fe445c665c58d977511f19cb13251f7ef0"
Mar 20 15:41:19 crc kubenswrapper[4730]: E0320 15:41:19.534064    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qj97f_openshift-ovn-kubernetes(c4b4e0e8-af33-491e-b1d1-31079d90c656)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656"
Mar 20 15:41:20 crc kubenswrapper[4730]: I0320 15:41:20.532953    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:41:20 crc kubenswrapper[4730]: I0320 15:41:20.533032    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:41:20 crc kubenswrapper[4730]: E0320 15:41:20.533104    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:41:20 crc kubenswrapper[4730]: I0320 15:41:20.533050    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:41:20 crc kubenswrapper[4730]: E0320 15:41:20.533272    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:41:20 crc kubenswrapper[4730]: I0320 15:41:20.533344    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:41:20 crc kubenswrapper[4730]: E0320 15:41:20.533492    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:41:20 crc kubenswrapper[4730]: E0320 15:41:20.533614    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:41:21 crc kubenswrapper[4730]: I0320 15:41:21.553177    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caf1c50b-d896-46ac-8c1c-2368a862eb88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a2006b33d30cf2fd57843f3df0fb087253dd116f48a4d807c31260ce7508b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://badcec1b25a9d088fe7e563366ee7568adcabfe9c29a536db19fe3119b10f229
\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f50a49e995c2647a19bd3dedd3ca85f1d7d0279df106c153af39641af9ea83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e8cd87f56c4a70c698163de2d3f364420981943d389a3cc9b64401bb5fbf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8cd87f56c4a70c698163de2d3f364420981943d389a3cc9b64401bb5fbf08e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:21Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:21 crc kubenswrapper[4730]: I0320 15:41:21.571824    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:21Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:21 crc kubenswrapper[4730]: I0320 15:41:21.628343    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055b0251e049abda93d63985b27520b79b2596443df06427dd990fc974b52a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:21Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:21 crc kubenswrapper[4730]: E0320 15:41:21.635996    4730 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 15:41:21 crc kubenswrapper[4730]: I0320 15:41:21.659194    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:21Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:21 crc kubenswrapper[4730]: I0320 15:41:21.675650    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\
\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:21Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:21 crc kubenswrapper[4730]: I0320 15:41:21.688582    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:21Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:21 crc kubenswrapper[4730]: I0320 15:41:21.701467    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eed0424-45fd-4b1e-8b59-d041af7fb08f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620070760ce503ee2102ce0880913637feb032124892ce1a1e2060939f38e050\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b34522460ebd4556ce4291e5c5132788387cf45b0be3b9535af9262948b71ac\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 15:39:03.602145       1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 15:39:03.606021       1 observer_polling.go:159] Starting file observer\\\\nI0320 15:39:03.640071       1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 15:39:03.644835       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 15:39:27.437935       1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 15:39:27.438079       1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:27Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aee2dcf43ecf6df4a1615aa6e468921053ccb529d3c6dbc2c2ad641e264e606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99093fe46696a888b221d24d1b42226d0ff16bab6b3fb2a718d055cf97066a69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://899dbd6715433cfe5141851019e164daea952552c26706648245fd6319168685\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:21Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:21 crc kubenswrapper[4730]: I0320 15:41:21.714500    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8a35dfa17acf7f7051d13b20cab55fe91645c8fa2773fed67baddae164b586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:21Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:21 crc kubenswrapper[4730]: I0320 15:41:21.731161    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:21Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:21 crc kubenswrapper[4730]: I0320 15:41:21.751923    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c5491305bb4c7cdbc1498dbb8cc5fe445c665c58d977511f19cb13251f7ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c5491305bb4c7cdbc1498dbb8cc5fe445c665c58d977511f19cb13251f7ef0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"message\\\":\\\" LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}\\\\nI0320 15:41:04.407421    6925 services_controller.go:360] Finished syncing service dns-default on namespace openshift-dns for network=default : 4.127302ms\\\\nI0320 15:41:04.407620    
6925 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0320 15:41:04.407658    6925 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0320 15:41:04.407675    6925 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0320 15:41:04.407733    6925 factory.go:1336] Added *v1.Node event handler 7\\\\nI0320 15:41:04.407751    6925 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0320 15:41:04.408006    6925 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 15:41:04.408084    6925 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 15:41:04.408106    6925 ovnkube.go:599] Stopped ovnkube\\\\nI0320 15:41:04.408148    6925 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 15:41:04.408227    6925 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:41:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qj97f_openshift-ovn-kubernetes(c4b4e0e8-af33-491e-b1d1-31079d90c656)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063
074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:21Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:21 crc kubenswrapper[4730]: I0320 15:41:21.765817    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:21Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:21 crc kubenswrapper[4730]: I0320 15:41:21.784745    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2070438cbc1bdd8e3c865f5aa62537649598e0f51effc9936b9cb1a630b3651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:21Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:21 crc kubenswrapper[4730]: I0320 15:41:21.801239    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c34c742c6da6c6d35f815901234a0c12c9628d22fa83e511bddc78eae4373cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c
6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\
\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:21Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:21 crc kubenswrapper[4730]: I0320 15:41:21.827644    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708c4e4c-a9ad-466a-a3d0-a99ae3399934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://631f7e03c5476d53c6d6a3a3bf6a1d45ac40443386f072fc7c01129e9eaf7619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"nam
e\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d029581dfd248d1298c93e18a5352686b5e89fb730ede58a29400c62deb2bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a81d62ce9d72579cc4dd4778d22cf442919e08c7d83209fde12edc8a338684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b365c72e5e05c29295ba289a5bf795353c5221607dadd2ab8b4f642272d99fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019
bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfacdb41d11bf0b03bede308c267faddfd0241d99a4cbda2603c6539bc3a433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:21Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:21 crc kubenswrapper[4730]: I0320 15:41:21.842545    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\
\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:21Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:21 crc kubenswrapper[4730]: I0320 15:41:21.856099    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1242122dff682493181de927aefeb2ac4e5274621d865ee89ccec2e60e86469b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8117d479604ca09fc23e043f373aa3acbffb209d9fbfae95f2eb907424712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:21Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:21 crc kubenswrapper[4730]: I0320 15:41:21.875079    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:21Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:21 crc kubenswrapper[4730]: I0320 15:41:21.887551    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e20a286d6affcba7ffa950ef5386e7f439c9a02381cb8b7d3bc51ad9c4f343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd661cda796a2bd61d1446bee672c4471d60370245caf7cb54faf54dfa9c58a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:21Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:21 crc kubenswrapper[4730]: I0320 15:41:21.898769    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35fa36f52355ad1fd24bd332fc5b0a5ff35ac2be71dd16b53449d9f47e48e8ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:21Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:22 crc kubenswrapper[4730]: I0320 15:41:22.532227    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:41:22 crc kubenswrapper[4730]: I0320 15:41:22.532308    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:41:22 crc kubenswrapper[4730]: E0320 15:41:22.532931    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:41:22 crc kubenswrapper[4730]: E0320 15:41:22.533146    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:41:22 crc kubenswrapper[4730]: I0320 15:41:22.532330    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:41:22 crc kubenswrapper[4730]: E0320 15:41:22.533356    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:41:22 crc kubenswrapper[4730]: I0320 15:41:22.532390    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:41:22 crc kubenswrapper[4730]: E0320 15:41:22.533576    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:41:24 crc kubenswrapper[4730]: I0320 15:41:24.392354    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:24 crc kubenswrapper[4730]: I0320 15:41:24.392400    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:24 crc kubenswrapper[4730]: I0320 15:41:24.392412    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:24 crc kubenswrapper[4730]: I0320 15:41:24.392426    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:24 crc kubenswrapper[4730]: I0320 15:41:24.392437    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:24Z","lastTransitionTime":"2026-03-20T15:41:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:24 crc kubenswrapper[4730]: E0320 15:41:24.412693    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:24Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:24 crc kubenswrapper[4730]: I0320 15:41:24.417772    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:24 crc kubenswrapper[4730]: I0320 15:41:24.417804    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:24 crc kubenswrapper[4730]: I0320 15:41:24.417813    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:24 crc kubenswrapper[4730]: I0320 15:41:24.417826    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:24 crc kubenswrapper[4730]: I0320 15:41:24.417838    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:24Z","lastTransitionTime":"2026-03-20T15:41:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:24 crc kubenswrapper[4730]: E0320 15:41:24.432333    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:24Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:24 crc kubenswrapper[4730]: I0320 15:41:24.436590    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:24 crc kubenswrapper[4730]: I0320 15:41:24.436632    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:24 crc kubenswrapper[4730]: I0320 15:41:24.436640    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:24 crc kubenswrapper[4730]: I0320 15:41:24.436654    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:24 crc kubenswrapper[4730]: I0320 15:41:24.436663    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:24Z","lastTransitionTime":"2026-03-20T15:41:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:24 crc kubenswrapper[4730]: E0320 15:41:24.455768    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:24Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:24 crc kubenswrapper[4730]: I0320 15:41:24.459723    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:24 crc kubenswrapper[4730]: I0320 15:41:24.459772    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:24 crc kubenswrapper[4730]: I0320 15:41:24.459785    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:24 crc kubenswrapper[4730]: I0320 15:41:24.459800    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:24 crc kubenswrapper[4730]: I0320 15:41:24.459811    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:24Z","lastTransitionTime":"2026-03-20T15:41:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:24 crc kubenswrapper[4730]: E0320 15:41:24.481801    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:24Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:24 crc kubenswrapper[4730]: I0320 15:41:24.487213    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:24 crc kubenswrapper[4730]: I0320 15:41:24.487268    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:24 crc kubenswrapper[4730]: I0320 15:41:24.487280    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:24 crc kubenswrapper[4730]: I0320 15:41:24.487295    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:24 crc kubenswrapper[4730]: I0320 15:41:24.487307    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:24Z","lastTransitionTime":"2026-03-20T15:41:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:24 crc kubenswrapper[4730]: E0320 15:41:24.505882    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:24Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:24 crc kubenswrapper[4730]: E0320 15:41:24.506588    4730 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 20 15:41:24 crc kubenswrapper[4730]: I0320 15:41:24.532268    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:41:24 crc kubenswrapper[4730]: I0320 15:41:24.532322    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:41:24 crc kubenswrapper[4730]: I0320 15:41:24.532277    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:41:24 crc kubenswrapper[4730]: E0320 15:41:24.532436    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:41:24 crc kubenswrapper[4730]: E0320 15:41:24.532546    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:41:24 crc kubenswrapper[4730]: E0320 15:41:24.532709    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:41:24 crc kubenswrapper[4730]: I0320 15:41:24.532805    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:41:24 crc kubenswrapper[4730]: E0320 15:41:24.533323    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:41:26 crc kubenswrapper[4730]: I0320 15:41:26.532023    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:41:26 crc kubenswrapper[4730]: I0320 15:41:26.532029    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:41:26 crc kubenswrapper[4730]: I0320 15:41:26.532028    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:41:26 crc kubenswrapper[4730]: E0320 15:41:26.532221    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:41:26 crc kubenswrapper[4730]: E0320 15:41:26.532413    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:41:26 crc kubenswrapper[4730]: I0320 15:41:26.532539    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:41:26 crc kubenswrapper[4730]: E0320 15:41:26.532645    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:41:26 crc kubenswrapper[4730]: E0320 15:41:26.532750    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:41:26 crc kubenswrapper[4730]: E0320 15:41:26.637897    4730 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 15:41:27 crc kubenswrapper[4730]: I0320 15:41:27.187093    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6r2kn_6f97b1f1-1fad-44ec-8253-17dd6a5eee54/kube-multus/0.log"
Mar 20 15:41:27 crc kubenswrapper[4730]: I0320 15:41:27.187173    4730 generic.go:334] "Generic (PLEG): container finished" podID="6f97b1f1-1fad-44ec-8253-17dd6a5eee54" containerID="f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6" exitCode=1
Mar 20 15:41:27 crc kubenswrapper[4730]: I0320 15:41:27.187217    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6r2kn" event={"ID":"6f97b1f1-1fad-44ec-8253-17dd6a5eee54","Type":"ContainerDied","Data":"f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6"}
Mar 20 15:41:27 crc kubenswrapper[4730]: I0320 15:41:27.187839    4730 scope.go:117] "RemoveContainer" containerID="f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6"
Mar 20 15:41:27 crc kubenswrapper[4730]: I0320 15:41:27.204499    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caf1c50b-d896-46ac-8c1c-2368a862eb88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a2006b33d30cf2fd57843f3df0fb087253dd116f48a4d807c31260ce7508b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://badcec1b25a9d088fe7e563366ee7568adcabfe9c29a536db19fe3119b10f229
\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f50a49e995c2647a19bd3dedd3ca85f1d7d0279df106c153af39641af9ea83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e8cd87f56c4a70c698163de2d3f364420981943d389a3cc9b64401bb5fbf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8cd87f56c4a70c698163de2d3f364420981943d389a3cc9b64401bb5fbf08e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:27Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:27 crc kubenswrapper[4730]: I0320 15:41:27.220740    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:27Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:27 crc kubenswrapper[4730]: I0320 15:41:27.234870    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055b0251e049abda93d63985b27520b79b2596443df06427dd990fc974b52a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:27Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:27 crc kubenswrapper[4730]: I0320 15:41:27.253013    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:27Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:27 crc kubenswrapper[4730]: I0320 15:41:27.269109    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:41:27Z\\\",\\\"message\\\":\\\"2026-03-20T15:40:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b4f3207-7d26-40ce-9cb0-76c6f07e26b9\\\\n2026-03-20T15:40:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5b4f3207-7d26-40ce-9cb0-76c6f07e26b9 to /host/opt/cni/bin/\\\\n2026-03-20T15:40:42Z [verbose] multus-daemon started\\\\n2026-03-20T15:40:42Z [verbose] Readiness Indicator file check\\\\n2026-03-20T15:41:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:27Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:27 crc kubenswrapper[4730]: I0320 15:41:27.282279    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:27Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:27 crc kubenswrapper[4730]: I0320 15:41:27.296975    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eed0424-45fd-4b1e-8b59-d041af7fb08f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620070760ce503ee2102ce0880913637feb032124892ce1a1e2060939f38e050\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b34522460ebd4556ce4291e5c5132788387cf45b0be3b9535af9262948b71ac\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 15:39:03.602145       1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 15:39:03.606021       1 observer_polling.go:159] Starting file observer\\\\nI0320 15:39:03.640071       1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 15:39:03.644835       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 15:39:27.437935       1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 15:39:27.438079       1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:27Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aee2dcf43ecf6df4a1615aa6e468921053ccb529d3c6dbc2c2ad641e264e606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99093fe46696a888b221d24d1b42226d0ff16bab6b3fb2a718d055cf97066a69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://899dbd6715433cfe5141851019e164daea952552c26706648245fd6319168685\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:27Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:27 crc kubenswrapper[4730]: I0320 15:41:27.314573    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8a35dfa17acf7f7051d13b20cab55fe91645c8fa2773fed67baddae164b586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:27Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:27 crc kubenswrapper[4730]: I0320 15:41:27.328439    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:27Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:27 crc kubenswrapper[4730]: I0320 15:41:27.349540    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c5491305bb4c7cdbc1498dbb8cc5fe445c665c58d977511f19cb13251f7ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c5491305bb4c7cdbc1498dbb8cc5fe445c665c58d977511f19cb13251f7ef0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"message\\\":\\\" LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}\\\\nI0320 15:41:04.407421    6925 services_controller.go:360] Finished syncing service dns-default on namespace openshift-dns for network=default : 4.127302ms\\\\nI0320 15:41:04.407620    
6925 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0320 15:41:04.407658    6925 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0320 15:41:04.407675    6925 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0320 15:41:04.407733    6925 factory.go:1336] Added *v1.Node event handler 7\\\\nI0320 15:41:04.407751    6925 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0320 15:41:04.408006    6925 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 15:41:04.408084    6925 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 15:41:04.408106    6925 ovnkube.go:599] Stopped ovnkube\\\\nI0320 15:41:04.408148    6925 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 15:41:04.408227    6925 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:41:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qj97f_openshift-ovn-kubernetes(c4b4e0e8-af33-491e-b1d1-31079d90c656)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063
074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:27Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:27 crc kubenswrapper[4730]: I0320 15:41:27.363663    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:27Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:27 crc kubenswrapper[4730]: I0320 15:41:27.375021    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2070438cbc1bdd8e3c865f5aa62537649598e0f51effc9936b9cb1a630b3651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:27Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:27 crc kubenswrapper[4730]: I0320 15:41:27.393596    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708c4e4c-a9ad-466a-a3d0-a99ae3399934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://631f7e03c5476d53c6d6a3a3bf6a1d45ac40443386f072fc7c01129e9eaf7619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"nam
e\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d029581dfd248d1298c93e18a5352686b5e89fb730ede58a29400c62deb2bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a81d62ce9d72579cc4dd4778d22cf442919e08c7d83209fde12edc8a338684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b365c72e5e05c29295ba289a5bf795353c5221607dadd2ab8b4f642272d99fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019
bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfacdb41d11bf0b03bede308c267faddfd0241d99a4cbda2603c6539bc3a433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:27Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:27 crc kubenswrapper[4730]: I0320 15:41:27.407849    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\
\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:27Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:27 crc kubenswrapper[4730]: I0320 15:41:27.417789    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1242122dff682493181de927aefeb2ac4e5274621d865ee89ccec2e60e86469b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8117d479604ca09fc23e043f373aa3acbffb209d9fbfae95f2eb907424712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:27Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:27 crc kubenswrapper[4730]: I0320 15:41:27.429723    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:27Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:27 crc kubenswrapper[4730]: I0320 15:41:27.441116    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e20a286d6affcba7ffa950ef5386e7f439c9a02381cb8b7d3bc51ad9c4f343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd661cda796a2bd61d1446bee672c4471d60370245caf7cb54faf54dfa9c58a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:27Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:27 crc kubenswrapper[4730]: I0320 15:41:27.449532    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35fa36f52355ad1fd24bd332fc5b0a5ff35ac2be71dd16b53449d9f47e48e8ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:27Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:27 crc kubenswrapper[4730]: I0320 15:41:27.461039    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c34c742c6da6c6d35f815901234a0c12c9628d22fa83e511bddc78eae4373cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c
6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\
\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:27Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:28 crc kubenswrapper[4730]: I0320 15:41:28.192372    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6r2kn_6f97b1f1-1fad-44ec-8253-17dd6a5eee54/kube-multus/0.log"
Mar 20 15:41:28 crc kubenswrapper[4730]: I0320 15:41:28.192446    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6r2kn" event={"ID":"6f97b1f1-1fad-44ec-8253-17dd6a5eee54","Type":"ContainerStarted","Data":"12ba423ea0fecce8b2416cc8f75f3323980aae80a20ff26bd2f9a6c4cd464812"}
Mar 20 15:41:28 crc kubenswrapper[4730]: I0320 15:41:28.205558    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:28Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:28 crc kubenswrapper[4730]: I0320 15:41:28.223203    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caf1c50b-d896-46ac-8c1c-2368a862eb88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a2006b33d30cf2fd57843f3df0fb087253dd116f48a4d807c31260ce7508b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://badcec1b25a9d088fe7e563366ee7568adcabfe9c29a536db19fe3119b10f229
\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f50a49e995c2647a19bd3dedd3ca85f1d7d0279df106c153af39641af9ea83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e8cd87f56c4a70c698163de2d3f364420981943d389a3cc9b64401bb5fbf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8cd87f56c4a70c698163de2d3f364420981943d389a3cc9b64401bb5fbf08e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:28Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:28 crc kubenswrapper[4730]: I0320 15:41:28.238910    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:28Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:28 crc kubenswrapper[4730]: I0320 15:41:28.257307    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055b0251e049abda93d63985b27520b79b2596443df06427dd990fc974b52a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:28Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:28 crc kubenswrapper[4730]: I0320 15:41:28.274088    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:28Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:28 crc kubenswrapper[4730]: I0320 15:41:28.290820    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12ba423ea0fecce8b2416cc8f75f3323980aae80a20ff26bd2f9a6c4cd464812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:41:27Z\\\",\\\"message\\\":\\\"2026-03-20T15:40:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b4f3207-7d26-40ce-9cb0-76c6f07e26b9\\\\n2026-03-20T15:40:41+00:00 [cnibincopy] Successfully moved 
files in /host/opt/cni/bin/upgrade_5b4f3207-7d26-40ce-9cb0-76c6f07e26b9 to /host/opt/cni/bin/\\\\n2026-03-20T15:40:42Z [verbose] multus-daemon started\\\\n2026-03-20T15:40:42Z [verbose] Readiness Indicator file check\\\\n2026-03-20T15:41:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:28Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:28 crc kubenswrapper[4730]: I0320 15:41:28.309440    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eed0424-45fd-4b1e-8b59-d041af7fb08f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620070760ce503ee2102ce0880913637feb032124892ce1a1e2060939f38e050\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b34522460ebd4556ce4291e5c5132788387cf45b0be3b9535af9262948b71ac\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 15:39:03.602145       1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 15:39:03.606021       1 observer_polling.go:159] Starting file observer\\\\nI0320 15:39:03.640071       1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 15:39:03.644835       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 15:39:27.437935       1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 15:39:27.438079       1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:27Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aee2dcf43ecf6df4a1615aa6e468921053ccb529d3c6dbc2c2ad641e264e606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99093fe46696a888b221d24d1b42226d0ff16bab6b3fb2a718d055cf97066a69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://899dbd6715433cfe5141851019e164daea952552c26706648245fd6319168685\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:28Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:28 crc kubenswrapper[4730]: I0320 15:41:28.322922    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8a35dfa17acf7f7051d13b20cab55fe91645c8fa2773fed67baddae164b586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:28Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:28 crc kubenswrapper[4730]: I0320 15:41:28.335328    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:28Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:28 crc kubenswrapper[4730]: I0320 15:41:28.355767    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c5491305bb4c7cdbc1498dbb8cc5fe445c665c58d977511f19cb13251f7ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c5491305bb4c7cdbc1498dbb8cc5fe445c665c58d977511f19cb13251f7ef0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"message\\\":\\\" LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}\\\\nI0320 15:41:04.407421    6925 services_controller.go:360] Finished syncing service dns-default on namespace openshift-dns for network=default : 4.127302ms\\\\nI0320 15:41:04.407620    
6925 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0320 15:41:04.407658    6925 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0320 15:41:04.407675    6925 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0320 15:41:04.407733    6925 factory.go:1336] Added *v1.Node event handler 7\\\\nI0320 15:41:04.407751    6925 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0320 15:41:04.408006    6925 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 15:41:04.408084    6925 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 15:41:04.408106    6925 ovnkube.go:599] Stopped ovnkube\\\\nI0320 15:41:04.408148    6925 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 15:41:04.408227    6925 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:41:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qj97f_openshift-ovn-kubernetes(c4b4e0e8-af33-491e-b1d1-31079d90c656)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063
074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:28Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:28 crc kubenswrapper[4730]: I0320 15:41:28.369909    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:28Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:28 crc kubenswrapper[4730]: I0320 15:41:28.383896    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2070438cbc1bdd8e3c865f5aa62537649598e0f51effc9936b9cb1a630b3651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:28Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:28 crc kubenswrapper[4730]: I0320 15:41:28.395312    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35fa36f52355ad1fd24bd332fc5b0a5ff35ac2be71dd16b53449d9f47e48e8ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:28Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:28 crc kubenswrapper[4730]: I0320 15:41:28.408108    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c34c742c6da6c6d35f815901234a0c12c9628d22fa83e511bddc78eae4373cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c
6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\
\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:28Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:28 crc kubenswrapper[4730]: I0320 15:41:28.429159    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708c4e4c-a9ad-466a-a3d0-a99ae3399934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://631f7e03c5476d53c6d6a3a3bf6a1d45ac40443386f072fc7c01129e9eaf7619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"nam
e\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d029581dfd248d1298c93e18a5352686b5e89fb730ede58a29400c62deb2bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a81d62ce9d72579cc4dd4778d22cf442919e08c7d83209fde12edc8a338684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b365c72e5e05c29295ba289a5bf795353c5221607dadd2ab8b4f642272d99fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019
bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfacdb41d11bf0b03bede308c267faddfd0241d99a4cbda2603c6539bc3a433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:28Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:28 crc kubenswrapper[4730]: I0320 15:41:28.444485    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\
\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:28Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:28 crc kubenswrapper[4730]: I0320 15:41:28.455712    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1242122dff682493181de927aefeb2ac4e5274621d865ee89ccec2e60e86469b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8117d479604ca09fc23e043f373aa3acbffb209d9fbfae95f2eb907424712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:28Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:28 crc kubenswrapper[4730]: I0320 15:41:28.472074    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:28Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:28 crc kubenswrapper[4730]: I0320 15:41:28.484574    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e20a286d6affcba7ffa950ef5386e7f439c9a02381cb8b7d3bc51ad9c4f343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd661cda796a2bd61d1446bee672c4471d60370245caf7cb54faf54dfa9c58a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:28Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:28 crc kubenswrapper[4730]: I0320 15:41:28.532917    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:41:28 crc kubenswrapper[4730]: I0320 15:41:28.532917    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:41:28 crc kubenswrapper[4730]: I0320 15:41:28.532952    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:41:28 crc kubenswrapper[4730]: I0320 15:41:28.532980    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:41:28 crc kubenswrapper[4730]: E0320 15:41:28.533644    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:41:28 crc kubenswrapper[4730]: E0320 15:41:28.533767    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:41:28 crc kubenswrapper[4730]: E0320 15:41:28.533868    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:41:28 crc kubenswrapper[4730]: E0320 15:41:28.533967    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:41:30 crc kubenswrapper[4730]: I0320 15:41:30.532180    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:41:30 crc kubenswrapper[4730]: E0320 15:41:30.532327    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:41:30 crc kubenswrapper[4730]: I0320 15:41:30.532180    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:41:30 crc kubenswrapper[4730]: I0320 15:41:30.532199    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:41:30 crc kubenswrapper[4730]: I0320 15:41:30.532187    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:41:30 crc kubenswrapper[4730]: E0320 15:41:30.532699    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:41:30 crc kubenswrapper[4730]: E0320 15:41:30.532823    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:41:30 crc kubenswrapper[4730]: E0320 15:41:30.532920    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:41:31 crc kubenswrapper[4730]: I0320 15:41:31.533663    4730 scope.go:117] "RemoveContainer" containerID="03c5491305bb4c7cdbc1498dbb8cc5fe445c665c58d977511f19cb13251f7ef0"
Mar 20 15:41:31 crc kubenswrapper[4730]: I0320 15:41:31.549382    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:31Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:31 crc kubenswrapper[4730]: I0320 15:41:31.561231    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2070438cbc1bdd8e3c865f5aa62537649598e0f51effc9936b9cb1a630b3651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:31Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:31 crc kubenswrapper[4730]: I0320 15:41:31.588013    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708c4e4c-a9ad-466a-a3d0-a99ae3399934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://631f7e03c5476d53c6d6a3a3bf6a1d45ac40443386f072fc7c01129e9eaf7619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"nam
e\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d029581dfd248d1298c93e18a5352686b5e89fb730ede58a29400c62deb2bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a81d62ce9d72579cc4dd4778d22cf442919e08c7d83209fde12edc8a338684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b365c72e5e05c29295ba289a5bf795353c5221607dadd2ab8b4f642272d99fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019
bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfacdb41d11bf0b03bede308c267faddfd0241d99a4cbda2603c6539bc3a433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:31Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:31 crc kubenswrapper[4730]: I0320 15:41:31.610607    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\
\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:31Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:31 crc kubenswrapper[4730]: I0320 15:41:31.624239    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1242122dff682493181de927aefeb2ac4e5274621d865ee89ccec2e60e86469b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8117d479604ca09fc23e043f373aa3acbffb209d9fbfae95f2eb907424712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:31Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:31 crc kubenswrapper[4730]: I0320 15:41:31.637880    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:31Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:31 crc kubenswrapper[4730]: E0320 15:41:31.639051    4730 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 15:41:31 crc kubenswrapper[4730]: I0320 15:41:31.649682    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e20a286d6affcba7ffa950ef5386e7f439c9a02381cb8b7d3bc51ad9c4f343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd661cda796a2bd61d1446bee672c4471d60370245caf7cb54faf54dfa9c58a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:31Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:31 crc kubenswrapper[4730]: I0320 15:41:31.659320    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35fa36f52355ad1fd24bd332fc5b0a5ff35ac2be71dd16b53449d9f47e48e8ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:31Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:31 crc kubenswrapper[4730]: I0320 15:41:31.672988    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c34c742c6da6c6d35f815901234a0c12c9628d22fa83e511bddc78eae4373cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c
6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\
\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:31Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:31 crc kubenswrapper[4730]: I0320 15:41:31.685189    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caf1c50b-d896-46ac-8c1c-2368a862eb88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a2006b33d30cf2fd57843f3df0fb087253dd116f48a4d807c31260ce7508b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://badcec1b25a9d088fe7e563366ee7568adcabfe9c29a536db19fe3119b10f229
\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f50a49e995c2647a19bd3dedd3ca85f1d7d0279df106c153af39641af9ea83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e8cd87f56c4a70c698163de2d3f364420981943d389a3cc9b64401bb5fbf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8cd87f56c4a70c698163de2d3f364420981943d389a3cc9b64401bb5fbf08e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:31Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:31 crc kubenswrapper[4730]: I0320 15:41:31.694606    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:31Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:31 crc kubenswrapper[4730]: I0320 15:41:31.704657    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055b0251e049abda93d63985b27520b79b2596443df06427dd990fc974b52a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:31Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:31 crc kubenswrapper[4730]: I0320 15:41:31.717115    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:31Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:31 crc kubenswrapper[4730]: I0320 15:41:31.728980    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12ba423ea0fecce8b2416cc8f75f3323980aae80a20ff26bd2f9a6c4cd464812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:41:27Z\\\",\\\"message\\\":\\\"2026-03-20T15:40:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b4f3207-7d26-40ce-9cb0-76c6f07e26b9\\\\n2026-03-20T15:40:41+00:00 [cnibincopy] Successfully moved 
files in /host/opt/cni/bin/upgrade_5b4f3207-7d26-40ce-9cb0-76c6f07e26b9 to /host/opt/cni/bin/\\\\n2026-03-20T15:40:42Z [verbose] multus-daemon started\\\\n2026-03-20T15:40:42Z [verbose] Readiness Indicator file check\\\\n2026-03-20T15:41:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:31Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:31 crc kubenswrapper[4730]: I0320 15:41:31.740182    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:31Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:31 crc kubenswrapper[4730]: I0320 15:41:31.752802    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eed0424-45fd-4b1e-8b59-d041af7fb08f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620070760ce503ee2102ce0880913637feb032124892ce1a1e2060939f38e050\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b34522460ebd4556ce4291e5c5132788387cf45b0be3b9535af9262948b71ac\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 15:39:03.602145       1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 15:39:03.606021       1 observer_polling.go:159] Starting file observer\\\\nI0320 15:39:03.640071       1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 15:39:03.644835       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 15:39:27.437935       1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 15:39:27.438079       1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:27Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aee2dcf43ecf6df4a1615aa6e468921053ccb529d3c6dbc2c2ad641e264e606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99093fe46696a888b221d24d1b42226d0ff16bab6b3fb2a718d055cf97066a69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://899dbd6715433cfe5141851019e164daea952552c26706648245fd6319168685\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:31Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:31 crc kubenswrapper[4730]: I0320 15:41:31.764855    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8a35dfa17acf7f7051d13b20cab55fe91645c8fa2773fed67baddae164b586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:31Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:31 crc kubenswrapper[4730]: I0320 15:41:31.775073    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:31Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:31 crc kubenswrapper[4730]: I0320 15:41:31.789746    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c5491305bb4c7cdbc1498dbb8cc5fe445c665c58d977511f19cb13251f7ef0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c5491305bb4c7cdbc1498dbb8cc5fe445c665c58d977511f19cb13251f7ef0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"message\\\":\\\" LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}\\\\nI0320 15:41:04.407421    6925 services_controller.go:360] Finished syncing service dns-default on namespace openshift-dns for network=default : 4.127302ms\\\\nI0320 15:41:04.407620    
6925 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0320 15:41:04.407658    6925 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0320 15:41:04.407675    6925 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0320 15:41:04.407733    6925 factory.go:1336] Added *v1.Node event handler 7\\\\nI0320 15:41:04.407751    6925 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0320 15:41:04.408006    6925 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 15:41:04.408084    6925 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 15:41:04.408106    6925 ovnkube.go:599] Stopped ovnkube\\\\nI0320 15:41:04.408148    6925 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 15:41:04.408227    6925 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:41:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qj97f_openshift-ovn-kubernetes(c4b4e0e8-af33-491e-b1d1-31079d90c656)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063
074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:31Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:32 crc kubenswrapper[4730]: I0320 15:41:32.205822    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj97f_c4b4e0e8-af33-491e-b1d1-31079d90c656/ovnkube-controller/2.log"
Mar 20 15:41:32 crc kubenswrapper[4730]: I0320 15:41:32.208016    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" event={"ID":"c4b4e0e8-af33-491e-b1d1-31079d90c656","Type":"ContainerStarted","Data":"7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed"}
Mar 20 15:41:32 crc kubenswrapper[4730]: I0320 15:41:32.208451    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:41:32 crc kubenswrapper[4730]: I0320 15:41:32.220030    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:32Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:32 crc kubenswrapper[4730]: I0320 15:41:32.228028    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2070438cbc1bdd8e3c865f5aa62537649598e0f51effc9936b9cb1a630b3651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:32Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:32 crc kubenswrapper[4730]: I0320 15:41:32.237302    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1242122dff682493181de927aefeb2ac4e5274621d865ee89ccec2e60e86469b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8117d479604ca09fc23e043f373aa3acbffb209d9fbfae95f2eb907424712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:32Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:32 crc kubenswrapper[4730]: I0320 15:41:32.250184    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:32Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:32 crc kubenswrapper[4730]: I0320 15:41:32.266237    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e20a286d6affcba7ffa950ef5386e7f439c9a02381cb8b7d3bc51ad9c4f343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd661cda796a2bd61d1446bee672c4471d60370245caf7cb54faf54dfa9c58a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:32Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:32 crc kubenswrapper[4730]: I0320 15:41:32.277049    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35fa36f52355ad1fd24bd332fc5b0a5ff35ac2be71dd16b53449d9f47e48e8ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:32Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:32 crc kubenswrapper[4730]: I0320 15:41:32.290701    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c34c742c6da6c6d35f815901234a0c12c9628d22fa83e511bddc78eae4373cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c
6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\
\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:32Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:32 crc kubenswrapper[4730]: I0320 15:41:32.312323    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708c4e4c-a9ad-466a-a3d0-a99ae3399934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://631f7e03c5476d53c6d6a3a3bf6a1d45ac40443386f072fc7c01129e9eaf7619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"nam
e\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d029581dfd248d1298c93e18a5352686b5e89fb730ede58a29400c62deb2bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a81d62ce9d72579cc4dd4778d22cf442919e08c7d83209fde12edc8a338684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b365c72e5e05c29295ba289a5bf795353c5221607dadd2ab8b4f642272d99fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019
bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfacdb41d11bf0b03bede308c267faddfd0241d99a4cbda2603c6539bc3a433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:32Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:32 crc kubenswrapper[4730]: I0320 15:41:32.326509    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\
\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:32Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:32 crc kubenswrapper[4730]: I0320 15:41:32.338636    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055b0251e049abda93d63985b27520b79b2596443df06427dd990fc974b52a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:32Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:32 crc kubenswrapper[4730]: I0320 15:41:32.348623    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:32Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:32 crc kubenswrapper[4730]: I0320 15:41:32.429614    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12ba423ea0fecce8b2416cc8f75f3323980aae80a20ff26bd2f9a6c4cd464812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:41:27Z\\\",\\\"message\\\":\\\"2026-03-20T15:40:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b4f3207-7d26-40ce-9cb0-76c6f07e26b9\\\\n2026-03-20T15:40:41+00:00 [cnibincopy] Successfully moved 
files in /host/opt/cni/bin/upgrade_5b4f3207-7d26-40ce-9cb0-76c6f07e26b9 to /host/opt/cni/bin/\\\\n2026-03-20T15:40:42Z [verbose] multus-daemon started\\\\n2026-03-20T15:40:42Z [verbose] Readiness Indicator file check\\\\n2026-03-20T15:41:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:32Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:32 crc kubenswrapper[4730]: I0320 15:41:32.441162    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:32Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:32 crc kubenswrapper[4730]: I0320 15:41:32.452616    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caf1c50b-d896-46ac-8c1c-2368a862eb88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a2006b33d30cf2fd57843f3df0fb087253dd116f48a4d807c31260ce7508b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://badcec1b25a9d088fe7e563366ee7568adcabfe9c29a536db19fe3119b10f229
\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f50a49e995c2647a19bd3dedd3ca85f1d7d0279df106c153af39641af9ea83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e8cd87f56c4a70c698163de2d3f364420981943d389a3cc9b64401bb5fbf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8cd87f56c4a70c698163de2d3f364420981943d389a3cc9b64401bb5fbf08e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:32Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:32 crc kubenswrapper[4730]: I0320 15:41:32.462115    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:32Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:32 crc kubenswrapper[4730]: I0320 15:41:32.474041    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:32Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:32 crc kubenswrapper[4730]: I0320 15:41:32.494949    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c5491305bb4c7cdbc1498dbb8cc5fe445c665c58d977511f19cb13251f7ef0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"message\\\":\\\" LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}\\\\nI0320 15:41:04.407421    6925 services_controller.go:360] Finished syncing service dns-default on namespace openshift-dns for network=default : 4.127302ms\\\\nI0320 15:41:04.407620    
6925 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0320 15:41:04.407658    6925 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0320 15:41:04.407675    6925 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0320 15:41:04.407733    6925 factory.go:1336] Added *v1.Node event handler 7\\\\nI0320 15:41:04.407751    6925 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0320 15:41:04.408006    6925 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 15:41:04.408084    6925 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 15:41:04.408106    6925 ovnkube.go:599] Stopped ovnkube\\\\nI0320 15:41:04.408148    6925 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 15:41:04.408227    6925 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:41:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:32Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:32 crc kubenswrapper[4730]: I0320 15:41:32.506945    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eed0424-45fd-4b1e-8b59-d041af7fb08f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620070760ce503ee2102ce0880913637feb032124892ce1a1e2060939f38e050\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b34522460ebd4556ce4291e5c5132788387cf45b0be3b9535af9262948b71ac\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 15:39:03.602145       1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 15:39:03.606021       1 observer_polling.go:159] Starting file observer\\\\nI0320 15:39:03.640071       1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 15:39:03.644835       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 15:39:27.437935       1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 15:39:27.438079       1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:27Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aee2dcf43ecf6df4a1615aa6e468921053ccb529d3c6dbc2c2ad641e264e606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99093fe46696a888b221d24d1b42226d0ff16bab6b3fb2a718d055cf97066a69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://899dbd6715433cfe5141851019e164daea952552c26706648245fd6319168685\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:32Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:32 crc kubenswrapper[4730]: I0320 15:41:32.519571    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8a35dfa17acf7f7051d13b20cab55fe91645c8fa2773fed67baddae164b586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:32Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:32 crc kubenswrapper[4730]: I0320 15:41:32.532965    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:41:32 crc kubenswrapper[4730]: I0320 15:41:32.532987    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:41:32 crc kubenswrapper[4730]: I0320 15:41:32.533009    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:41:32 crc kubenswrapper[4730]: E0320 15:41:32.533678    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:41:32 crc kubenswrapper[4730]: I0320 15:41:32.533040    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:41:32 crc kubenswrapper[4730]: E0320 15:41:32.533737    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:41:32 crc kubenswrapper[4730]: E0320 15:41:32.534651    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:41:32 crc kubenswrapper[4730]: E0320 15:41:32.538983    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:41:33 crc kubenswrapper[4730]: I0320 15:41:33.213701    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj97f_c4b4e0e8-af33-491e-b1d1-31079d90c656/ovnkube-controller/3.log"
Mar 20 15:41:33 crc kubenswrapper[4730]: I0320 15:41:33.214912    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj97f_c4b4e0e8-af33-491e-b1d1-31079d90c656/ovnkube-controller/2.log"
Mar 20 15:41:33 crc kubenswrapper[4730]: I0320 15:41:33.219629    4730 generic.go:334] "Generic (PLEG): container finished" podID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerID="7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed" exitCode=1
Mar 20 15:41:33 crc kubenswrapper[4730]: I0320 15:41:33.219670    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" event={"ID":"c4b4e0e8-af33-491e-b1d1-31079d90c656","Type":"ContainerDied","Data":"7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed"}
Mar 20 15:41:33 crc kubenswrapper[4730]: I0320 15:41:33.219706    4730 scope.go:117] "RemoveContainer" containerID="03c5491305bb4c7cdbc1498dbb8cc5fe445c665c58d977511f19cb13251f7ef0"
Mar 20 15:41:33 crc kubenswrapper[4730]: I0320 15:41:33.222140    4730 scope.go:117] "RemoveContainer" containerID="7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed"
Mar 20 15:41:33 crc kubenswrapper[4730]: E0320 15:41:33.222856    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qj97f_openshift-ovn-kubernetes(c4b4e0e8-af33-491e-b1d1-31079d90c656)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656"
Mar 20 15:41:33 crc kubenswrapper[4730]: I0320 15:41:33.238694    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eed0424-45fd-4b1e-8b59-d041af7fb08f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620070760ce503ee2102ce0880913637feb032124892ce1a1e2060939f38e050\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b34522460ebd4556ce4291e5c5132788387cf45b0be3b9535af9262948b71ac\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 15:39:03.602145       1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 15:39:03.606021       1 observer_polling.go:159] Starting file observer\\\\nI0320 15:39:03.640071       1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 15:39:03.644835       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 15:39:27.437935       1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 15:39:27.438079       1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:27Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aee2dcf43ecf6df4a1615aa6e468921053ccb529d3c6dbc2c2ad641e264e606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99093fe46696a888b221d24d1b42226d0ff16bab6b3fb2a718d055cf97066a69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://899dbd6715433cfe5141851019e164daea952552c26706648245fd6319168685\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:33Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:33 crc kubenswrapper[4730]: I0320 15:41:33.255901    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8a35dfa17acf7f7051d13b20cab55fe91645c8fa2773fed67baddae164b586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:33Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:33 crc kubenswrapper[4730]: I0320 15:41:33.273610    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:33Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:33 crc kubenswrapper[4730]: I0320 15:41:33.295031    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://03c5491305bb4c7cdbc1498dbb8cc5fe445c665c58d977511f19cb13251f7ef0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:41:04Z\\\",\\\"message\\\":\\\" LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}\\\\nI0320 15:41:04.407421    6925 services_controller.go:360] Finished syncing service dns-default on namespace openshift-dns for network=default : 4.127302ms\\\\nI0320 15:41:04.407620    
6925 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0320 15:41:04.407658    6925 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0320 15:41:04.407675    6925 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0320 15:41:04.407733    6925 factory.go:1336] Added *v1.Node event handler 7\\\\nI0320 15:41:04.407751    6925 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0320 15:41:04.408006    6925 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 15:41:04.408084    6925 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 15:41:04.408106    6925 ovnkube.go:599] Stopped ovnkube\\\\nI0320 15:41:04.408148    6925 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 15:41:04.408227    6925 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:41:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:41:32Z\\\",\\\"message\\\":\\\"44],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0320 15:41:32.432008    7239 lb_config.go:1031] Cluster endpoints for openshift-ingress-operator/metrics for network=default are: map[]\\\\nF0320 15:41:32.432028    7239 ovnkube.go:137] failed to run 
ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:32Z is after 2025-08-24T17:21:41Z]\\\\nI0320 15:41:32.432031    7239 services_controller.go:443] Built service openshift-ingress-operator/metrics LB cluster-wide confi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host
-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:33Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:33 crc kubenswrapper[4730]: I0320 15:41:33.307517    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:33Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:33 crc kubenswrapper[4730]: I0320 15:41:33.319305    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2070438cbc1bdd8e3c865f5aa62537649598e0f51effc9936b9cb1a630b3651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:33Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:33 crc kubenswrapper[4730]: I0320 15:41:33.330337    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e20a286d6affcba7ffa950ef5386e7f439c9a02381cb8b7d3bc51ad9c4f343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd661cda796a2bd61d1446bee672c4471d60370245caf7cb54faf54dfa9c58a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:33Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:33 crc kubenswrapper[4730]: I0320 15:41:33.341338    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35fa36f52355ad1fd24bd332fc5b0a5ff35ac2be71dd16b53449d9f47e48e8ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:33Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:33 crc kubenswrapper[4730]: I0320 15:41:33.352997    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c34c742c6da6c6d35f815901234a0c12c9628d22fa83e511bddc78eae4373cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c
6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\
\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:33Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:33 crc kubenswrapper[4730]: I0320 15:41:33.374675    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708c4e4c-a9ad-466a-a3d0-a99ae3399934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://631f7e03c5476d53c6d6a3a3bf6a1d45ac40443386f072fc7c01129e9eaf7619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"nam
e\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d029581dfd248d1298c93e18a5352686b5e89fb730ede58a29400c62deb2bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a81d62ce9d72579cc4dd4778d22cf442919e08c7d83209fde12edc8a338684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b365c72e5e05c29295ba289a5bf795353c5221607dadd2ab8b4f642272d99fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019
bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfacdb41d11bf0b03bede308c267faddfd0241d99a4cbda2603c6539bc3a433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:33Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:33 crc kubenswrapper[4730]: I0320 15:41:33.388879    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\
\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:33Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:33 crc kubenswrapper[4730]: I0320 15:41:33.401798    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1242122dff682493181de927aefeb2ac4e5274621d865ee89ccec2e60e86469b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8117d479604ca09fc23e043f373aa3acbffb209d9fbfae95f2eb907424712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:33Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:33 crc kubenswrapper[4730]: I0320 15:41:33.416169    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:33Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:33 crc kubenswrapper[4730]: I0320 15:41:33.433830    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12ba423ea0fecce8b2416cc8f75f3323980aae80a20ff26bd2f9a6c4cd464812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:41:27Z\\\",\\\"message\\\":\\\"2026-03-20T15:40:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b4f3207-7d26-40ce-9cb0-76c6f07e26b9\\\\n2026-03-20T15:40:41+00:00 [cnibincopy] Successfully moved 
files in /host/opt/cni/bin/upgrade_5b4f3207-7d26-40ce-9cb0-76c6f07e26b9 to /host/opt/cni/bin/\\\\n2026-03-20T15:40:42Z [verbose] multus-daemon started\\\\n2026-03-20T15:40:42Z [verbose] Readiness Indicator file check\\\\n2026-03-20T15:41:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:33Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:33 crc kubenswrapper[4730]: I0320 15:41:33.448863    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:33Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:33 crc kubenswrapper[4730]: I0320 15:41:33.463923    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caf1c50b-d896-46ac-8c1c-2368a862eb88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a2006b33d30cf2fd57843f3df0fb087253dd116f48a4d807c31260ce7508b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://badcec1b25a9d088fe7e563366ee7568adcabfe9c29a536db19fe3119b10f229
\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f50a49e995c2647a19bd3dedd3ca85f1d7d0279df106c153af39641af9ea83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e8cd87f56c4a70c698163de2d3f364420981943d389a3cc9b64401bb5fbf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8cd87f56c4a70c698163de2d3f364420981943d389a3cc9b64401bb5fbf08e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:33Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:33 crc kubenswrapper[4730]: I0320 15:41:33.475283    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:33Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:33 crc kubenswrapper[4730]: I0320 15:41:33.495020    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055b0251e049abda93d63985b27520b79b2596443df06427dd990fc974b52a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:33Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:33 crc kubenswrapper[4730]: I0320 15:41:33.512934    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:33Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.226855    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj97f_c4b4e0e8-af33-491e-b1d1-31079d90c656/ovnkube-controller/3.log"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.232358    4730 scope.go:117] "RemoveContainer" containerID="7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed"
Mar 20 15:41:34 crc kubenswrapper[4730]: E0320 15:41:34.232646    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qj97f_openshift-ovn-kubernetes(c4b4e0e8-af33-491e-b1d1-31079d90c656)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.250989    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e20a286d6affcba7ffa950ef5386e7f439c9a02381cb8b7d3bc51ad9c4f343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd661cda796a2bd61d1446bee672c4471d60370245caf7cb54faf54dfa9c58a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:34Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.266184    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35fa36f52355ad1fd24bd332fc5b0a5ff35ac2be71dd16b53449d9f47e48e8ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:34Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.285322    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c34c742c6da6c6d35f815901234a0c12c9628d22fa83e511bddc78eae4373cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c
6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\
\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:34Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.320387    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708c4e4c-a9ad-466a-a3d0-a99ae3399934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://631f7e03c5476d53c6d6a3a3bf6a1d45ac40443386f072fc7c01129e9eaf7619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"nam
e\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d029581dfd248d1298c93e18a5352686b5e89fb730ede58a29400c62deb2bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a81d62ce9d72579cc4dd4778d22cf442919e08c7d83209fde12edc8a338684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b365c72e5e05c29295ba289a5bf795353c5221607dadd2ab8b4f642272d99fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019
bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfacdb41d11bf0b03bede308c267faddfd0241d99a4cbda2603c6539bc3a433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:34Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.342441    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\
\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:34Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.359941    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1242122dff682493181de927aefeb2ac4e5274621d865ee89ccec2e60e86469b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8117d479604ca09fc23e043f373aa3acbffb209d9fbfae95f2eb907424712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:34Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.379553    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:34Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.420386    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12ba423ea0fecce8b2416cc8f75f3323980aae80a20ff26bd2f9a6c4cd464812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:41:27Z\\\",\\\"message\\\":\\\"2026-03-20T15:40:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b4f3207-7d26-40ce-9cb0-76c6f07e26b9\\\\n2026-03-20T15:40:41+00:00 [cnibincopy] Successfully moved 
files in /host/opt/cni/bin/upgrade_5b4f3207-7d26-40ce-9cb0-76c6f07e26b9 to /host/opt/cni/bin/\\\\n2026-03-20T15:40:42Z [verbose] multus-daemon started\\\\n2026-03-20T15:40:42Z [verbose] Readiness Indicator file check\\\\n2026-03-20T15:41:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:34Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.438993    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:34Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.458598    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caf1c50b-d896-46ac-8c1c-2368a862eb88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a2006b33d30cf2fd57843f3df0fb087253dd116f48a4d807c31260ce7508b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://badcec1b25a9d088fe7e563366ee7568adcabfe9c29a536db19fe3119b10f229
\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f50a49e995c2647a19bd3dedd3ca85f1d7d0279df106c153af39641af9ea83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e8cd87f56c4a70c698163de2d3f364420981943d389a3cc9b64401bb5fbf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8cd87f56c4a70c698163de2d3f364420981943d389a3cc9b64401bb5fbf08e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:34Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.470345    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:34Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.480635    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055b0251e049abda93d63985b27520b79b2596443df06427dd990fc974b52a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:34Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.491708    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:34Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.501444    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eed0424-45fd-4b1e-8b59-d041af7fb08f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620070760ce503ee2102ce0880913637feb032124892ce1a1e2060939f38e050\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b34522460ebd4556ce4291e5c5132788387cf45b0be3b9535af9262948b71ac\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 15:39:03.602145       1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 15:39:03.606021       1 observer_polling.go:159] Starting file observer\\\\nI0320 15:39:03.640071       1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 15:39:03.644835       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 15:39:27.437935       1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 15:39:27.438079       1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:27Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aee2dcf43ecf6df4a1615aa6e468921053ccb529d3c6dbc2c2ad641e264e606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99093fe46696a888b221d24d1b42226d0ff16bab6b3fb2a718d055cf97066a69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://899dbd6715433cfe5141851019e164daea952552c26706648245fd6319168685\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:34Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.511595    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8a35dfa17acf7f7051d13b20cab55fe91645c8fa2773fed67baddae164b586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:34Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.522501    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:34Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.532683    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:41:34 crc kubenswrapper[4730]: E0320 15:41:34.532903    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.532718    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:41:34 crc kubenswrapper[4730]: E0320 15:41:34.533115    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.532683    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:41:34 crc kubenswrapper[4730]: E0320 15:41:34.533315    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.532718    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:41:34 crc kubenswrapper[4730]: E0320 15:41:34.533518    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.538636    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:41:32Z\\\",\\\"message\\\":\\\"44],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0320 15:41:32.432008    7239 lb_config.go:1031] Cluster endpoints for 
openshift-ingress-operator/metrics for network=default are: map[]\\\\nF0320 15:41:32.432028    7239 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:32Z is after 2025-08-24T17:21:41Z]\\\\nI0320 15:41:32.432031    7239 services_controller.go:443] Built service openshift-ingress-operator/metrics LB cluster-wide confi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:41:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qj97f_openshift-ovn-kubernetes(c4b4e0e8-af33-491e-b1d1-31079d90c656)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063
074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:34Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.550534    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.550723    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.550796    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.550868    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.550931    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:34Z","lastTransitionTime":"2026-03-20T15:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.554419    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:34Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.565694    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2070438cbc1bdd8e3c865f5aa62537649598e0f51effc9936b9cb1a630b3651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:34Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:34 crc kubenswrapper[4730]: E0320 15:41:34.569672    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:34Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.573082    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.573105    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.573114    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.573128    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.573137    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:34Z","lastTransitionTime":"2026-03-20T15:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:34 crc kubenswrapper[4730]: E0320 15:41:34.585269    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:34Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.588349    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.588384    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.588398    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.588414    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.588426    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:34Z","lastTransitionTime":"2026-03-20T15:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:34 crc kubenswrapper[4730]: E0320 15:41:34.602022    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:34Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.605002    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.605038    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.605062    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.605076    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.605086    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:34Z","lastTransitionTime":"2026-03-20T15:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:34 crc kubenswrapper[4730]: E0320 15:41:34.617355    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:34Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.620205    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.620228    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.620236    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.620259    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:34 crc kubenswrapper[4730]: I0320 15:41:34.620268    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:34Z","lastTransitionTime":"2026-03-20T15:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:34 crc kubenswrapper[4730]: E0320 15:41:34.631500    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:34Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:34 crc kubenswrapper[4730]: E0320 15:41:34.631604    4730 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 20 15:41:36 crc kubenswrapper[4730]: I0320 15:41:36.532179    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:41:36 crc kubenswrapper[4730]: I0320 15:41:36.532226    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:41:36 crc kubenswrapper[4730]: I0320 15:41:36.532200    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:41:36 crc kubenswrapper[4730]: I0320 15:41:36.532177    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:41:36 crc kubenswrapper[4730]: E0320 15:41:36.532390    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:41:36 crc kubenswrapper[4730]: E0320 15:41:36.532562    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:41:36 crc kubenswrapper[4730]: E0320 15:41:36.532687    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:41:36 crc kubenswrapper[4730]: E0320 15:41:36.532955    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:41:36 crc kubenswrapper[4730]: E0320 15:41:36.640598    4730 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 15:41:38 crc kubenswrapper[4730]: I0320 15:41:38.533054    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:41:38 crc kubenswrapper[4730]: E0320 15:41:38.533310    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:41:38 crc kubenswrapper[4730]: I0320 15:41:38.533513    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:41:38 crc kubenswrapper[4730]: I0320 15:41:38.533631    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:41:38 crc kubenswrapper[4730]: E0320 15:41:38.533673    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:41:38 crc kubenswrapper[4730]: I0320 15:41:38.533744    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:41:38 crc kubenswrapper[4730]: E0320 15:41:38.533865    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:41:38 crc kubenswrapper[4730]: E0320 15:41:38.534015    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:41:40 crc kubenswrapper[4730]: I0320 15:41:40.532948    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:41:40 crc kubenswrapper[4730]: I0320 15:41:40.533026    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:41:40 crc kubenswrapper[4730]: I0320 15:41:40.533131    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:41:40 crc kubenswrapper[4730]: I0320 15:41:40.533191    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:41:40 crc kubenswrapper[4730]: E0320 15:41:40.533283    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:41:40 crc kubenswrapper[4730]: E0320 15:41:40.533399    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:41:40 crc kubenswrapper[4730]: E0320 15:41:40.533496    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:41:40 crc kubenswrapper[4730]: E0320 15:41:40.533557    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:41:41 crc kubenswrapper[4730]: I0320 15:41:41.565475    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708c4e4c-a9ad-466a-a3d0-a99ae3399934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://631f7e03c5476d53c6d6a3a3bf6a1d45ac40443386f072fc7c01129e9eaf7619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d029581dfd248d1298c93e18a5352686b5e89fb730ede58a29400c62deb2bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a81d62ce9d72579cc4dd4778d22cf442919e08c7d83209fde12edc8a338684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b365c72e5e05c29295ba289a5bf795353c5221607dadd2ab8b4f642272d99fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfacdb41d11bf0b03bede308c267faddfd0241d99a4cbda2603c6539bc3a433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:41 crc kubenswrapper[4730]: I0320 15:41:41.590753    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:41 crc kubenswrapper[4730]: I0320 15:41:41.610747    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1242122dff682493181de927aefeb2ac4e5274621d865ee89ccec2e60e86469b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8117d479604ca09fc23e043f373aa3acbffb209d9fbfae95f2eb907424712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:41 crc kubenswrapper[4730]: I0320 15:41:41.633707    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:41 crc kubenswrapper[4730]: E0320 15:41:41.641474    4730 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 15:41:41 crc kubenswrapper[4730]: I0320 15:41:41.660187    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e20a286d6affcba7ffa950ef5386e7f439c9a02381cb8b7d3bc51ad9c4f343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd661cda796a2bd61d1446bee672c4471d60370245caf7cb54faf54dfa9c58a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:41 crc kubenswrapper[4730]: I0320 15:41:41.677688    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35fa36f52355ad1fd24bd332fc5b0a5ff35ac2be71dd16b53449d9f47e48e8ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:41 crc kubenswrapper[4730]: I0320 15:41:41.697473    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c34c742c6da6c6d35f815901234a0c12c9628d22fa83e511bddc78eae4373cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c
6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\
\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:41 crc kubenswrapper[4730]: I0320 15:41:41.709830    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caf1c50b-d896-46ac-8c1c-2368a862eb88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a2006b33d30cf2fd57843f3df0fb087253dd116f48a4d807c31260ce7508b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://badcec1b25a9d088fe7e563366ee7568adcabfe9c29a536db19fe3119b10f229
\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f50a49e995c2647a19bd3dedd3ca85f1d7d0279df106c153af39641af9ea83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e8cd87f56c4a70c698163de2d3f364420981943d389a3cc9b64401bb5fbf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8cd87f56c4a70c698163de2d3f364420981943d389a3cc9b64401bb5fbf08e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:41 crc kubenswrapper[4730]: I0320 15:41:41.720617    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:41 crc kubenswrapper[4730]: I0320 15:41:41.733184    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055b0251e049abda93d63985b27520b79b2596443df06427dd990fc974b52a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:41 crc kubenswrapper[4730]: I0320 15:41:41.744453    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:41 crc kubenswrapper[4730]: I0320 15:41:41.759307    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12ba423ea0fecce8b2416cc8f75f3323980aae80a20ff26bd2f9a6c4cd464812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:41:27Z\\\",\\\"message\\\":\\\"2026-03-20T15:40:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b4f3207-7d26-40ce-9cb0-76c6f07e26b9\\\\n2026-03-20T15:40:41+00:00 [cnibincopy] Successfully moved 
files in /host/opt/cni/bin/upgrade_5b4f3207-7d26-40ce-9cb0-76c6f07e26b9 to /host/opt/cni/bin/\\\\n2026-03-20T15:40:42Z [verbose] multus-daemon started\\\\n2026-03-20T15:40:42Z [verbose] Readiness Indicator file check\\\\n2026-03-20T15:41:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:41 crc kubenswrapper[4730]: I0320 15:41:41.770751    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:41 crc kubenswrapper[4730]: I0320 15:41:41.783896    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eed0424-45fd-4b1e-8b59-d041af7fb08f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620070760ce503ee2102ce0880913637feb032124892ce1a1e2060939f38e050\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b34522460ebd4556ce4291e5c5132788387cf45b0be3b9535af9262948b71ac\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 15:39:03.602145       1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 15:39:03.606021       1 observer_polling.go:159] Starting file observer\\\\nI0320 15:39:03.640071       1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 15:39:03.644835       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 15:39:27.437935       1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 15:39:27.438079       1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:27Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aee2dcf43ecf6df4a1615aa6e468921053ccb529d3c6dbc2c2ad641e264e606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99093fe46696a888b221d24d1b42226d0ff16bab6b3fb2a718d055cf97066a69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://899dbd6715433cfe5141851019e164daea952552c26706648245fd6319168685\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:41 crc kubenswrapper[4730]: I0320 15:41:41.794578    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8a35dfa17acf7f7051d13b20cab55fe91645c8fa2773fed67baddae164b586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:41 crc kubenswrapper[4730]: I0320 15:41:41.805523    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:41 crc kubenswrapper[4730]: I0320 15:41:41.821788    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:41:32Z\\\",\\\"message\\\":\\\"44],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0320 15:41:32.432008    7239 lb_config.go:1031] Cluster endpoints for 
openshift-ingress-operator/metrics for network=default are: map[]\\\\nF0320 15:41:32.432028    7239 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:32Z is after 2025-08-24T17:21:41Z]\\\\nI0320 15:41:32.432031    7239 services_controller.go:443] Built service openshift-ingress-operator/metrics LB cluster-wide confi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:41:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qj97f_openshift-ovn-kubernetes(c4b4e0e8-af33-491e-b1d1-31079d90c656)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063
074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:41 crc kubenswrapper[4730]: I0320 15:41:41.834278    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:41 crc kubenswrapper[4730]: I0320 15:41:41.844013    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2070438cbc1bdd8e3c865f5aa62537649598e0f51effc9936b9cb1a630b3651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:41Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:42 crc kubenswrapper[4730]: I0320 15:41:42.532148    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:41:42 crc kubenswrapper[4730]: I0320 15:41:42.532213    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:41:42 crc kubenswrapper[4730]: I0320 15:41:42.532293    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:41:42 crc kubenswrapper[4730]: I0320 15:41:42.532347    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:41:42 crc kubenswrapper[4730]: E0320 15:41:42.533530    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:41:42 crc kubenswrapper[4730]: E0320 15:41:42.533682    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:41:42 crc kubenswrapper[4730]: E0320 15:41:42.533091    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:41:42 crc kubenswrapper[4730]: E0320 15:41:42.533705    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:41:44 crc kubenswrapper[4730]: I0320 15:41:44.533130    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:41:44 crc kubenswrapper[4730]: I0320 15:41:44.533189    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:41:44 crc kubenswrapper[4730]: I0320 15:41:44.533136    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:41:44 crc kubenswrapper[4730]: E0320 15:41:44.533337    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:41:44 crc kubenswrapper[4730]: E0320 15:41:44.533444    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:41:44 crc kubenswrapper[4730]: E0320 15:41:44.533550    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:41:44 crc kubenswrapper[4730]: I0320 15:41:44.534539    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:41:44 crc kubenswrapper[4730]: E0320 15:41:44.534762    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:41:44 crc kubenswrapper[4730]: I0320 15:41:44.960310    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:44 crc kubenswrapper[4730]: I0320 15:41:44.960367    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:44 crc kubenswrapper[4730]: I0320 15:41:44.960386    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:44 crc kubenswrapper[4730]: I0320 15:41:44.960408    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:44 crc kubenswrapper[4730]: I0320 15:41:44.960426    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:44Z","lastTransitionTime":"2026-03-20T15:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:44 crc kubenswrapper[4730]: E0320 15:41:44.976853    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:44Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:44 crc kubenswrapper[4730]: I0320 15:41:44.982169    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:44 crc kubenswrapper[4730]: I0320 15:41:44.982223    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:44 crc kubenswrapper[4730]: I0320 15:41:44.982276    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:44 crc kubenswrapper[4730]: I0320 15:41:44.982309    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:44 crc kubenswrapper[4730]: I0320 15:41:44.982331    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:44Z","lastTransitionTime":"2026-03-20T15:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:45 crc kubenswrapper[4730]: E0320 15:41:45.001953    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:44Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:45 crc kubenswrapper[4730]: I0320 15:41:45.005583    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:45 crc kubenswrapper[4730]: I0320 15:41:45.005636    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:45 crc kubenswrapper[4730]: I0320 15:41:45.005650    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:45 crc kubenswrapper[4730]: I0320 15:41:45.005676    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:45 crc kubenswrapper[4730]: I0320 15:41:45.005696    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:45Z","lastTransitionTime":"2026-03-20T15:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:45 crc kubenswrapper[4730]: E0320 15:41:45.019683    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:45Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:45 crc kubenswrapper[4730]: I0320 15:41:45.023288    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:45 crc kubenswrapper[4730]: I0320 15:41:45.023354    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:45 crc kubenswrapper[4730]: I0320 15:41:45.023370    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:45 crc kubenswrapper[4730]: I0320 15:41:45.023388    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:45 crc kubenswrapper[4730]: I0320 15:41:45.023401    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:45Z","lastTransitionTime":"2026-03-20T15:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:45 crc kubenswrapper[4730]: E0320 15:41:45.038904    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:45Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:45 crc kubenswrapper[4730]: I0320 15:41:45.041794    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:45 crc kubenswrapper[4730]: I0320 15:41:45.041830    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:45 crc kubenswrapper[4730]: I0320 15:41:45.041843    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:45 crc kubenswrapper[4730]: I0320 15:41:45.041861    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:45 crc kubenswrapper[4730]: I0320 15:41:45.041872    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:45Z","lastTransitionTime":"2026-03-20T15:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:45 crc kubenswrapper[4730]: E0320 15:41:45.057594    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:41:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"666d62d4-aa52-41cc-be79-8c9a068e7752\\\",\\\"systemUUID\\\":\\\"dfe7d645-fe91-432e-8360-ef4633bfea29\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:45Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:45 crc kubenswrapper[4730]: E0320 15:41:45.057721    4730 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 20 15:41:46 crc kubenswrapper[4730]: I0320 15:41:46.532086    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:41:46 crc kubenswrapper[4730]: I0320 15:41:46.532141    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:41:46 crc kubenswrapper[4730]: E0320 15:41:46.532969    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:41:46 crc kubenswrapper[4730]: I0320 15:41:46.532311    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:41:46 crc kubenswrapper[4730]: I0320 15:41:46.532169    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:41:46 crc kubenswrapper[4730]: E0320 15:41:46.533107    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:41:46 crc kubenswrapper[4730]: E0320 15:41:46.533272    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:41:46 crc kubenswrapper[4730]: E0320 15:41:46.533364    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:41:46 crc kubenswrapper[4730]: E0320 15:41:46.642810    4730 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 15:41:47 crc kubenswrapper[4730]: I0320 15:41:47.532901    4730 scope.go:117] "RemoveContainer" containerID="7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed"
Mar 20 15:41:47 crc kubenswrapper[4730]: E0320 15:41:47.533685    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qj97f_openshift-ovn-kubernetes(c4b4e0e8-af33-491e-b1d1-31079d90c656)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656"
Mar 20 15:41:48 crc kubenswrapper[4730]: I0320 15:41:48.532102    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:41:48 crc kubenswrapper[4730]: I0320 15:41:48.532170    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:41:48 crc kubenswrapper[4730]: E0320 15:41:48.532468    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:41:48 crc kubenswrapper[4730]: I0320 15:41:48.532505    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:41:48 crc kubenswrapper[4730]: I0320 15:41:48.532561    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:41:48 crc kubenswrapper[4730]: E0320 15:41:48.532738    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:41:48 crc kubenswrapper[4730]: E0320 15:41:48.532943    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:41:48 crc kubenswrapper[4730]: E0320 15:41:48.532999    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:41:50 crc kubenswrapper[4730]: I0320 15:41:50.532668    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:41:50 crc kubenswrapper[4730]: I0320 15:41:50.532735    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:41:50 crc kubenswrapper[4730]: E0320 15:41:50.532800    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:41:50 crc kubenswrapper[4730]: I0320 15:41:50.532841    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:41:50 crc kubenswrapper[4730]: I0320 15:41:50.532926    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:41:50 crc kubenswrapper[4730]: E0320 15:41:50.533028    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:41:50 crc kubenswrapper[4730]: E0320 15:41:50.533288    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:41:50 crc kubenswrapper[4730]: E0320 15:41:50.533416    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:41:51 crc kubenswrapper[4730]: I0320 15:41:51.560672    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"708c4e4c-a9ad-466a-a3d0-a99ae3399934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://631f7e03c5476d53c6d6a3a3bf6a1d45ac40443386f072fc7c01129e9eaf7619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"nam
e\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d029581dfd248d1298c93e18a5352686b5e89fb730ede58a29400c62deb2bc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77a81d62ce9d72579cc4dd4778d22cf442919e08c7d83209fde12edc8a338684\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b365c72e5e05c29295ba289a5bf795353c5221607dadd2ab8b4f642272d99fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019
bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfacdb41d11bf0b03bede308c267faddfd0241d99a4cbda2603c6539bc3a433a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://452bb8a7271d9ab590cf3dadd911d28ca0a8e11759db651f023858cf8089cba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401768d00b76ca556c6946881677a30c5e8dc1a0e680743b86dbe11d3ef0e61c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2302456cbdefcc20adf9d01130c9726f807b89a7a39d032d86a1f6f3901c506\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:51 crc kubenswrapper[4730]: I0320 15:41:51.577597    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c3fe2f-3c67-4dee-becb-3ecfe2758384\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\
\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 15:39:59.021893       1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 15:39:59.022050       1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 15:39:59.023119       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311534806/tls.crt::/tmp/serving-cert-3311534806/tls.key\\\\\\\"\\\\nI0320 15:39:59.307640       1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 15:39:59.310126       1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 15:39:59.310145       1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 15:39:59.310173       1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 15:39:59.310181       1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 15:39:59.315103       1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 15:39:59.315287       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0320 15:39:59.315111       1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 15:39:59.315373       1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 15:39:59.315499       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0320 15:39:59.315558       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 15:39:59.315615       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 15:39:59.315659       1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 15:39:59.316192       1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:51 crc kubenswrapper[4730]: I0320 15:41:51.590919    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a25d8b0d-e1be-41c2-ada3-49aaecfe3bc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1242122dff682493181de927aefeb2ac4e5274621d865ee89ccec2e60e86469b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a8117d479604ca09fc23e043f373aa3acbffb209d9fbfae95f2eb907424712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4xpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p47zh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:51 crc kubenswrapper[4730]: I0320 15:41:51.606312    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e647990de11fdbff5c61eb5f2a4f36ee1c2562141d84795cf0d7bbc1155cac28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://188974c8e34d453f129d95f8a3ee36ff86b44b37a5388a0643611206f15d57c8\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:51 crc kubenswrapper[4730]: I0320 15:41:51.621174    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7fcd3db3-55f1-4c23-8fa9-78844495cea3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e20a286d6affcba7ffa950ef5386e7f439c9a02381cb8b7d3bc51ad9c4f343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd661cda796a2bd61d1446bee672c4471d60370245caf7cb54faf54dfa9c58a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzk8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p5qvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:51 crc kubenswrapper[4730]: I0320 15:41:51.636571    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-69fnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"102cb977-7291-453e-9282-20572071afee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35fa36f52355ad1fd24bd332fc5b0a5ff35ac2be71dd16b53449d9f47e48e8ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-plthx\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-69fnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:51 crc kubenswrapper[4730]: E0320 15:41:51.643586    4730 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 15:41:51 crc kubenswrapper[4730]: I0320 15:41:51.653532    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-49hht" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbb015c0-3a11-48bf-a59f-22bc03ca2fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c34c742c6da6c6d35f815901234a0c12c9628d22fa83e511bddc78eae4373cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://458c14e265e86cad40fc02fc28e4e3b9210fbae7c6fa7d26a0018cef7bf02b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7567c6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7567c
6606959a0875ab14452050996da4ff86fcfecfd82a6c2ba826136a69a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1f1bf348c9c58382cf7ce0065613fde5d4b11b2ccc480c6ee8c40adf70fe34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535e0a48f9967f05eb75c2e5518b11385cba269c5e3d50f13b26e0ef8209c134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d05aa3e72971a8965d62f6cdae8245fd372b5d8a9df8e5a524b34326ad9f91a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdfa0e3e3af5612de41c905d807532a50887378171356a3dade8aaf3d29fe9bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4qtg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\
\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-49hht\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:51 crc kubenswrapper[4730]: I0320 15:41:51.669463    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caf1c50b-d896-46ac-8c1c-2368a862eb88\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a2006b33d30cf2fd57843f3df0fb087253dd116f48a4d807c31260ce7508b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://badcec1b25a9d088fe7e563366ee7568adcabfe9c29a536db19fe3119b10f229
\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f50a49e995c2647a19bd3dedd3ca85f1d7d0279df106c153af39641af9ea83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e8cd87f56c4a70c698163de2d3f364420981943d389a3cc9b64401bb5fbf08e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e8cd87f56c4a70c698163de2d3f364420981943d389a3cc9b64401bb5fbf08e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:51 crc kubenswrapper[4730]: I0320 15:41:51.684391    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07c87d31-bb33-4cbb-8786-b42cee6340a0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4298d984334a4955a24ac85a16284d8ffe1e4f3033c328be6e66f59d3b17f1b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35485648d66e1e884978d1ea90e47c514192f3fa8382f6c74eb48ec2a587ec6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:51 crc kubenswrapper[4730]: I0320 15:41:51.696627    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055b0251e049abda93d63985b27520b79b2596443df06427dd990fc974b52a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:51 crc kubenswrapper[4730]: I0320 15:41:51.710210    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:51 crc kubenswrapper[4730]: I0320 15:41:51.729301    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6r2kn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f97b1f1-1fad-44ec-8253-17dd6a5eee54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12ba423ea0fecce8b2416cc8f75f3323980aae80a20ff26bd2f9a6c4cd464812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:41:27Z\\\",\\\"message\\\":\\\"2026-03-20T15:40:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b4f3207-7d26-40ce-9cb0-76c6f07e26b9\\\\n2026-03-20T15:40:41+00:00 [cnibincopy] Successfully moved 
files in /host/opt/cni/bin/upgrade_5b4f3207-7d26-40ce-9cb0-76c6f07e26b9 to /host/opt/cni/bin/\\\\n2026-03-20T15:40:42Z [verbose] multus-daemon started\\\\n2026-03-20T15:40:42Z [verbose] Readiness Indicator file check\\\\n2026-03-20T15:41:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvthz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6r2kn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:51 crc kubenswrapper[4730]: I0320 15:41:51.746481    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2prfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2prfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:51 crc kubenswrapper[4730]: I0320 15:41:51.766167    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eed0424-45fd-4b1e-8b59-d041af7fb08f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:39:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://620070760ce503ee2102ce0880913637feb032124892ce1a1e2060939f38e050\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b34522460ebd4556ce4291e5c5132788387cf45b0be3b9535af9262948b71ac\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T15:39:27Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 15:39:03.602145       1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 15:39:03.606021       1 observer_polling.go:159] Starting file observer\\\\nI0320 15:39:03.640071       1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 15:39:03.644835       1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 15:39:27.437935       1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 15:39:27.438079       1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:39:27Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aee2dcf43ecf6df4a1615aa6e468921053ccb529d3c6dbc2c2ad641e264e606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99093fe46696a888b221d24d1b42226d0ff16bab6b3fb2a718d055cf97066a69\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://899dbd6715433cfe5141851019e164daea952552c26706648245fd6319168685\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:39:01Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:51 crc kubenswrapper[4730]: I0320 15:41:51.783175    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f8a35dfa17acf7f7051d13b20cab55fe91645c8fa2773fed67baddae164b586\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:51 crc kubenswrapper[4730]: I0320 15:41:51.799450    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:51 crc kubenswrapper[4730]: I0320 15:41:51.817962    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4b4e0e8-af33-491e-b1d1-31079d90c656\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T15:41:32Z\\\",\\\"message\\\":\\\"44],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0320 15:41:32.432008    7239 lb_config.go:1031] Cluster endpoints for 
openshift-ingress-operator/metrics for network=default are: map[]\\\\nF0320 15:41:32.432028    7239 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:32Z is after 2025-08-24T17:21:41Z]\\\\nI0320 15:41:32.432031    7239 services_controller.go:443] Built service openshift-ingress-operator/metrics LB cluster-wide confi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T15:41:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qj97f_openshift-ovn-kubernetes(c4b4e0e8-af33-491e-b1d1-31079d90c656)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4f5c77e05a32be063
074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T15:40:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T15:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz64b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qj97f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:51 crc kubenswrapper[4730]: I0320 15:41:51.835040    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.  
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:51 crc kubenswrapper[4730]: I0320 15:41:51.846856    4730 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n4w74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ee8d55f-90bd-4484-8455-933de455efea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T15:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2070438cbc1bdd8e3c865f5aa62537649598e0f51effc9936b9cb1a630b3651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T15:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-2fvg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T15:40:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n4w74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T15:41:51Z is after 2025-08-24T17:21:41Z"
Mar 20 15:41:52 crc kubenswrapper[4730]: I0320 15:41:52.532430    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:41:52 crc kubenswrapper[4730]: E0320 15:41:52.532562    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:41:52 crc kubenswrapper[4730]: I0320 15:41:52.532631    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:41:52 crc kubenswrapper[4730]: I0320 15:41:52.532637    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:41:52 crc kubenswrapper[4730]: E0320 15:41:52.532925    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:41:52 crc kubenswrapper[4730]: I0320 15:41:52.532637    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:41:52 crc kubenswrapper[4730]: E0320 15:41:52.532769    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:41:52 crc kubenswrapper[4730]: E0320 15:41:52.533008    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:41:54 crc kubenswrapper[4730]: I0320 15:41:54.532211    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:41:54 crc kubenswrapper[4730]: I0320 15:41:54.532220    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:41:54 crc kubenswrapper[4730]: E0320 15:41:54.533161    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:41:54 crc kubenswrapper[4730]: I0320 15:41:54.532327    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:41:54 crc kubenswrapper[4730]: I0320 15:41:54.532277    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:41:54 crc kubenswrapper[4730]: E0320 15:41:54.533317    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:41:54 crc kubenswrapper[4730]: E0320 15:41:54.533410    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:41:54 crc kubenswrapper[4730]: E0320 15:41:54.533471    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.201870    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.202147    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.202219    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.202324    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.202399    4730 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T15:41:55Z","lastTransitionTime":"2026-03-20T15:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.259146    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-vfvcj"]
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.259640    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vfvcj"
Mar 20 15:41:55 crc kubenswrapper[4730]: W0320 15:41:55.262328    4730 reflector.go:561] object-"openshift-cluster-version"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-version": no relationship found between node 'crc' and this object
Mar 20 15:41:55 crc kubenswrapper[4730]: E0320 15:41:55.262396    4730 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-version\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-version\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.262804    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.262847    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.262924    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.309803    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=49.309784251 podStartE2EDuration="49.309784251s" podCreationTimestamp="2026-03-20 15:41:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:41:55.290316332 +0000 UTC m=+174.503687711" watchObservedRunningTime="2026-03-20 15:41:55.309784251 +0000 UTC m=+174.523155630"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.366393    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/30a16a52-ae94-449a-ba45-a98829e0a60d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vfvcj\" (UID: \"30a16a52-ae94-449a-ba45-a98829e0a60d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vfvcj"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.366482    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/30a16a52-ae94-449a-ba45-a98829e0a60d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vfvcj\" (UID: \"30a16a52-ae94-449a-ba45-a98829e0a60d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vfvcj"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.366539    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30a16a52-ae94-449a-ba45-a98829e0a60d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-vfvcj\" (UID: \"30a16a52-ae94-449a-ba45-a98829e0a60d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vfvcj"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.366573    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/30a16a52-ae94-449a-ba45-a98829e0a60d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vfvcj\" (UID: \"30a16a52-ae94-449a-ba45-a98829e0a60d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vfvcj"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.366637    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/30a16a52-ae94-449a-ba45-a98829e0a60d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-vfvcj\" (UID: \"30a16a52-ae94-449a-ba45-a98829e0a60d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vfvcj"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.393912    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-n4w74" podStartSLOduration=123.393894332 podStartE2EDuration="2m3.393894332s" podCreationTimestamp="2026-03-20 15:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:41:55.383003513 +0000 UTC m=+174.596374882" watchObservedRunningTime="2026-03-20 15:41:55.393894332 +0000 UTC m=+174.607265691"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.414994    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=75.414976789 podStartE2EDuration="1m15.414976789s" podCreationTimestamp="2026-03-20 15:40:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:41:55.414581967 +0000 UTC m=+174.627953336" watchObservedRunningTime="2026-03-20 15:41:55.414976789 +0000 UTC m=+174.628348158"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.429879    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=101.429860779 podStartE2EDuration="1m41.429860779s" podCreationTimestamp="2026-03-20 15:40:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:41:55.429399305 +0000 UTC m=+174.642770684" watchObservedRunningTime="2026-03-20 15:41:55.429860779 +0000 UTC m=+174.643232148"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.453705    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p47zh" podStartSLOduration=122.453684028 podStartE2EDuration="2m2.453684028s" podCreationTimestamp="2026-03-20 15:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:41:55.441447229 +0000 UTC m=+174.654818588" watchObservedRunningTime="2026-03-20 15:41:55.453684028 +0000 UTC m=+174.667055397"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.467468    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/30a16a52-ae94-449a-ba45-a98829e0a60d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-vfvcj\" (UID: \"30a16a52-ae94-449a-ba45-a98829e0a60d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vfvcj"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.467549    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/30a16a52-ae94-449a-ba45-a98829e0a60d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vfvcj\" (UID: \"30a16a52-ae94-449a-ba45-a98829e0a60d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vfvcj"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.467569    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/30a16a52-ae94-449a-ba45-a98829e0a60d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vfvcj\" (UID: \"30a16a52-ae94-449a-ba45-a98829e0a60d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vfvcj"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.467594    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30a16a52-ae94-449a-ba45-a98829e0a60d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-vfvcj\" (UID: \"30a16a52-ae94-449a-ba45-a98829e0a60d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vfvcj"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.467613    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/30a16a52-ae94-449a-ba45-a98829e0a60d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vfvcj\" (UID: \"30a16a52-ae94-449a-ba45-a98829e0a60d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vfvcj"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.467733    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/30a16a52-ae94-449a-ba45-a98829e0a60d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-vfvcj\" (UID: \"30a16a52-ae94-449a-ba45-a98829e0a60d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vfvcj"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.467969    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/30a16a52-ae94-449a-ba45-a98829e0a60d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vfvcj\" (UID: \"30a16a52-ae94-449a-ba45-a98829e0a60d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vfvcj"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.474927    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podStartSLOduration=123.47490991 podStartE2EDuration="2m3.47490991s" podCreationTimestamp="2026-03-20 15:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:41:55.465068082 +0000 UTC m=+174.678439461" watchObservedRunningTime="2026-03-20 15:41:55.47490991 +0000 UTC m=+174.688281279"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.475573    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-69fnw" podStartSLOduration=123.475565299 podStartE2EDuration="2m3.475565299s" podCreationTimestamp="2026-03-20 15:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:41:55.474569639 +0000 UTC m=+174.687941008" watchObservedRunningTime="2026-03-20 15:41:55.475565299 +0000 UTC m=+174.688936668"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.482408    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30a16a52-ae94-449a-ba45-a98829e0a60d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-vfvcj\" (UID: \"30a16a52-ae94-449a-ba45-a98829e0a60d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vfvcj"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.486455    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/30a16a52-ae94-449a-ba45-a98829e0a60d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vfvcj\" (UID: \"30a16a52-ae94-449a-ba45-a98829e0a60d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vfvcj"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.495339    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-49hht" podStartSLOduration=123.495322386 podStartE2EDuration="2m3.495322386s" podCreationTimestamp="2026-03-20 15:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:41:55.494914404 +0000 UTC m=+174.708285773" watchObservedRunningTime="2026-03-20 15:41:55.495322386 +0000 UTC m=+174.708693755"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.508120    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=40.508098922 podStartE2EDuration="40.508098922s" podCreationTimestamp="2026-03-20 15:41:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:41:55.50734948 +0000 UTC m=+174.720720859" watchObservedRunningTime="2026-03-20 15:41:55.508098922 +0000 UTC m=+174.721470301"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.534542    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=93.534521141 podStartE2EDuration="1m33.534521141s" podCreationTimestamp="2026-03-20 15:40:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:41:55.523319382 +0000 UTC m=+174.736690751" watchObservedRunningTime="2026-03-20 15:41:55.534521141 +0000 UTC m=+174.747892520"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.572434    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-6r2kn" podStartSLOduration=123.572418986 podStartE2EDuration="2m3.572418986s" podCreationTimestamp="2026-03-20 15:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:41:55.562344101 +0000 UTC m=+174.775715470" watchObservedRunningTime="2026-03-20 15:41:55.572418986 +0000 UTC m=+174.785790355"
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.581729    4730 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Mar 20 15:41:55 crc kubenswrapper[4730]: I0320 15:41:55.588516    4730 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 20 15:41:56 crc kubenswrapper[4730]: E0320 15:41:56.468073    4730 configmap.go:193] Couldn't get configMap openshift-cluster-version/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 20 15:41:56 crc kubenswrapper[4730]: E0320 15:41:56.468193    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/30a16a52-ae94-449a-ba45-a98829e0a60d-service-ca podName:30a16a52-ae94-449a-ba45-a98829e0a60d nodeName:}" failed. No retries permitted until 2026-03-20 15:41:56.968167017 +0000 UTC m=+176.181538386 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca" (UniqueName: "kubernetes.io/configmap/30a16a52-ae94-449a-ba45-a98829e0a60d-service-ca") pod "cluster-version-operator-5c965bbfc6-vfvcj" (UID: "30a16a52-ae94-449a-ba45-a98829e0a60d") : failed to sync configmap cache: timed out waiting for the condition
Mar 20 15:41:56 crc kubenswrapper[4730]: I0320 15:41:56.488880    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 20 15:41:56 crc kubenswrapper[4730]: I0320 15:41:56.532457    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:41:56 crc kubenswrapper[4730]: I0320 15:41:56.532485    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:41:56 crc kubenswrapper[4730]: E0320 15:41:56.532804    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:41:56 crc kubenswrapper[4730]: I0320 15:41:56.533161    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:41:56 crc kubenswrapper[4730]: E0320 15:41:56.533389    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:41:56 crc kubenswrapper[4730]: E0320 15:41:56.533563    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:41:56 crc kubenswrapper[4730]: I0320 15:41:56.534607    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:41:56 crc kubenswrapper[4730]: E0320 15:41:56.534790    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:41:56 crc kubenswrapper[4730]: E0320 15:41:56.645658    4730 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 15:41:56 crc kubenswrapper[4730]: I0320 15:41:56.984615    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/30a16a52-ae94-449a-ba45-a98829e0a60d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vfvcj\" (UID: \"30a16a52-ae94-449a-ba45-a98829e0a60d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vfvcj"
Mar 20 15:41:56 crc kubenswrapper[4730]: I0320 15:41:56.986476    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/30a16a52-ae94-449a-ba45-a98829e0a60d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vfvcj\" (UID: \"30a16a52-ae94-449a-ba45-a98829e0a60d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vfvcj"
Mar 20 15:41:57 crc kubenswrapper[4730]: I0320 15:41:57.078324    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vfvcj"
Mar 20 15:41:57 crc kubenswrapper[4730]: I0320 15:41:57.309321    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vfvcj" event={"ID":"30a16a52-ae94-449a-ba45-a98829e0a60d","Type":"ContainerStarted","Data":"f6345dfc0fb582e6ffcc50d8ae0c8e7b2847b61888dbe556736c5310027440c1"}
Mar 20 15:41:57 crc kubenswrapper[4730]: I0320 15:41:57.309391    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vfvcj" event={"ID":"30a16a52-ae94-449a-ba45-a98829e0a60d","Type":"ContainerStarted","Data":"0f82b0eab87c550337070792b2fa7f0a7aae2a1e760a773dbdc47fa8aa632ca3"}
Mar 20 15:41:57 crc kubenswrapper[4730]: I0320 15:41:57.323605    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vfvcj" podStartSLOduration=125.323584122 podStartE2EDuration="2m5.323584122s" podCreationTimestamp="2026-03-20 15:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:41:57.323343005 +0000 UTC m=+176.536714394" watchObservedRunningTime="2026-03-20 15:41:57.323584122 +0000 UTC m=+176.536955501"
Mar 20 15:41:58 crc kubenswrapper[4730]: I0320 15:41:58.533058    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:41:58 crc kubenswrapper[4730]: I0320 15:41:58.533207    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:41:58 crc kubenswrapper[4730]: E0320 15:41:58.533345    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:41:58 crc kubenswrapper[4730]: I0320 15:41:58.533069    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:41:58 crc kubenswrapper[4730]: E0320 15:41:58.533469    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:41:58 crc kubenswrapper[4730]: E0320 15:41:58.533639    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:41:58 crc kubenswrapper[4730]: I0320 15:41:58.533731    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:41:58 crc kubenswrapper[4730]: E0320 15:41:58.533922    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:42:00 crc kubenswrapper[4730]: I0320 15:42:00.532978    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:42:00 crc kubenswrapper[4730]: I0320 15:42:00.532979    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:42:00 crc kubenswrapper[4730]: I0320 15:42:00.533034    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:42:00 crc kubenswrapper[4730]: I0320 15:42:00.533586    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:42:00 crc kubenswrapper[4730]: E0320 15:42:00.533745    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:42:00 crc kubenswrapper[4730]: E0320 15:42:00.533859    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:42:00 crc kubenswrapper[4730]: E0320 15:42:00.533964    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:42:00 crc kubenswrapper[4730]: E0320 15:42:00.534016    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:42:01 crc kubenswrapper[4730]: I0320 15:42:01.538550    4730 scope.go:117] "RemoveContainer" containerID="7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed"
Mar 20 15:42:01 crc kubenswrapper[4730]: E0320 15:42:01.538700    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qj97f_openshift-ovn-kubernetes(c4b4e0e8-af33-491e-b1d1-31079d90c656)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656"
Mar 20 15:42:01 crc kubenswrapper[4730]: E0320 15:42:01.646993    4730 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 15:42:02 crc kubenswrapper[4730]: I0320 15:42:02.532151    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:42:02 crc kubenswrapper[4730]: I0320 15:42:02.532237    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:42:02 crc kubenswrapper[4730]: E0320 15:42:02.532972    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:42:02 crc kubenswrapper[4730]: I0320 15:42:02.532339    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:42:02 crc kubenswrapper[4730]: I0320 15:42:02.532290    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:42:02 crc kubenswrapper[4730]: E0320 15:42:02.533462    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:42:02 crc kubenswrapper[4730]: E0320 15:42:02.533585    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:42:02 crc kubenswrapper[4730]: E0320 15:42:02.533366    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:42:04 crc kubenswrapper[4730]: I0320 15:42:04.532621    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:42:04 crc kubenswrapper[4730]: I0320 15:42:04.532733    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:42:04 crc kubenswrapper[4730]: I0320 15:42:04.532910    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:42:04 crc kubenswrapper[4730]: E0320 15:42:04.532932    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:42:04 crc kubenswrapper[4730]: E0320 15:42:04.532747    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:42:04 crc kubenswrapper[4730]: E0320 15:42:04.532954    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:42:04 crc kubenswrapper[4730]: I0320 15:42:04.533004    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:42:04 crc kubenswrapper[4730]: E0320 15:42:04.533040    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:42:06 crc kubenswrapper[4730]: I0320 15:42:06.532376    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:42:06 crc kubenswrapper[4730]: I0320 15:42:06.532409    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:42:06 crc kubenswrapper[4730]: I0320 15:42:06.532436    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:42:06 crc kubenswrapper[4730]: E0320 15:42:06.532501    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:42:06 crc kubenswrapper[4730]: I0320 15:42:06.532553    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:42:06 crc kubenswrapper[4730]: E0320 15:42:06.532730    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:42:06 crc kubenswrapper[4730]: E0320 15:42:06.532916    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:42:06 crc kubenswrapper[4730]: E0320 15:42:06.532969    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:42:06 crc kubenswrapper[4730]: E0320 15:42:06.648439    4730 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 15:42:08 crc kubenswrapper[4730]: I0320 15:42:08.532779    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:42:08 crc kubenswrapper[4730]: I0320 15:42:08.532812    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:42:08 crc kubenswrapper[4730]: I0320 15:42:08.532821    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:42:08 crc kubenswrapper[4730]: E0320 15:42:08.532945    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:42:08 crc kubenswrapper[4730]: I0320 15:42:08.533040    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:42:08 crc kubenswrapper[4730]: E0320 15:42:08.533334    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:42:08 crc kubenswrapper[4730]: E0320 15:42:08.533584    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:42:08 crc kubenswrapper[4730]: E0320 15:42:08.533683    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:42:10 crc kubenswrapper[4730]: I0320 15:42:10.532996    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:42:10 crc kubenswrapper[4730]: I0320 15:42:10.533064    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:42:10 crc kubenswrapper[4730]: E0320 15:42:10.533137    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:42:10 crc kubenswrapper[4730]: I0320 15:42:10.533020    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:42:10 crc kubenswrapper[4730]: I0320 15:42:10.533219    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:42:10 crc kubenswrapper[4730]: E0320 15:42:10.533329    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:42:10 crc kubenswrapper[4730]: E0320 15:42:10.533405    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:42:10 crc kubenswrapper[4730]: E0320 15:42:10.533470    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:42:11 crc kubenswrapper[4730]: E0320 15:42:11.648980    4730 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 15:42:12 crc kubenswrapper[4730]: I0320 15:42:12.532572    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:42:12 crc kubenswrapper[4730]: I0320 15:42:12.532586    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:42:12 crc kubenswrapper[4730]: I0320 15:42:12.532644    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:42:12 crc kubenswrapper[4730]: E0320 15:42:12.532937    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:42:12 crc kubenswrapper[4730]: I0320 15:42:12.533053    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:42:12 crc kubenswrapper[4730]: E0320 15:42:12.533191    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:42:12 crc kubenswrapper[4730]: E0320 15:42:12.533326    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:42:12 crc kubenswrapper[4730]: E0320 15:42:12.533488    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:42:13 crc kubenswrapper[4730]: I0320 15:42:13.370808    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6r2kn_6f97b1f1-1fad-44ec-8253-17dd6a5eee54/kube-multus/1.log"
Mar 20 15:42:13 crc kubenswrapper[4730]: I0320 15:42:13.371950    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6r2kn_6f97b1f1-1fad-44ec-8253-17dd6a5eee54/kube-multus/0.log"
Mar 20 15:42:13 crc kubenswrapper[4730]: I0320 15:42:13.371991    4730 generic.go:334] "Generic (PLEG): container finished" podID="6f97b1f1-1fad-44ec-8253-17dd6a5eee54" containerID="12ba423ea0fecce8b2416cc8f75f3323980aae80a20ff26bd2f9a6c4cd464812" exitCode=1
Mar 20 15:42:13 crc kubenswrapper[4730]: I0320 15:42:13.372021    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6r2kn" event={"ID":"6f97b1f1-1fad-44ec-8253-17dd6a5eee54","Type":"ContainerDied","Data":"12ba423ea0fecce8b2416cc8f75f3323980aae80a20ff26bd2f9a6c4cd464812"}
Mar 20 15:42:13 crc kubenswrapper[4730]: I0320 15:42:13.372054    4730 scope.go:117] "RemoveContainer" containerID="f7f0276b7ef4131fee3cc30530a3cb7ddf6e6be5ce73d3084a1f6f4272f9dde6"
Mar 20 15:42:13 crc kubenswrapper[4730]: I0320 15:42:13.372504    4730 scope.go:117] "RemoveContainer" containerID="12ba423ea0fecce8b2416cc8f75f3323980aae80a20ff26bd2f9a6c4cd464812"
Mar 20 15:42:13 crc kubenswrapper[4730]: E0320 15:42:13.372713    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-6r2kn_openshift-multus(6f97b1f1-1fad-44ec-8253-17dd6a5eee54)\"" pod="openshift-multus/multus-6r2kn" podUID="6f97b1f1-1fad-44ec-8253-17dd6a5eee54"
Mar 20 15:42:14 crc kubenswrapper[4730]: I0320 15:42:14.377328    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6r2kn_6f97b1f1-1fad-44ec-8253-17dd6a5eee54/kube-multus/1.log"
Mar 20 15:42:14 crc kubenswrapper[4730]: I0320 15:42:14.532933    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:42:14 crc kubenswrapper[4730]: I0320 15:42:14.533042    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:42:14 crc kubenswrapper[4730]: I0320 15:42:14.533054    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:42:14 crc kubenswrapper[4730]: I0320 15:42:14.533075    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:42:14 crc kubenswrapper[4730]: E0320 15:42:14.533639    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:42:14 crc kubenswrapper[4730]: E0320 15:42:14.533707    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:42:14 crc kubenswrapper[4730]: E0320 15:42:14.533826    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:42:14 crc kubenswrapper[4730]: E0320 15:42:14.533930    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:42:16 crc kubenswrapper[4730]: I0320 15:42:16.533139    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:42:16 crc kubenswrapper[4730]: I0320 15:42:16.533273    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:42:16 crc kubenswrapper[4730]: E0320 15:42:16.533354    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:42:16 crc kubenswrapper[4730]: I0320 15:42:16.533274    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:42:16 crc kubenswrapper[4730]: I0320 15:42:16.533685    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:42:16 crc kubenswrapper[4730]: E0320 15:42:16.533817    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:42:16 crc kubenswrapper[4730]: E0320 15:42:16.533928    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:42:16 crc kubenswrapper[4730]: E0320 15:42:16.534021    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:42:16 crc kubenswrapper[4730]: I0320 15:42:16.534306    4730 scope.go:117] "RemoveContainer" containerID="7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed"
Mar 20 15:42:16 crc kubenswrapper[4730]: E0320 15:42:16.651355    4730 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 15:42:17 crc kubenswrapper[4730]: I0320 15:42:17.388965    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj97f_c4b4e0e8-af33-491e-b1d1-31079d90c656/ovnkube-controller/3.log"
Mar 20 15:42:17 crc kubenswrapper[4730]: I0320 15:42:17.392045    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" event={"ID":"c4b4e0e8-af33-491e-b1d1-31079d90c656","Type":"ContainerStarted","Data":"35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971"}
Mar 20 15:42:17 crc kubenswrapper[4730]: I0320 15:42:17.392483    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:42:17 crc kubenswrapper[4730]: I0320 15:42:17.433127    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" podStartSLOduration=145.433108719 podStartE2EDuration="2m25.433108719s" podCreationTimestamp="2026-03-20 15:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:17.432560952 +0000 UTC m=+196.645932341" watchObservedRunningTime="2026-03-20 15:42:17.433108719 +0000 UTC m=+196.646480088"
Mar 20 15:42:17 crc kubenswrapper[4730]: I0320 15:42:17.455420    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2prfn"]
Mar 20 15:42:17 crc kubenswrapper[4730]: I0320 15:42:17.455535    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:42:17 crc kubenswrapper[4730]: E0320 15:42:17.455641    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:42:18 crc kubenswrapper[4730]: I0320 15:42:18.532805    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:42:18 crc kubenswrapper[4730]: I0320 15:42:18.532808    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:42:18 crc kubenswrapper[4730]: E0320 15:42:18.533434    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:42:18 crc kubenswrapper[4730]: I0320 15:42:18.532890    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:42:18 crc kubenswrapper[4730]: E0320 15:42:18.533566    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:42:18 crc kubenswrapper[4730]: E0320 15:42:18.533735    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:42:19 crc kubenswrapper[4730]: I0320 15:42:19.533236    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:42:19 crc kubenswrapper[4730]: E0320 15:42:19.533510    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:42:20 crc kubenswrapper[4730]: I0320 15:42:20.532822    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:42:20 crc kubenswrapper[4730]: I0320 15:42:20.532853    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:42:20 crc kubenswrapper[4730]: I0320 15:42:20.532887    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:42:20 crc kubenswrapper[4730]: E0320 15:42:20.533000    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:42:20 crc kubenswrapper[4730]: E0320 15:42:20.533077    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:42:20 crc kubenswrapper[4730]: E0320 15:42:20.533154    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:42:20 crc kubenswrapper[4730]: I0320 15:42:20.539101    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:20 crc kubenswrapper[4730]: E0320 15:42:20.539399    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:44:22.539364574 +0000 UTC m=+321.752735953 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:20 crc kubenswrapper[4730]: I0320 15:42:20.640154    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:42:20 crc kubenswrapper[4730]: I0320 15:42:20.640304    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:42:20 crc kubenswrapper[4730]: I0320 15:42:20.640376    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:42:20 crc kubenswrapper[4730]: I0320 15:42:20.640425    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:42:20 crc kubenswrapper[4730]: E0320 15:42:20.640489    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 15:42:20 crc kubenswrapper[4730]: E0320 15:42:20.640533    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 15:42:20 crc kubenswrapper[4730]: E0320 15:42:20.640553    4730 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:42:20 crc kubenswrapper[4730]: E0320 15:42:20.640551    4730 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 15:42:20 crc kubenswrapper[4730]: E0320 15:42:20.640619    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 15:42:20 crc kubenswrapper[4730]: E0320 15:42:20.640648    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 15:44:22.640624282 +0000 UTC m=+321.853995681 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:42:20 crc kubenswrapper[4730]: E0320 15:42:20.640686    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:44:22.640668703 +0000 UTC m=+321.854040102 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 15:42:20 crc kubenswrapper[4730]: E0320 15:42:20.640655    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 15:42:20 crc kubenswrapper[4730]: E0320 15:42:20.640711    4730 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:42:20 crc kubenswrapper[4730]: E0320 15:42:20.640620    4730 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 15:42:20 crc kubenswrapper[4730]: E0320 15:42:20.640753    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 15:44:22.640742125 +0000 UTC m=+321.854113534 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 15:42:20 crc kubenswrapper[4730]: E0320 15:42:20.640846    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:44:22.640813967 +0000 UTC m=+321.854185407 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 15:42:20 crc kubenswrapper[4730]: I0320 15:42:20.841835    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs\") pod \"network-metrics-daemon-2prfn\" (UID: \"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\") " pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:42:20 crc kubenswrapper[4730]: E0320 15:42:20.842030    4730 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 15:42:20 crc kubenswrapper[4730]: E0320 15:42:20.842117    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs podName:db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a nodeName:}" failed. No retries permitted until 2026-03-20 15:44:22.842100035 +0000 UTC m=+322.055471404 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs") pod "network-metrics-daemon-2prfn" (UID: "db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 15:42:21 crc kubenswrapper[4730]: I0320 15:42:21.532581    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:42:21 crc kubenswrapper[4730]: E0320 15:42:21.535149    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:42:21 crc kubenswrapper[4730]: E0320 15:42:21.652286    4730 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 15:42:22 crc kubenswrapper[4730]: I0320 15:42:22.533058    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:42:22 crc kubenswrapper[4730]: I0320 15:42:22.533243    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:42:22 crc kubenswrapper[4730]: E0320 15:42:22.533335    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:42:22 crc kubenswrapper[4730]: I0320 15:42:22.533455    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:42:22 crc kubenswrapper[4730]: E0320 15:42:22.534036    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:42:22 crc kubenswrapper[4730]: E0320 15:42:22.534325    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:42:23 crc kubenswrapper[4730]: I0320 15:42:23.532622    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:42:23 crc kubenswrapper[4730]: E0320 15:42:23.532910    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:42:24 crc kubenswrapper[4730]: I0320 15:42:24.533230    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:42:24 crc kubenswrapper[4730]: I0320 15:42:24.533287    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:42:24 crc kubenswrapper[4730]: E0320 15:42:24.533586    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:42:24 crc kubenswrapper[4730]: E0320 15:42:24.533703    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:42:24 crc kubenswrapper[4730]: I0320 15:42:24.533913    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:42:24 crc kubenswrapper[4730]: E0320 15:42:24.534075    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:42:25 crc kubenswrapper[4730]: I0320 15:42:25.533342    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:42:25 crc kubenswrapper[4730]: E0320 15:42:25.533551    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:42:25 crc kubenswrapper[4730]: I0320 15:42:25.533757    4730 scope.go:117] "RemoveContainer" containerID="12ba423ea0fecce8b2416cc8f75f3323980aae80a20ff26bd2f9a6c4cd464812"
Mar 20 15:42:26 crc kubenswrapper[4730]: I0320 15:42:26.426956    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6r2kn_6f97b1f1-1fad-44ec-8253-17dd6a5eee54/kube-multus/1.log"
Mar 20 15:42:26 crc kubenswrapper[4730]: I0320 15:42:26.427340    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6r2kn" event={"ID":"6f97b1f1-1fad-44ec-8253-17dd6a5eee54","Type":"ContainerStarted","Data":"b07ba8437e9756f6cb976900c9db574ebb08c12f74c7cd2c86009c95fccf5b7e"}
Mar 20 15:42:26 crc kubenswrapper[4730]: I0320 15:42:26.532795    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:42:26 crc kubenswrapper[4730]: I0320 15:42:26.532864    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:42:26 crc kubenswrapper[4730]: I0320 15:42:26.532868    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:42:26 crc kubenswrapper[4730]: E0320 15:42:26.533136    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:42:26 crc kubenswrapper[4730]: E0320 15:42:26.533285    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:42:26 crc kubenswrapper[4730]: E0320 15:42:26.533451    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:42:26 crc kubenswrapper[4730]: E0320 15:42:26.653889    4730 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 15:42:27 crc kubenswrapper[4730]: I0320 15:42:27.532742    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:42:27 crc kubenswrapper[4730]: E0320 15:42:27.532885    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:42:28 crc kubenswrapper[4730]: I0320 15:42:28.532140    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:42:28 crc kubenswrapper[4730]: I0320 15:42:28.532263    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:42:28 crc kubenswrapper[4730]: E0320 15:42:28.532390    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:42:28 crc kubenswrapper[4730]: E0320 15:42:28.532606    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:42:28 crc kubenswrapper[4730]: I0320 15:42:28.532177    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:42:28 crc kubenswrapper[4730]: E0320 15:42:28.533443    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:42:29 crc kubenswrapper[4730]: I0320 15:42:29.532577    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:42:29 crc kubenswrapper[4730]: E0320 15:42:29.532718    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:42:30 crc kubenswrapper[4730]: I0320 15:42:30.532155    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:42:30 crc kubenswrapper[4730]: E0320 15:42:30.533027    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:42:30 crc kubenswrapper[4730]: I0320 15:42:30.532186    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:42:30 crc kubenswrapper[4730]: E0320 15:42:30.533603    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:42:30 crc kubenswrapper[4730]: I0320 15:42:30.532155    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:42:30 crc kubenswrapper[4730]: E0320 15:42:30.533779    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:42:31 crc kubenswrapper[4730]: I0320 15:42:31.532464    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:42:31 crc kubenswrapper[4730]: E0320 15:42:31.533830    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:42:32 crc kubenswrapper[4730]: I0320 15:42:32.532092    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:42:32 crc kubenswrapper[4730]: I0320 15:42:32.532157    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:42:32 crc kubenswrapper[4730]: I0320 15:42:32.532089    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:42:32 crc kubenswrapper[4730]: I0320 15:42:32.581066    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 20 15:42:32 crc kubenswrapper[4730]: I0320 15:42:32.581854    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 20 15:42:32 crc kubenswrapper[4730]: I0320 15:42:32.582955    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 20 15:42:32 crc kubenswrapper[4730]: I0320 15:42:32.586509    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 20 15:42:33 crc kubenswrapper[4730]: I0320 15:42:33.533172    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:42:33 crc kubenswrapper[4730]: I0320 15:42:33.534947    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 20 15:42:33 crc kubenswrapper[4730]: I0320 15:42:33.535137    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.313107    4730 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.474598    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hrm7z"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.476240    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jzx77"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.476366    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hrm7z"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.477987    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.479852    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-k6z2l"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.480380    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-k6z2l"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.491315    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.491415    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.491823    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.491825    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.491961    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.492064    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.491968    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.492064    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.492334    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.492585    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.492697    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.492896    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.493284    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.493438    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.493602    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.494118    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.494233    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.494399    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.494606    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.494831    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tgpgm"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.495222    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.495235    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tgpgm"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.495336    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.495522    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.501437    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.519462    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-st79s"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.519904    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-ldb64"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.520048    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.522609    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zr8dk"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.522742    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ldb64"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.522810    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.522855    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.525133    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-csmvr"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.525603    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-9kgl8"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.525668    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zr8dk"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.525805    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-csmvr"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.526539    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-g7hdt"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.526722    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.526906    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kfjm5"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.527414    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kfjm5"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.527756    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-9kgl8"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.527999    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-g7hdt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.530366    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-mkxg7"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.530712    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6rbg9"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.531050    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.531100    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6rbg9"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.531232    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.531374    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-mkxg7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.531680    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.531861    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.532009    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.532139    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.532705    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.533729    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nfww4"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.534563    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-nfww4"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.535811    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pncxq"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.536454    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xspkm"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.536855    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xspkm"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.539238    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pncxq"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.541044    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.541425    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hrm7z"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.543448    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tgpgm"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.544769    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.545057    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.545802    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.545947    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.546106    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.546490    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.546682    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.546769    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.546909    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bdpg6"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.546939    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.547016    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.547074    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.547417    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.547529    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.547555    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.547590    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.547671    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.547734    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.547770    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.547936    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.547999    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.548021    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.548066    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.548077    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.548109    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.548137    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.548109    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.548165    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.548176    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.548180    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.548224    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2qdqs"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.548286    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.548304    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.548316    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.548436    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.548439    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.548459    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.548443    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.548721    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2qdqs"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.549024    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-92dt7"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.549068    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.549363    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.549483    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-92dt7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.549677    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.549855    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.552624    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.552922    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.552969    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.553194    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.553287    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.553372    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.553414    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.553458    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.553196    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.553719    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.553795    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.554045    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.554133    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.554208    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.554335    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.553604    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.553658    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.553665    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.553709    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.554346    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.575961    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.576347    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mzjxx"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.576503    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.599538    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.600347    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mzjxx"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.600537    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.601339    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.601411    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.603397    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwghv"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.603902    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qxnn6"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.605816    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwghv"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.608555    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m7cfz"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.609076    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m7cfz"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.609154    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.609510    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qxnn6"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.609689    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.609907    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.610926    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sk8hp"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.611470    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sk8hp"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.611722    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7fsc"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.612183    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7fsc"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.614198    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.614873    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-84pdq"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.615406    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-84pdq"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.616077    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.618364    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.619319    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-fjhxb"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.620194    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vqxff"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.620583    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4dp6q"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.621063    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fjhxb"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.621770    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.624833    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vqxff"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.626202    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.630892    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-7hzc8"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.631322    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jkk9s"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.631630    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m4xlq"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.633139    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-w5gdn"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.635006    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dp6q"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.635474    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7hzc8"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.635559    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m4xlq"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.636394    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jkk9s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.648597    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-g7hdt"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.648681    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567010-d69bc"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.649879    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.650870    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-sxbnn"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.651188    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-w5gdn"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.652397    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-d69bc"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.654435    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-8n5gl"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.657127    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-klbh8"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.657935    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-sxbnn"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.660786    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7406e06c-cbde-48f9-b5e7-57a2a86b5a4d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xspkm\" (UID: \"7406e06c-cbde-48f9-b5e7-57a2a86b5a4d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xspkm"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.660846    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftpl2\" (UniqueName: \"kubernetes.io/projected/2eb5b4cf-be45-4c3d-abbf-9a3e525d5ed3-kube-api-access-ftpl2\") pod \"console-operator-58897d9998-mkxg7\" (UID: \"2eb5b4cf-be45-4c3d-abbf-9a3e525d5ed3\") " pod="openshift-console-operator/console-operator-58897d9998-mkxg7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.660886    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.660958    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2eb5b4cf-be45-4c3d-abbf-9a3e525d5ed3-trusted-ca\") pod \"console-operator-58897d9998-mkxg7\" (UID: \"2eb5b4cf-be45-4c3d-abbf-9a3e525d5ed3\") " pod="openshift-console-operator/console-operator-58897d9998-mkxg7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.660997    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d86jz\" (UniqueName: \"kubernetes.io/projected/7406e06c-cbde-48f9-b5e7-57a2a86b5a4d-kube-api-access-d86jz\") pod \"cluster-image-registry-operator-dc59b4c8b-xspkm\" (UID: \"7406e06c-cbde-48f9-b5e7-57a2a86b5a4d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xspkm"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.661054    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-console-config\") pod \"console-f9d7485db-9kgl8\" (UID: \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\") " pod="openshift-console/console-f9d7485db-9kgl8"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.661090    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ad1f04f2-f7c4-4bc6-9daf-0db7a0809206-machine-approver-tls\") pod \"machine-approver-56656f9798-ldb64\" (UID: \"ad1f04f2-f7c4-4bc6-9daf-0db7a0809206\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ldb64"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.661114    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca0985ab-94f6-4f4e-b8b4-0ee710e01fcf-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zr8dk\" (UID: \"ca0985ab-94f6-4f4e-b8b4-0ee710e01fcf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zr8dk"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.661150    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.661230    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad1f04f2-f7c4-4bc6-9daf-0db7a0809206-config\") pod \"machine-approver-56656f9798-ldb64\" (UID: \"ad1f04f2-f7c4-4bc6-9daf-0db7a0809206\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ldb64"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.661349    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf-client-ca\") pod \"route-controller-manager-6576b87f9c-tgpgm\" (UID: \"2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tgpgm"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.661235    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-klbh8"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.663192    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567022-wf5nv"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.663447    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.665352    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj7hq\" (UniqueName: \"kubernetes.io/projected/7662a0cc-faaa-47da-90f9-f3a8907a0401-kube-api-access-xj7hq\") pod \"apiserver-7bbb656c7d-nsdw7\" (UID: \"7662a0cc-faaa-47da-90f9-f3a8907a0401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.665452    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a835e0ec-4721-4824-8846-fcc7e12db3f9-audit\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.665549    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ad1f04f2-f7c4-4bc6-9daf-0db7a0809206-auth-proxy-config\") pod \"machine-approver-56656f9798-ldb64\" (UID: \"ad1f04f2-f7c4-4bc6-9daf-0db7a0809206\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ldb64"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.665583    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7662a0cc-faaa-47da-90f9-f3a8907a0401-serving-cert\") pod \"apiserver-7bbb656c7d-nsdw7\" (UID: \"7662a0cc-faaa-47da-90f9-f3a8907a0401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.665625    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-png7p\" (UniqueName: \"kubernetes.io/projected/18214bd2-9c3a-4737-885b-2b5c905311d8-kube-api-access-png7p\") pod \"router-default-5444994796-92dt7\" (UID: \"18214bd2-9c3a-4737-885b-2b5c905311d8\") " pod="openshift-ingress/router-default-5444994796-92dt7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.688164    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8n5gl"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.687378    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4e38bce-6ae6-451b-aa9f-7a98dfa4d974-serving-cert\") pod \"openshift-config-operator-7777fb866f-6rbg9\" (UID: \"d4e38bce-6ae6-451b-aa9f-7a98dfa4d974\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6rbg9"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.689379    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2eb5b4cf-be45-4c3d-abbf-9a3e525d5ed3-config\") pod \"console-operator-58897d9998-mkxg7\" (UID: \"2eb5b4cf-be45-4c3d-abbf-9a3e525d5ed3\") " pod="openshift-console-operator/console-operator-58897d9998-mkxg7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.689406    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a38d833-db72-4566-b139-7788730a502a-serving-cert\") pod \"controller-manager-879f6c89f-hrm7z\" (UID: \"9a38d833-db72-4566-b139-7788730a502a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hrm7z"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.689424    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6xmg\" (UniqueName: \"kubernetes.io/projected/a250c56d-72fb-473d-98ce-c013e9d15b4a-kube-api-access-v6xmg\") pod \"dns-operator-744455d44c-nfww4\" (UID: \"a250c56d-72fb-473d-98ce-c013e9d15b4a\") " pod="openshift-dns-operator/dns-operator-744455d44c-nfww4"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.689440    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.689471    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/18214bd2-9c3a-4737-885b-2b5c905311d8-default-certificate\") pod \"router-default-5444994796-92dt7\" (UID: \"18214bd2-9c3a-4737-885b-2b5c905311d8\") " pod="openshift-ingress/router-default-5444994796-92dt7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.689486    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d784e9cd-d5af-496e-abca-ce30096bb0d0-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-csmvr\" (UID: \"d784e9cd-d5af-496e-abca-ce30096bb0d0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-csmvr"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.689501    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bc0c5b5-55bb-4339-8162-bb647b833006-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pncxq\" (UID: \"0bc0c5b5-55bb-4339-8162-bb647b833006\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pncxq"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.689516    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5c0e41b3-aa2d-4083-acb2-f0f68a29fcce-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-k6z2l\" (UID: \"5c0e41b3-aa2d-4083-acb2-f0f68a29fcce\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k6z2l"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.689534    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7662a0cc-faaa-47da-90f9-f3a8907a0401-etcd-client\") pod \"apiserver-7bbb656c7d-nsdw7\" (UID: \"7662a0cc-faaa-47da-90f9-f3a8907a0401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.689556    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a835e0ec-4721-4824-8846-fcc7e12db3f9-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.689570    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5c0e41b3-aa2d-4083-acb2-f0f68a29fcce-images\") pod \"machine-api-operator-5694c8668f-k6z2l\" (UID: \"5c0e41b3-aa2d-4083-acb2-f0f68a29fcce\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k6z2l"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.689584    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9zgg\" (UniqueName: \"kubernetes.io/projected/d784e9cd-d5af-496e-abca-ce30096bb0d0-kube-api-access-v9zgg\") pod \"authentication-operator-69f744f599-csmvr\" (UID: \"d784e9cd-d5af-496e-abca-ce30096bb0d0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-csmvr"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.689602    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18214bd2-9c3a-4737-885b-2b5c905311d8-service-ca-bundle\") pod \"router-default-5444994796-92dt7\" (UID: \"18214bd2-9c3a-4737-885b-2b5c905311d8\") " pod="openshift-ingress/router-default-5444994796-92dt7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.689618    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2eb5b4cf-be45-4c3d-abbf-9a3e525d5ed3-serving-cert\") pod \"console-operator-58897d9998-mkxg7\" (UID: \"2eb5b4cf-be45-4c3d-abbf-9a3e525d5ed3\") " pod="openshift-console-operator/console-operator-58897d9998-mkxg7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.689632    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d784e9cd-d5af-496e-abca-ce30096bb0d0-serving-cert\") pod \"authentication-operator-69f744f599-csmvr\" (UID: \"d784e9cd-d5af-496e-abca-ce30096bb0d0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-csmvr"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.689656    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ggbs\" (UniqueName: \"kubernetes.io/projected/a835e0ec-4721-4824-8846-fcc7e12db3f9-kube-api-access-9ggbs\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.689675    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a835e0ec-4721-4824-8846-fcc7e12db3f9-config\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.689690    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf-serving-cert\") pod \"route-controller-manager-6576b87f9c-tgpgm\" (UID: \"2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tgpgm"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.689705    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qvx5\" (UniqueName: \"kubernetes.io/projected/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-kube-api-access-4qvx5\") pod \"console-f9d7485db-9kgl8\" (UID: \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\") " pod="openshift-console/console-f9d7485db-9kgl8"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.689723    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a835e0ec-4721-4824-8846-fcc7e12db3f9-node-pullsecrets\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.689760    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c0e41b3-aa2d-4083-acb2-f0f68a29fcce-config\") pod \"machine-api-operator-5694c8668f-k6z2l\" (UID: \"5c0e41b3-aa2d-4083-acb2-f0f68a29fcce\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k6z2l"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.689775    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7b9l\" (UniqueName: \"kubernetes.io/projected/5c0e41b3-aa2d-4083-acb2-f0f68a29fcce-kube-api-access-t7b9l\") pod \"machine-api-operator-5694c8668f-k6z2l\" (UID: \"5c0e41b3-aa2d-4083-acb2-f0f68a29fcce\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k6z2l"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.689810    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjf27\" (UniqueName: \"kubernetes.io/projected/d4e38bce-6ae6-451b-aa9f-7a98dfa4d974-kube-api-access-rjf27\") pod \"openshift-config-operator-7777fb866f-6rbg9\" (UID: \"d4e38bce-6ae6-451b-aa9f-7a98dfa4d974\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6rbg9"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.689828    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-service-ca\") pod \"console-f9d7485db-9kgl8\" (UID: \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\") " pod="openshift-console/console-f9d7485db-9kgl8"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.689941    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d784e9cd-d5af-496e-abca-ce30096bb0d0-config\") pod \"authentication-operator-69f744f599-csmvr\" (UID: \"d784e9cd-d5af-496e-abca-ce30096bb0d0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-csmvr"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.689956    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d581333-2d6e-44d6-a6fc-b90c3b16baad-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kfjm5\" (UID: \"7d581333-2d6e-44d6-a6fc-b90c3b16baad\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kfjm5"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.689970    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6vj7\" (UniqueName: \"kubernetes.io/projected/7d581333-2d6e-44d6-a6fc-b90c3b16baad-kube-api-access-q6vj7\") pod \"cluster-samples-operator-665b6dd947-kfjm5\" (UID: \"7d581333-2d6e-44d6-a6fc-b90c3b16baad\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kfjm5"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.690014    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a835e0ec-4721-4824-8846-fcc7e12db3f9-encryption-config\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.690286    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/18214bd2-9c3a-4737-885b-2b5c905311d8-stats-auth\") pod \"router-default-5444994796-92dt7\" (UID: \"18214bd2-9c3a-4737-885b-2b5c905311d8\") " pod="openshift-ingress/router-default-5444994796-92dt7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.690314    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a835e0ec-4721-4824-8846-fcc7e12db3f9-serving-cert\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.690335    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2499559b-b31f-4dab-89a0-964964dc596e-audit-dir\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.690365    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9a38d833-db72-4566-b139-7788730a502a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hrm7z\" (UID: \"9a38d833-db72-4566-b139-7788730a502a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hrm7z"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.690405    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7662a0cc-faaa-47da-90f9-f3a8907a0401-audit-dir\") pod \"apiserver-7bbb656c7d-nsdw7\" (UID: \"7662a0cc-faaa-47da-90f9-f3a8907a0401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.690433    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64zrc\" (UniqueName: \"kubernetes.io/projected/e190e098-9bc8-492f-9657-f6ccfb836f23-kube-api-access-64zrc\") pod \"packageserver-d55dfcdfc-2qdqs\" (UID: \"e190e098-9bc8-492f-9657-f6ccfb836f23\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2qdqs"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.690484    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bc0c5b5-55bb-4339-8162-bb647b833006-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-pncxq\" (UID: \"0bc0c5b5-55bb-4339-8162-bb647b833006\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pncxq"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.690504    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-console-oauth-config\") pod \"console-f9d7485db-9kgl8\" (UID: \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\") " pod="openshift-console/console-f9d7485db-9kgl8"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.690531    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.690568    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7406e06c-cbde-48f9-b5e7-57a2a86b5a4d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xspkm\" (UID: \"7406e06c-cbde-48f9-b5e7-57a2a86b5a4d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xspkm"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.690601    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.690662    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.690687    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d784e9cd-d5af-496e-abca-ce30096bb0d0-service-ca-bundle\") pod \"authentication-operator-69f744f599-csmvr\" (UID: \"d784e9cd-d5af-496e-abca-ce30096bb0d0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-csmvr"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.690720    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca0985ab-94f6-4f4e-b8b4-0ee710e01fcf-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zr8dk\" (UID: \"ca0985ab-94f6-4f4e-b8b4-0ee710e01fcf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zr8dk"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.690767    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a38d833-db72-4566-b139-7788730a502a-config\") pod \"controller-manager-879f6c89f-hrm7z\" (UID: \"9a38d833-db72-4566-b139-7788730a502a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hrm7z"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.690789    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7662a0cc-faaa-47da-90f9-f3a8907a0401-encryption-config\") pod \"apiserver-7bbb656c7d-nsdw7\" (UID: \"7662a0cc-faaa-47da-90f9-f3a8907a0401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.690845    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.690849    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b24b\" (UniqueName: \"kubernetes.io/projected/ad1f04f2-f7c4-4bc6-9daf-0db7a0809206-kube-api-access-9b24b\") pod \"machine-approver-56656f9798-ldb64\" (UID: \"ad1f04f2-f7c4-4bc6-9daf-0db7a0809206\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ldb64"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.690926    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a250c56d-72fb-473d-98ce-c013e9d15b4a-metrics-tls\") pod \"dns-operator-744455d44c-nfww4\" (UID: \"a250c56d-72fb-473d-98ce-c013e9d15b4a\") " pod="openshift-dns-operator/dns-operator-744455d44c-nfww4"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.690948    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ck55\" (UniqueName: \"kubernetes.io/projected/0bc0c5b5-55bb-4339-8162-bb647b833006-kube-api-access-7ck55\") pod \"openshift-controller-manager-operator-756b6f6bc6-pncxq\" (UID: \"0bc0c5b5-55bb-4339-8162-bb647b833006\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pncxq"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.690972    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpv47\" (UniqueName: \"kubernetes.io/projected/d32c9cec-9f6c-4304-8bc9-d2e52128470a-kube-api-access-zpv47\") pod \"downloads-7954f5f757-g7hdt\" (UID: \"d32c9cec-9f6c-4304-8bc9-d2e52128470a\") " pod="openshift-console/downloads-7954f5f757-g7hdt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.690995    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-oauth-serving-cert\") pod \"console-f9d7485db-9kgl8\" (UID: \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\") " pod="openshift-console/console-f9d7485db-9kgl8"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.691022    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4lws\" (UniqueName: \"kubernetes.io/projected/ca0985ab-94f6-4f4e-b8b4-0ee710e01fcf-kube-api-access-v4lws\") pod \"openshift-apiserver-operator-796bbdcf4f-zr8dk\" (UID: \"ca0985ab-94f6-4f4e-b8b4-0ee710e01fcf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zr8dk"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.691044    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a835e0ec-4721-4824-8846-fcc7e12db3f9-audit-dir\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.691063    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e190e098-9bc8-492f-9657-f6ccfb836f23-apiservice-cert\") pod \"packageserver-d55dfcdfc-2qdqs\" (UID: \"e190e098-9bc8-492f-9657-f6ccfb836f23\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2qdqs"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.691088    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.691109    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18214bd2-9c3a-4737-885b-2b5c905311d8-metrics-certs\") pod \"router-default-5444994796-92dt7\" (UID: \"18214bd2-9c3a-4737-885b-2b5c905311d8\") " pod="openshift-ingress/router-default-5444994796-92dt7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.691145    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a835e0ec-4721-4824-8846-fcc7e12db3f9-etcd-client\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.691175    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/7406e06c-cbde-48f9-b5e7-57a2a86b5a4d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xspkm\" (UID: \"7406e06c-cbde-48f9-b5e7-57a2a86b5a4d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xspkm"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.691202    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a38d833-db72-4566-b139-7788730a502a-client-ca\") pod \"controller-manager-879f6c89f-hrm7z\" (UID: \"9a38d833-db72-4566-b139-7788730a502a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hrm7z"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.691223    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7662a0cc-faaa-47da-90f9-f3a8907a0401-audit-policies\") pod \"apiserver-7bbb656c7d-nsdw7\" (UID: \"7662a0cc-faaa-47da-90f9-f3a8907a0401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.691269    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.691293    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.691314    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.691338    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e190e098-9bc8-492f-9657-f6ccfb836f23-webhook-cert\") pod \"packageserver-d55dfcdfc-2qdqs\" (UID: \"e190e098-9bc8-492f-9657-f6ccfb836f23\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2qdqs"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.691370    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d4e38bce-6ae6-451b-aa9f-7a98dfa4d974-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6rbg9\" (UID: \"d4e38bce-6ae6-451b-aa9f-7a98dfa4d974\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6rbg9"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.691390    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2499559b-b31f-4dab-89a0-964964dc596e-audit-policies\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.691409    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5z67\" (UniqueName: \"kubernetes.io/projected/2499559b-b31f-4dab-89a0-964964dc596e-kube-api-access-l5z67\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.691430    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-console-serving-cert\") pod \"console-f9d7485db-9kgl8\" (UID: \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\") " pod="openshift-console/console-f9d7485db-9kgl8"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.691449    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.691467    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7662a0cc-faaa-47da-90f9-f3a8907a0401-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nsdw7\" (UID: \"7662a0cc-faaa-47da-90f9-f3a8907a0401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.691500    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a835e0ec-4721-4824-8846-fcc7e12db3f9-etcd-serving-ca\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.691518    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf-config\") pod \"route-controller-manager-6576b87f9c-tgpgm\" (UID: \"2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tgpgm"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.691540    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb6sw\" (UniqueName: \"kubernetes.io/projected/2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf-kube-api-access-wb6sw\") pod \"route-controller-manager-6576b87f9c-tgpgm\" (UID: \"2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tgpgm"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.691576    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7662a0cc-faaa-47da-90f9-f3a8907a0401-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nsdw7\" (UID: \"7662a0cc-faaa-47da-90f9-f3a8907a0401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.691597    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a835e0ec-4721-4824-8846-fcc7e12db3f9-image-import-ca\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.691615    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e190e098-9bc8-492f-9657-f6ccfb836f23-tmpfs\") pod \"packageserver-d55dfcdfc-2qdqs\" (UID: \"e190e098-9bc8-492f-9657-f6ccfb836f23\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2qdqs"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.691633    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-trusted-ca-bundle\") pod \"console-f9d7485db-9kgl8\" (UID: \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\") " pod="openshift-console/console-f9d7485db-9kgl8"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.691692    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq5zm\" (UniqueName: \"kubernetes.io/projected/9a38d833-db72-4566-b139-7788730a502a-kube-api-access-tq5zm\") pod \"controller-manager-879f6c89f-hrm7z\" (UID: \"9a38d833-db72-4566-b139-7788730a502a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hrm7z"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.704139    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-c2cgf"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.704626    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567022-wf5nv"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.705507    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-9kgl8"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.705536    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pncxq"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.705553    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xspkm"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.705844    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-c2cgf"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.712420    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-csmvr"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.712479    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-mkxg7"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.712493    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kfjm5"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.717454    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.717744    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jzx77"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.718779    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwghv"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.720559    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zr8dk"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.721192    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.721990    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-st79s"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.722631    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7fsc"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.723783    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m7cfz"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.726031    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-fjhxb"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.727487    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2qdqs"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.727537    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6rbg9"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.728700    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qxnn6"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.730558    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567010-d69bc"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.732543    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mzjxx"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.733446    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m4xlq"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.733472    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bdpg6"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.735525    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sk8hp"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.735552    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-84pdq"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.738506    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.743987    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4dp6q"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.745709    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nfww4"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.747166    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-c2cgf"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.748548    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-k6z2l"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.748604    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-7ckfm"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.749337    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vqxff"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.749361    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jkk9s"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.749374    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-sxbnn"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.749384    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-w5gdn"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.749393    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-7hzc8"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.749401    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7ckfm"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.749410    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567022-wf5nv"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.749418    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-klbh8"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.749490    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7ckfm"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.750134    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6w7m9"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.750898    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6w7m9"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.751157    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jgzlv"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.752082    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-jgzlv"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.752102    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6w7m9"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.753178    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jgzlv"]
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.758428    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.778602    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.792374    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d86jz\" (UniqueName: \"kubernetes.io/projected/7406e06c-cbde-48f9-b5e7-57a2a86b5a4d-kube-api-access-d86jz\") pod \"cluster-image-registry-operator-dc59b4c8b-xspkm\" (UID: \"7406e06c-cbde-48f9-b5e7-57a2a86b5a4d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xspkm"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.792410    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-console-config\") pod \"console-f9d7485db-9kgl8\" (UID: \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\") " pod="openshift-console/console-f9d7485db-9kgl8"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.792430    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.792448    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2eb5b4cf-be45-4c3d-abbf-9a3e525d5ed3-trusted-ca\") pod \"console-operator-58897d9998-mkxg7\" (UID: \"2eb5b4cf-be45-4c3d-abbf-9a3e525d5ed3\") " pod="openshift-console-operator/console-operator-58897d9998-mkxg7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.792467    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ad1f04f2-f7c4-4bc6-9daf-0db7a0809206-machine-approver-tls\") pod \"machine-approver-56656f9798-ldb64\" (UID: \"ad1f04f2-f7c4-4bc6-9daf-0db7a0809206\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ldb64"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.792482    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca0985ab-94f6-4f4e-b8b4-0ee710e01fcf-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zr8dk\" (UID: \"ca0985ab-94f6-4f4e-b8b4-0ee710e01fcf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zr8dk"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.792498    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.792515    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad1f04f2-f7c4-4bc6-9daf-0db7a0809206-config\") pod \"machine-approver-56656f9798-ldb64\" (UID: \"ad1f04f2-f7c4-4bc6-9daf-0db7a0809206\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ldb64"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.793186    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad1f04f2-f7c4-4bc6-9daf-0db7a0809206-config\") pod \"machine-approver-56656f9798-ldb64\" (UID: \"ad1f04f2-f7c4-4bc6-9daf-0db7a0809206\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ldb64"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.793237    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf-client-ca\") pod \"route-controller-manager-6576b87f9c-tgpgm\" (UID: \"2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tgpgm"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.793288    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj7hq\" (UniqueName: \"kubernetes.io/projected/7662a0cc-faaa-47da-90f9-f3a8907a0401-kube-api-access-xj7hq\") pod \"apiserver-7bbb656c7d-nsdw7\" (UID: \"7662a0cc-faaa-47da-90f9-f3a8907a0401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.793307    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ad1f04f2-f7c4-4bc6-9daf-0db7a0809206-auth-proxy-config\") pod \"machine-approver-56656f9798-ldb64\" (UID: \"ad1f04f2-f7c4-4bc6-9daf-0db7a0809206\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ldb64"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.793323    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7662a0cc-faaa-47da-90f9-f3a8907a0401-serving-cert\") pod \"apiserver-7bbb656c7d-nsdw7\" (UID: \"7662a0cc-faaa-47da-90f9-f3a8907a0401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.793858    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a835e0ec-4721-4824-8846-fcc7e12db3f9-audit\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.793929    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4e38bce-6ae6-451b-aa9f-7a98dfa4d974-serving-cert\") pod \"openshift-config-operator-7777fb866f-6rbg9\" (UID: \"d4e38bce-6ae6-451b-aa9f-7a98dfa4d974\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6rbg9"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.793952    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-png7p\" (UniqueName: \"kubernetes.io/projected/18214bd2-9c3a-4737-885b-2b5c905311d8-kube-api-access-png7p\") pod \"router-default-5444994796-92dt7\" (UID: \"18214bd2-9c3a-4737-885b-2b5c905311d8\") " pod="openshift-ingress/router-default-5444994796-92dt7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.793968    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a38d833-db72-4566-b139-7788730a502a-serving-cert\") pod \"controller-manager-879f6c89f-hrm7z\" (UID: \"9a38d833-db72-4566-b139-7788730a502a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hrm7z"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.793989    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6xmg\" (UniqueName: \"kubernetes.io/projected/a250c56d-72fb-473d-98ce-c013e9d15b4a-kube-api-access-v6xmg\") pod \"dns-operator-744455d44c-nfww4\" (UID: \"a250c56d-72fb-473d-98ce-c013e9d15b4a\") " pod="openshift-dns-operator/dns-operator-744455d44c-nfww4"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.793656    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf-client-ca\") pod \"route-controller-manager-6576b87f9c-tgpgm\" (UID: \"2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tgpgm"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.794008    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.793759    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2eb5b4cf-be45-4c3d-abbf-9a3e525d5ed3-trusted-ca\") pod \"console-operator-58897d9998-mkxg7\" (UID: \"2eb5b4cf-be45-4c3d-abbf-9a3e525d5ed3\") " pod="openshift-console-operator/console-operator-58897d9998-mkxg7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.794045    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2eb5b4cf-be45-4c3d-abbf-9a3e525d5ed3-config\") pod \"console-operator-58897d9998-mkxg7\" (UID: \"2eb5b4cf-be45-4c3d-abbf-9a3e525d5ed3\") " pod="openshift-console-operator/console-operator-58897d9998-mkxg7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.794072    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/18214bd2-9c3a-4737-885b-2b5c905311d8-default-certificate\") pod \"router-default-5444994796-92dt7\" (UID: \"18214bd2-9c3a-4737-885b-2b5c905311d8\") " pod="openshift-ingress/router-default-5444994796-92dt7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.794091    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d784e9cd-d5af-496e-abca-ce30096bb0d0-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-csmvr\" (UID: \"d784e9cd-d5af-496e-abca-ce30096bb0d0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-csmvr"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.794108    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7662a0cc-faaa-47da-90f9-f3a8907a0401-etcd-client\") pod \"apiserver-7bbb656c7d-nsdw7\" (UID: \"7662a0cc-faaa-47da-90f9-f3a8907a0401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.794125    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bc0c5b5-55bb-4339-8162-bb647b833006-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pncxq\" (UID: \"0bc0c5b5-55bb-4339-8162-bb647b833006\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pncxq"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.794124    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ad1f04f2-f7c4-4bc6-9daf-0db7a0809206-auth-proxy-config\") pod \"machine-approver-56656f9798-ldb64\" (UID: \"ad1f04f2-f7c4-4bc6-9daf-0db7a0809206\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ldb64"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.794144    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5c0e41b3-aa2d-4083-acb2-f0f68a29fcce-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-k6z2l\" (UID: \"5c0e41b3-aa2d-4083-acb2-f0f68a29fcce\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k6z2l"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.794218    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18214bd2-9c3a-4737-885b-2b5c905311d8-service-ca-bundle\") pod \"router-default-5444994796-92dt7\" (UID: \"18214bd2-9c3a-4737-885b-2b5c905311d8\") " pod="openshift-ingress/router-default-5444994796-92dt7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.794240    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a835e0ec-4721-4824-8846-fcc7e12db3f9-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.794283    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5c0e41b3-aa2d-4083-acb2-f0f68a29fcce-images\") pod \"machine-api-operator-5694c8668f-k6z2l\" (UID: \"5c0e41b3-aa2d-4083-acb2-f0f68a29fcce\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k6z2l"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.794307    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9zgg\" (UniqueName: \"kubernetes.io/projected/d784e9cd-d5af-496e-abca-ce30096bb0d0-kube-api-access-v9zgg\") pod \"authentication-operator-69f744f599-csmvr\" (UID: \"d784e9cd-d5af-496e-abca-ce30096bb0d0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-csmvr"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.794335    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2eb5b4cf-be45-4c3d-abbf-9a3e525d5ed3-serving-cert\") pod \"console-operator-58897d9998-mkxg7\" (UID: \"2eb5b4cf-be45-4c3d-abbf-9a3e525d5ed3\") " pod="openshift-console-operator/console-operator-58897d9998-mkxg7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.794374    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ggbs\" (UniqueName: \"kubernetes.io/projected/a835e0ec-4721-4824-8846-fcc7e12db3f9-kube-api-access-9ggbs\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.794400    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d784e9cd-d5af-496e-abca-ce30096bb0d0-serving-cert\") pod \"authentication-operator-69f744f599-csmvr\" (UID: \"d784e9cd-d5af-496e-abca-ce30096bb0d0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-csmvr"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.794424    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a835e0ec-4721-4824-8846-fcc7e12db3f9-node-pullsecrets\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.794461    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a835e0ec-4721-4824-8846-fcc7e12db3f9-config\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.794502    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf-serving-cert\") pod \"route-controller-manager-6576b87f9c-tgpgm\" (UID: \"2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tgpgm"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.794530    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qvx5\" (UniqueName: \"kubernetes.io/projected/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-kube-api-access-4qvx5\") pod \"console-f9d7485db-9kgl8\" (UID: \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\") " pod="openshift-console/console-f9d7485db-9kgl8"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.794554    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c0e41b3-aa2d-4083-acb2-f0f68a29fcce-config\") pod \"machine-api-operator-5694c8668f-k6z2l\" (UID: \"5c0e41b3-aa2d-4083-acb2-f0f68a29fcce\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k6z2l"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.794581    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7b9l\" (UniqueName: \"kubernetes.io/projected/5c0e41b3-aa2d-4083-acb2-f0f68a29fcce-kube-api-access-t7b9l\") pod \"machine-api-operator-5694c8668f-k6z2l\" (UID: \"5c0e41b3-aa2d-4083-acb2-f0f68a29fcce\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k6z2l"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.794585    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a835e0ec-4721-4824-8846-fcc7e12db3f9-audit\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.794950    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2eb5b4cf-be45-4c3d-abbf-9a3e525d5ed3-config\") pod \"console-operator-58897d9998-mkxg7\" (UID: \"2eb5b4cf-be45-4c3d-abbf-9a3e525d5ed3\") " pod="openshift-console-operator/console-operator-58897d9998-mkxg7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.795558    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-service-ca\") pod \"console-f9d7485db-9kgl8\" (UID: \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\") " pod="openshift-console/console-f9d7485db-9kgl8"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.794598    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-service-ca\") pod \"console-f9d7485db-9kgl8\" (UID: \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\") " pod="openshift-console/console-f9d7485db-9kgl8"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.795647    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d784e9cd-d5af-496e-abca-ce30096bb0d0-config\") pod \"authentication-operator-69f744f599-csmvr\" (UID: \"d784e9cd-d5af-496e-abca-ce30096bb0d0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-csmvr"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.795689    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjf27\" (UniqueName: \"kubernetes.io/projected/d4e38bce-6ae6-451b-aa9f-7a98dfa4d974-kube-api-access-rjf27\") pod \"openshift-config-operator-7777fb866f-6rbg9\" (UID: \"d4e38bce-6ae6-451b-aa9f-7a98dfa4d974\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6rbg9"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.795727    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a835e0ec-4721-4824-8846-fcc7e12db3f9-encryption-config\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.795758    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d581333-2d6e-44d6-a6fc-b90c3b16baad-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kfjm5\" (UID: \"7d581333-2d6e-44d6-a6fc-b90c3b16baad\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kfjm5"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.795786    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6vj7\" (UniqueName: \"kubernetes.io/projected/7d581333-2d6e-44d6-a6fc-b90c3b16baad-kube-api-access-q6vj7\") pod \"cluster-samples-operator-665b6dd947-kfjm5\" (UID: \"7d581333-2d6e-44d6-a6fc-b90c3b16baad\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kfjm5"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.795814    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9a38d833-db72-4566-b139-7788730a502a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hrm7z\" (UID: \"9a38d833-db72-4566-b139-7788730a502a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hrm7z"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.795840    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7662a0cc-faaa-47da-90f9-f3a8907a0401-audit-dir\") pod \"apiserver-7bbb656c7d-nsdw7\" (UID: \"7662a0cc-faaa-47da-90f9-f3a8907a0401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.795867    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/18214bd2-9c3a-4737-885b-2b5c905311d8-stats-auth\") pod \"router-default-5444994796-92dt7\" (UID: \"18214bd2-9c3a-4737-885b-2b5c905311d8\") " pod="openshift-ingress/router-default-5444994796-92dt7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.795925    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a835e0ec-4721-4824-8846-fcc7e12db3f9-serving-cert\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.795965    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2499559b-b31f-4dab-89a0-964964dc596e-audit-dir\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.795985    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64zrc\" (UniqueName: \"kubernetes.io/projected/e190e098-9bc8-492f-9657-f6ccfb836f23-kube-api-access-64zrc\") pod \"packageserver-d55dfcdfc-2qdqs\" (UID: \"e190e098-9bc8-492f-9657-f6ccfb836f23\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2qdqs"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796004    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bc0c5b5-55bb-4339-8162-bb647b833006-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-pncxq\" (UID: \"0bc0c5b5-55bb-4339-8162-bb647b833006\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pncxq"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796021    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-console-oauth-config\") pod \"console-f9d7485db-9kgl8\" (UID: \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\") " pod="openshift-console/console-f9d7485db-9kgl8"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796042    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7406e06c-cbde-48f9-b5e7-57a2a86b5a4d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xspkm\" (UID: \"7406e06c-cbde-48f9-b5e7-57a2a86b5a4d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xspkm"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796061    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796093    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796116    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca0985ab-94f6-4f4e-b8b4-0ee710e01fcf-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zr8dk\" (UID: \"ca0985ab-94f6-4f4e-b8b4-0ee710e01fcf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zr8dk"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796133    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a38d833-db72-4566-b139-7788730a502a-config\") pod \"controller-manager-879f6c89f-hrm7z\" (UID: \"9a38d833-db72-4566-b139-7788730a502a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hrm7z"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796156    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7662a0cc-faaa-47da-90f9-f3a8907a0401-encryption-config\") pod \"apiserver-7bbb656c7d-nsdw7\" (UID: \"7662a0cc-faaa-47da-90f9-f3a8907a0401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796175    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796193    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d784e9cd-d5af-496e-abca-ce30096bb0d0-service-ca-bundle\") pod \"authentication-operator-69f744f599-csmvr\" (UID: \"d784e9cd-d5af-496e-abca-ce30096bb0d0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-csmvr"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796216    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ck55\" (UniqueName: \"kubernetes.io/projected/0bc0c5b5-55bb-4339-8162-bb647b833006-kube-api-access-7ck55\") pod \"openshift-controller-manager-operator-756b6f6bc6-pncxq\" (UID: \"0bc0c5b5-55bb-4339-8162-bb647b833006\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pncxq"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796234    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b24b\" (UniqueName: \"kubernetes.io/projected/ad1f04f2-f7c4-4bc6-9daf-0db7a0809206-kube-api-access-9b24b\") pod \"machine-approver-56656f9798-ldb64\" (UID: \"ad1f04f2-f7c4-4bc6-9daf-0db7a0809206\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ldb64"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796255    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a250c56d-72fb-473d-98ce-c013e9d15b4a-metrics-tls\") pod \"dns-operator-744455d44c-nfww4\" (UID: \"a250c56d-72fb-473d-98ce-c013e9d15b4a\") " pod="openshift-dns-operator/dns-operator-744455d44c-nfww4"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796287    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpv47\" (UniqueName: \"kubernetes.io/projected/d32c9cec-9f6c-4304-8bc9-d2e52128470a-kube-api-access-zpv47\") pod \"downloads-7954f5f757-g7hdt\" (UID: \"d32c9cec-9f6c-4304-8bc9-d2e52128470a\") " pod="openshift-console/downloads-7954f5f757-g7hdt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796307    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-oauth-serving-cert\") pod \"console-f9d7485db-9kgl8\" (UID: \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\") " pod="openshift-console/console-f9d7485db-9kgl8"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796332    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4lws\" (UniqueName: \"kubernetes.io/projected/ca0985ab-94f6-4f4e-b8b4-0ee710e01fcf-kube-api-access-v4lws\") pod \"openshift-apiserver-operator-796bbdcf4f-zr8dk\" (UID: \"ca0985ab-94f6-4f4e-b8b4-0ee710e01fcf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zr8dk"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796350    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a835e0ec-4721-4824-8846-fcc7e12db3f9-audit-dir\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796366    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e190e098-9bc8-492f-9657-f6ccfb836f23-apiservice-cert\") pod \"packageserver-d55dfcdfc-2qdqs\" (UID: \"e190e098-9bc8-492f-9657-f6ccfb836f23\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2qdqs"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796384    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796409    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a38d833-db72-4566-b139-7788730a502a-client-ca\") pod \"controller-manager-879f6c89f-hrm7z\" (UID: \"9a38d833-db72-4566-b139-7788730a502a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hrm7z"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796427    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7662a0cc-faaa-47da-90f9-f3a8907a0401-audit-policies\") pod \"apiserver-7bbb656c7d-nsdw7\" (UID: \"7662a0cc-faaa-47da-90f9-f3a8907a0401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796446    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18214bd2-9c3a-4737-885b-2b5c905311d8-metrics-certs\") pod \"router-default-5444994796-92dt7\" (UID: \"18214bd2-9c3a-4737-885b-2b5c905311d8\") " pod="openshift-ingress/router-default-5444994796-92dt7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796461    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a835e0ec-4721-4824-8846-fcc7e12db3f9-etcd-client\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796480    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/7406e06c-cbde-48f9-b5e7-57a2a86b5a4d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xspkm\" (UID: \"7406e06c-cbde-48f9-b5e7-57a2a86b5a4d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xspkm"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796512    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796505    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a835e0ec-4721-4824-8846-fcc7e12db3f9-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796535    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e190e098-9bc8-492f-9657-f6ccfb836f23-webhook-cert\") pod \"packageserver-d55dfcdfc-2qdqs\" (UID: \"e190e098-9bc8-492f-9657-f6ccfb836f23\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2qdqs"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796557    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d4e38bce-6ae6-451b-aa9f-7a98dfa4d974-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6rbg9\" (UID: \"d4e38bce-6ae6-451b-aa9f-7a98dfa4d974\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6rbg9"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796574    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2499559b-b31f-4dab-89a0-964964dc596e-audit-policies\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796593    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796611    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796629    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-console-serving-cert\") pod \"console-f9d7485db-9kgl8\" (UID: \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\") " pod="openshift-console/console-f9d7485db-9kgl8"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796647    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796665    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5z67\" (UniqueName: \"kubernetes.io/projected/2499559b-b31f-4dab-89a0-964964dc596e-kube-api-access-l5z67\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796684    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7662a0cc-faaa-47da-90f9-f3a8907a0401-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nsdw7\" (UID: \"7662a0cc-faaa-47da-90f9-f3a8907a0401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796701    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a835e0ec-4721-4824-8846-fcc7e12db3f9-etcd-serving-ca\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796730    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf-config\") pod \"route-controller-manager-6576b87f9c-tgpgm\" (UID: \"2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tgpgm"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796749    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7662a0cc-faaa-47da-90f9-f3a8907a0401-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nsdw7\" (UID: \"7662a0cc-faaa-47da-90f9-f3a8907a0401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796767    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb6sw\" (UniqueName: \"kubernetes.io/projected/2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf-kube-api-access-wb6sw\") pod \"route-controller-manager-6576b87f9c-tgpgm\" (UID: \"2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tgpgm"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796801    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq5zm\" (UniqueName: \"kubernetes.io/projected/9a38d833-db72-4566-b139-7788730a502a-kube-api-access-tq5zm\") pod \"controller-manager-879f6c89f-hrm7z\" (UID: \"9a38d833-db72-4566-b139-7788730a502a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hrm7z"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796819    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a835e0ec-4721-4824-8846-fcc7e12db3f9-image-import-ca\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796930    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e190e098-9bc8-492f-9657-f6ccfb836f23-tmpfs\") pod \"packageserver-d55dfcdfc-2qdqs\" (UID: \"e190e098-9bc8-492f-9657-f6ccfb836f23\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2qdqs"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796949    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-trusted-ca-bundle\") pod \"console-f9d7485db-9kgl8\" (UID: \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\") " pod="openshift-console/console-f9d7485db-9kgl8"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.796970    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftpl2\" (UniqueName: \"kubernetes.io/projected/2eb5b4cf-be45-4c3d-abbf-9a3e525d5ed3-kube-api-access-ftpl2\") pod \"console-operator-58897d9998-mkxg7\" (UID: \"2eb5b4cf-be45-4c3d-abbf-9a3e525d5ed3\") " pod="openshift-console-operator/console-operator-58897d9998-mkxg7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.797003    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7406e06c-cbde-48f9-b5e7-57a2a86b5a4d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xspkm\" (UID: \"7406e06c-cbde-48f9-b5e7-57a2a86b5a4d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xspkm"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.797342    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5c0e41b3-aa2d-4083-acb2-f0f68a29fcce-images\") pod \"machine-api-operator-5694c8668f-k6z2l\" (UID: \"5c0e41b3-aa2d-4083-acb2-f0f68a29fcce\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k6z2l"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.797825    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7662a0cc-faaa-47da-90f9-f3a8907a0401-serving-cert\") pod \"apiserver-7bbb656c7d-nsdw7\" (UID: \"7662a0cc-faaa-47da-90f9-f3a8907a0401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.797851    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d784e9cd-d5af-496e-abca-ce30096bb0d0-config\") pod \"authentication-operator-69f744f599-csmvr\" (UID: \"d784e9cd-d5af-496e-abca-ce30096bb0d0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-csmvr"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.797912    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ad1f04f2-f7c4-4bc6-9daf-0db7a0809206-machine-approver-tls\") pod \"machine-approver-56656f9798-ldb64\" (UID: \"ad1f04f2-f7c4-4bc6-9daf-0db7a0809206\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ldb64"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.798481    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4e38bce-6ae6-451b-aa9f-7a98dfa4d974-serving-cert\") pod \"openshift-config-operator-7777fb866f-6rbg9\" (UID: \"d4e38bce-6ae6-451b-aa9f-7a98dfa4d974\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6rbg9"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.798646    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a835e0ec-4721-4824-8846-fcc7e12db3f9-config\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.798718    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a835e0ec-4721-4824-8846-fcc7e12db3f9-node-pullsecrets\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.799154    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bc0c5b5-55bb-4339-8162-bb647b833006-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pncxq\" (UID: \"0bc0c5b5-55bb-4339-8162-bb647b833006\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pncxq"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.799346    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c0e41b3-aa2d-4083-acb2-f0f68a29fcce-config\") pod \"machine-api-operator-5694c8668f-k6z2l\" (UID: \"5c0e41b3-aa2d-4083-acb2-f0f68a29fcce\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k6z2l"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.793540    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.799585    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-console-config\") pod \"console-f9d7485db-9kgl8\" (UID: \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\") " pod="openshift-console/console-f9d7485db-9kgl8"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.799940    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d4e38bce-6ae6-451b-aa9f-7a98dfa4d974-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6rbg9\" (UID: \"d4e38bce-6ae6-451b-aa9f-7a98dfa4d974\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6rbg9"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.800108    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a38d833-db72-4566-b139-7788730a502a-serving-cert\") pod \"controller-manager-879f6c89f-hrm7z\" (UID: \"9a38d833-db72-4566-b139-7788730a502a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hrm7z"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.810415    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7662a0cc-faaa-47da-90f9-f3a8907a0401-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nsdw7\" (UID: \"7662a0cc-faaa-47da-90f9-f3a8907a0401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.810768    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca0985ab-94f6-4f4e-b8b4-0ee710e01fcf-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zr8dk\" (UID: \"ca0985ab-94f6-4f4e-b8b4-0ee710e01fcf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zr8dk"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.797011    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d784e9cd-d5af-496e-abca-ce30096bb0d0-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-csmvr\" (UID: \"d784e9cd-d5af-496e-abca-ce30096bb0d0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-csmvr"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.811072    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7662a0cc-faaa-47da-90f9-f3a8907a0401-etcd-client\") pod \"apiserver-7bbb656c7d-nsdw7\" (UID: \"7662a0cc-faaa-47da-90f9-f3a8907a0401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.811377    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2eb5b4cf-be45-4c3d-abbf-9a3e525d5ed3-serving-cert\") pod \"console-operator-58897d9998-mkxg7\" (UID: \"2eb5b4cf-be45-4c3d-abbf-9a3e525d5ed3\") " pod="openshift-console-operator/console-operator-58897d9998-mkxg7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.811630    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-console-serving-cert\") pod \"console-f9d7485db-9kgl8\" (UID: \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\") " pod="openshift-console/console-f9d7485db-9kgl8"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.811665    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/7406e06c-cbde-48f9-b5e7-57a2a86b5a4d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xspkm\" (UID: \"7406e06c-cbde-48f9-b5e7-57a2a86b5a4d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xspkm"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.811985    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.812126    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d581333-2d6e-44d6-a6fc-b90c3b16baad-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kfjm5\" (UID: \"7d581333-2d6e-44d6-a6fc-b90c3b16baad\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kfjm5"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.812433    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a835e0ec-4721-4824-8846-fcc7e12db3f9-encryption-config\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.812467    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf-serving-cert\") pod \"route-controller-manager-6576b87f9c-tgpgm\" (UID: \"2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tgpgm"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.812752    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2499559b-b31f-4dab-89a0-964964dc596e-audit-policies\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.813295    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7662a0cc-faaa-47da-90f9-f3a8907a0401-audit-policies\") pod \"apiserver-7bbb656c7d-nsdw7\" (UID: \"7662a0cc-faaa-47da-90f9-f3a8907a0401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.813412    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d784e9cd-d5af-496e-abca-ce30096bb0d0-serving-cert\") pod \"authentication-operator-69f744f599-csmvr\" (UID: \"d784e9cd-d5af-496e-abca-ce30096bb0d0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-csmvr"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.813292    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.814756    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a835e0ec-4721-4824-8846-fcc7e12db3f9-image-import-ca\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.814903    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf-config\") pod \"route-controller-manager-6576b87f9c-tgpgm\" (UID: \"2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tgpgm"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.815656    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7662a0cc-faaa-47da-90f9-f3a8907a0401-audit-dir\") pod \"apiserver-7bbb656c7d-nsdw7\" (UID: \"7662a0cc-faaa-47da-90f9-f3a8907a0401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.815673    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.815821    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7662a0cc-faaa-47da-90f9-f3a8907a0401-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nsdw7\" (UID: \"7662a0cc-faaa-47da-90f9-f3a8907a0401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.815836    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.816088    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2499559b-b31f-4dab-89a0-964964dc596e-audit-dir\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.816152    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9a38d833-db72-4566-b139-7788730a502a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hrm7z\" (UID: \"9a38d833-db72-4566-b139-7788730a502a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hrm7z"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.816992    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-trusted-ca-bundle\") pod \"console-f9d7485db-9kgl8\" (UID: \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\") " pod="openshift-console/console-f9d7485db-9kgl8"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.817630    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d784e9cd-d5af-496e-abca-ce30096bb0d0-service-ca-bundle\") pod \"authentication-operator-69f744f599-csmvr\" (UID: \"d784e9cd-d5af-496e-abca-ce30096bb0d0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-csmvr"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.817726    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a835e0ec-4721-4824-8846-fcc7e12db3f9-audit-dir\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.817942    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e190e098-9bc8-492f-9657-f6ccfb836f23-tmpfs\") pod \"packageserver-d55dfcdfc-2qdqs\" (UID: \"e190e098-9bc8-492f-9657-f6ccfb836f23\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2qdqs"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.818508    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.818566    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a38d833-db72-4566-b139-7788730a502a-client-ca\") pod \"controller-manager-879f6c89f-hrm7z\" (UID: \"9a38d833-db72-4566-b139-7788730a502a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hrm7z"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.818599    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7406e06c-cbde-48f9-b5e7-57a2a86b5a4d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xspkm\" (UID: \"7406e06c-cbde-48f9-b5e7-57a2a86b5a4d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xspkm"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.818819    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.819199    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-oauth-serving-cert\") pod \"console-f9d7485db-9kgl8\" (UID: \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\") " pod="openshift-console/console-f9d7485db-9kgl8"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.820187    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a835e0ec-4721-4824-8846-fcc7e12db3f9-etcd-serving-ca\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.820470    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca0985ab-94f6-4f4e-b8b4-0ee710e01fcf-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zr8dk\" (UID: \"ca0985ab-94f6-4f4e-b8b4-0ee710e01fcf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zr8dk"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.821085    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7662a0cc-faaa-47da-90f9-f3a8907a0401-encryption-config\") pod \"apiserver-7bbb656c7d-nsdw7\" (UID: \"7662a0cc-faaa-47da-90f9-f3a8907a0401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.824854    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bc0c5b5-55bb-4339-8162-bb647b833006-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-pncxq\" (UID: \"0bc0c5b5-55bb-4339-8162-bb647b833006\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pncxq"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.825035    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e190e098-9bc8-492f-9657-f6ccfb836f23-apiservice-cert\") pod \"packageserver-d55dfcdfc-2qdqs\" (UID: \"e190e098-9bc8-492f-9657-f6ccfb836f23\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2qdqs"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.825118    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a835e0ec-4721-4824-8846-fcc7e12db3f9-etcd-client\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.825433    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a835e0ec-4721-4824-8846-fcc7e12db3f9-serving-cert\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.825752    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a250c56d-72fb-473d-98ce-c013e9d15b4a-metrics-tls\") pod \"dns-operator-744455d44c-nfww4\" (UID: \"a250c56d-72fb-473d-98ce-c013e9d15b4a\") " pod="openshift-dns-operator/dns-operator-744455d44c-nfww4"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.826293    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e190e098-9bc8-492f-9657-f6ccfb836f23-webhook-cert\") pod \"packageserver-d55dfcdfc-2qdqs\" (UID: \"e190e098-9bc8-492f-9657-f6ccfb836f23\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2qdqs"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.839697    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.858507    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.862283    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a38d833-db72-4566-b139-7788730a502a-config\") pod \"controller-manager-879f6c89f-hrm7z\" (UID: \"9a38d833-db72-4566-b139-7788730a502a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hrm7z"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.862557    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-console-oauth-config\") pod \"console-f9d7485db-9kgl8\" (UID: \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\") " pod="openshift-console/console-f9d7485db-9kgl8"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.862987    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5c0e41b3-aa2d-4083-acb2-f0f68a29fcce-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-k6z2l\" (UID: \"5c0e41b3-aa2d-4083-acb2-f0f68a29fcce\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k6z2l"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.867481    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.867818    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.868176    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.868507    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.869480    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.870006    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/18214bd2-9c3a-4737-885b-2b5c905311d8-default-certificate\") pod \"router-default-5444994796-92dt7\" (UID: \"18214bd2-9c3a-4737-885b-2b5c905311d8\") " pod="openshift-ingress/router-default-5444994796-92dt7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.870693    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.878656    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.898490    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.908581    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/18214bd2-9c3a-4737-885b-2b5c905311d8-stats-auth\") pod \"router-default-5444994796-92dt7\" (UID: \"18214bd2-9c3a-4737-885b-2b5c905311d8\") " pod="openshift-ingress/router-default-5444994796-92dt7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.919042    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.926747    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18214bd2-9c3a-4737-885b-2b5c905311d8-metrics-certs\") pod \"router-default-5444994796-92dt7\" (UID: \"18214bd2-9c3a-4737-885b-2b5c905311d8\") " pod="openshift-ingress/router-default-5444994796-92dt7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.938711    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.959039    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.966019    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18214bd2-9c3a-4737-885b-2b5c905311d8-service-ca-bundle\") pod \"router-default-5444994796-92dt7\" (UID: \"18214bd2-9c3a-4737-885b-2b5c905311d8\") " pod="openshift-ingress/router-default-5444994796-92dt7"
Mar 20 15:42:36 crc kubenswrapper[4730]: I0320 15:42:36.979554    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.018542    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.039758    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.059140    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.078624    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.099146    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.119916    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.139650    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.158372    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.179039    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.198774    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.219060    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.239270    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.259588    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.279440    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.299476    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.318368    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.339099    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.359655    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.380436    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.401386    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.419970    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.440589    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.459631    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.490728    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.499410    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.518546    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.539583    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.560271    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.579313    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.599814    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.620084    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.637129    4730 request.go:700] Waited for 1.001713222s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmco-proxy-tls&limit=500&resourceVersion=0
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.639840    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.660037    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.679963    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.699296    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.719594    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.739911    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.759182    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.778784    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.799859    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.818615    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.838049    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.858275    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.878905    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.898540    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.920030    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.939037    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.959679    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.978323    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 20 15:42:37 crc kubenswrapper[4730]: I0320 15:42:37.998697    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.019782    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.039017    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.058939    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.078366    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.098231    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.124027    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.138966    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.158273    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.178709    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.198544    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.218812    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.258669    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.278089    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.297631    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.318023    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.338006    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.358021    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.378550    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.397622    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.418054    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.438326    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.458158    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.478396    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.498126    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.518843    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.538347    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.558738    4730 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.578933    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.613073    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d86jz\" (UniqueName: \"kubernetes.io/projected/7406e06c-cbde-48f9-b5e7-57a2a86b5a4d-kube-api-access-d86jz\") pod \"cluster-image-registry-operator-dc59b4c8b-xspkm\" (UID: \"7406e06c-cbde-48f9-b5e7-57a2a86b5a4d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xspkm"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.631517    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj7hq\" (UniqueName: \"kubernetes.io/projected/7662a0cc-faaa-47da-90f9-f3a8907a0401-kube-api-access-xj7hq\") pod \"apiserver-7bbb656c7d-nsdw7\" (UID: \"7662a0cc-faaa-47da-90f9-f3a8907a0401\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.637213    4730 request.go:700] Waited for 1.843014254s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress/serviceaccounts/router/token
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.653400    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-png7p\" (UniqueName: \"kubernetes.io/projected/18214bd2-9c3a-4737-885b-2b5c905311d8-kube-api-access-png7p\") pod \"router-default-5444994796-92dt7\" (UID: \"18214bd2-9c3a-4737-885b-2b5c905311d8\") " pod="openshift-ingress/router-default-5444994796-92dt7"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.682530    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7406e06c-cbde-48f9-b5e7-57a2a86b5a4d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xspkm\" (UID: \"7406e06c-cbde-48f9-b5e7-57a2a86b5a4d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xspkm"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.691667    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9zgg\" (UniqueName: \"kubernetes.io/projected/d784e9cd-d5af-496e-abca-ce30096bb0d0-kube-api-access-v9zgg\") pod \"authentication-operator-69f744f599-csmvr\" (UID: \"d784e9cd-d5af-496e-abca-ce30096bb0d0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-csmvr"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.695292    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-92dt7"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.711995    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjf27\" (UniqueName: \"kubernetes.io/projected/d4e38bce-6ae6-451b-aa9f-7a98dfa4d974-kube-api-access-rjf27\") pod \"openshift-config-operator-7777fb866f-6rbg9\" (UID: \"d4e38bce-6ae6-451b-aa9f-7a98dfa4d974\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6rbg9"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.731753    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qvx5\" (UniqueName: \"kubernetes.io/projected/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-kube-api-access-4qvx5\") pod \"console-f9d7485db-9kgl8\" (UID: \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\") " pod="openshift-console/console-f9d7485db-9kgl8"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.756311    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ggbs\" (UniqueName: \"kubernetes.io/projected/a835e0ec-4721-4824-8846-fcc7e12db3f9-kube-api-access-9ggbs\") pod \"apiserver-76f77b778f-jzx77\" (UID: \"a835e0ec-4721-4824-8846-fcc7e12db3f9\") " pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.784636    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7b9l\" (UniqueName: \"kubernetes.io/projected/5c0e41b3-aa2d-4083-acb2-f0f68a29fcce-kube-api-access-t7b9l\") pod \"machine-api-operator-5694c8668f-k6z2l\" (UID: \"5c0e41b3-aa2d-4083-acb2-f0f68a29fcce\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-k6z2l"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.792482    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.798384    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6xmg\" (UniqueName: \"kubernetes.io/projected/a250c56d-72fb-473d-98ce-c013e9d15b4a-kube-api-access-v6xmg\") pod \"dns-operator-744455d44c-nfww4\" (UID: \"a250c56d-72fb-473d-98ce-c013e9d15b4a\") " pod="openshift-dns-operator/dns-operator-744455d44c-nfww4"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.818862    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6vj7\" (UniqueName: \"kubernetes.io/projected/7d581333-2d6e-44d6-a6fc-b90c3b16baad-kube-api-access-q6vj7\") pod \"cluster-samples-operator-665b6dd947-kfjm5\" (UID: \"7d581333-2d6e-44d6-a6fc-b90c3b16baad\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kfjm5"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.832802    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kfjm5"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.839905    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb6sw\" (UniqueName: \"kubernetes.io/projected/2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf-kube-api-access-wb6sw\") pod \"route-controller-manager-6576b87f9c-tgpgm\" (UID: \"2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tgpgm"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.840428    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-9kgl8"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.860232    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq5zm\" (UniqueName: \"kubernetes.io/projected/9a38d833-db72-4566-b139-7788730a502a-kube-api-access-tq5zm\") pod \"controller-manager-879f6c89f-hrm7z\" (UID: \"9a38d833-db72-4566-b139-7788730a502a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hrm7z"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.869298    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-csmvr"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.874461    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64zrc\" (UniqueName: \"kubernetes.io/projected/e190e098-9bc8-492f-9657-f6ccfb836f23-kube-api-access-64zrc\") pod \"packageserver-d55dfcdfc-2qdqs\" (UID: \"e190e098-9bc8-492f-9657-f6ccfb836f23\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2qdqs"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.887125    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-nfww4"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.891446    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xspkm"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.904333    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5z67\" (UniqueName: \"kubernetes.io/projected/2499559b-b31f-4dab-89a0-964964dc596e-kube-api-access-l5z67\") pod \"oauth-openshift-558db77b4-st79s\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") " pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.917180    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftpl2\" (UniqueName: \"kubernetes.io/projected/2eb5b4cf-be45-4c3d-abbf-9a3e525d5ed3-kube-api-access-ftpl2\") pod \"console-operator-58897d9998-mkxg7\" (UID: \"2eb5b4cf-be45-4c3d-abbf-9a3e525d5ed3\") " pod="openshift-console-operator/console-operator-58897d9998-mkxg7"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.919820    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hrm7z"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.936912    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b24b\" (UniqueName: \"kubernetes.io/projected/ad1f04f2-f7c4-4bc6-9daf-0db7a0809206-kube-api-access-9b24b\") pod \"machine-approver-56656f9798-ldb64\" (UID: \"ad1f04f2-f7c4-4bc6-9daf-0db7a0809206\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ldb64"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.942056    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.952996    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-k6z2l"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.954353    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ck55\" (UniqueName: \"kubernetes.io/projected/0bc0c5b5-55bb-4339-8162-bb647b833006-kube-api-access-7ck55\") pod \"openshift-controller-manager-operator-756b6f6bc6-pncxq\" (UID: \"0bc0c5b5-55bb-4339-8162-bb647b833006\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pncxq"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.958797    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2qdqs"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.975044    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpv47\" (UniqueName: \"kubernetes.io/projected/d32c9cec-9f6c-4304-8bc9-d2e52128470a-kube-api-access-zpv47\") pod \"downloads-7954f5f757-g7hdt\" (UID: \"d32c9cec-9f6c-4304-8bc9-d2e52128470a\") " pod="openshift-console/downloads-7954f5f757-g7hdt"
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.981318    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"]
Mar 20 15:42:38 crc kubenswrapper[4730]: I0320 15:42:38.993625    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4lws\" (UniqueName: \"kubernetes.io/projected/ca0985ab-94f6-4f4e-b8b4-0ee710e01fcf-kube-api-access-v4lws\") pod \"openshift-apiserver-operator-796bbdcf4f-zr8dk\" (UID: \"ca0985ab-94f6-4f4e-b8b4-0ee710e01fcf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zr8dk"
Mar 20 15:42:39 crc kubenswrapper[4730]: W0320 15:42:39.001581    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7662a0cc_faaa_47da_90f9_f3a8907a0401.slice/crio-aca6a436c7afd8408d9c66f89c190d3570325670190e36d191782e17a273a429 WatchSource:0}: Error finding container aca6a436c7afd8408d9c66f89c190d3570325670190e36d191782e17a273a429: Status 404 returned error can't find the container with id aca6a436c7afd8408d9c66f89c190d3570325670190e36d191782e17a273a429
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.008537    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6rbg9"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.027790    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-bound-sa-token\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.027840    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.027862    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l78n7\" (UniqueName: \"kubernetes.io/projected/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-kube-api-access-l78n7\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.027889    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.027910    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.027947    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-registry-tls\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.027966    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-registry-certificates\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.028003    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-trusted-ca\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:39 crc kubenswrapper[4730]: E0320 15:42:39.028304    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:39.528293746 +0000 UTC m=+218.741665115 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.050468    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tgpgm"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.060473    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ldb64"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.104370    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-csmvr"]
Mar 20 15:42:39 crc kubenswrapper[4730]: W0320 15:42:39.114406    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad1f04f2_f7c4_4bc6_9daf_0db7a0809206.slice/crio-cadb9ce1c6e0e9f7fc23aed07ca259c9113f4c4927998e0ff33fed07306ab4ff WatchSource:0}: Error finding container cadb9ce1c6e0e9f7fc23aed07ca259c9113f4c4927998e0ff33fed07306ab4ff: Status 404 returned error can't find the container with id cadb9ce1c6e0e9f7fc23aed07ca259c9113f4c4927998e0ff33fed07306ab4ff
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.117342    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.121698    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zr8dk"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.130636    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.130925    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ac46477-04bc-4d0a-b28e-b687c690dd5a-config\") pod \"kube-apiserver-operator-766d6c64bb-qwghv\" (UID: \"9ac46477-04bc-4d0a-b28e-b687c690dd5a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwghv"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.130966    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afbc673a-2498-49dc-b98e-d7ddc58d2999-config\") pod \"kube-controller-manager-operator-78b949d7b-mzjxx\" (UID: \"afbc673a-2498-49dc-b98e-d7ddc58d2999\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mzjxx"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.131022    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a268a97-bf49-4ed6-b239-1a088c3c4e4f-cert\") pod \"ingress-canary-6w7m9\" (UID: \"8a268a97-bf49-4ed6-b239-1a088c3c4e4f\") " pod="openshift-ingress-canary/ingress-canary-6w7m9"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.131062    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-bound-sa-token\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.131093    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzkcx\" (UniqueName: \"kubernetes.io/projected/8e224294-495e-4d65-96f2-8e0d2a444ef1-kube-api-access-pzkcx\") pod \"machine-config-controller-84d6567774-7hzc8\" (UID: \"8e224294-495e-4d65-96f2-8e0d2a444ef1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7hzc8"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.131123    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0a030e24-2337-49a2-a5e2-118714cd7ff9-node-bootstrap-token\") pod \"machine-config-server-8n5gl\" (UID: \"0a030e24-2337-49a2-a5e2-118714cd7ff9\") " pod="openshift-machine-config-operator/machine-config-server-8n5gl"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.131156    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2d975f8-1a1e-4921-aef0-3c4652992a02-serving-cert\") pod \"etcd-operator-b45778765-w5gdn\" (UID: \"c2d975f8-1a1e-4921-aef0-3c4652992a02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5gdn"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.131196    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5612cc7f-9299-43b4-b97c-cf579a416e84-proxy-tls\") pod \"machine-config-operator-74547568cd-4dp6q\" (UID: \"5612cc7f-9299-43b4-b97c-cf579a416e84\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dp6q"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.131216    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e224294-495e-4d65-96f2-8e0d2a444ef1-proxy-tls\") pod \"machine-config-controller-84d6567774-7hzc8\" (UID: \"8e224294-495e-4d65-96f2-8e0d2a444ef1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7hzc8"
Mar 20 15:42:39 crc kubenswrapper[4730]: E0320 15:42:39.132193    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:39.632171824 +0000 UTC m=+218.845543193 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.133345    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l78n7\" (UniqueName: \"kubernetes.io/projected/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-kube-api-access-l78n7\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.133381    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-klbh8\" (UID: \"e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3\") " pod="openshift-marketplace/marketplace-operator-79b997595-klbh8"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.133444    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs4lb\" (UniqueName: \"kubernetes.io/projected/8a268a97-bf49-4ed6-b239-1a088c3c4e4f-kube-api-access-gs4lb\") pod \"ingress-canary-6w7m9\" (UID: \"8a268a97-bf49-4ed6-b239-1a088c3c4e4f\") " pod="openshift-ingress-canary/ingress-canary-6w7m9"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.133481    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.133516    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be19fb65-a04f-42df-9b96-e620b58754bb-secret-volume\") pod \"collect-profiles-29567010-d69bc\" (UID: \"be19fb65-a04f-42df-9b96-e620b58754bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-d69bc"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.133548    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt52k\" (UniqueName: \"kubernetes.io/projected/f6f09179-5752-4a5a-ab79-72a176bbdd9a-kube-api-access-rt52k\") pod \"olm-operator-6b444d44fb-m7fsc\" (UID: \"f6f09179-5752-4a5a-ab79-72a176bbdd9a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7fsc"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.133572    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dbb6ff6b-d521-408a-831c-a6a9c524a671-trusted-ca\") pod \"ingress-operator-5b745b69d9-fjhxb\" (UID: \"dbb6ff6b-d521-408a-831c-a6a9c524a671\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fjhxb"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.133592    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8nxk\" (UniqueName: \"kubernetes.io/projected/dbb6ff6b-d521-408a-831c-a6a9c524a671-kube-api-access-c8nxk\") pod \"ingress-operator-5b745b69d9-fjhxb\" (UID: \"dbb6ff6b-d521-408a-831c-a6a9c524a671\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fjhxb"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.133618    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/428fa435-b92e-4363-82bb-40316d3e0a26-registration-dir\") pod \"csi-hostpathplugin-jgzlv\" (UID: \"428fa435-b92e-4363-82bb-40316d3e0a26\") " pod="hostpath-provisioner/csi-hostpathplugin-jgzlv"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.133637    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6bfj\" (UniqueName: \"kubernetes.io/projected/1010c304-4912-42b2-aa8c-17d44c4bf6cb-kube-api-access-g6bfj\") pod \"service-ca-9c57cc56f-sxbnn\" (UID: \"1010c304-4912-42b2-aa8c-17d44c4bf6cb\") " pod="openshift-service-ca/service-ca-9c57cc56f-sxbnn"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.133684    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df4fa0ea-abb1-49ea-8d74-2992c71c1a0e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vqxff\" (UID: \"df4fa0ea-abb1-49ea-8d74-2992c71c1a0e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vqxff"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.133724    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.133772    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f6f09179-5752-4a5a-ab79-72a176bbdd9a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-m7fsc\" (UID: \"f6f09179-5752-4a5a-ab79-72a176bbdd9a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7fsc"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.133807    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f6f09179-5752-4a5a-ab79-72a176bbdd9a-srv-cert\") pod \"olm-operator-6b444d44fb-m7fsc\" (UID: \"f6f09179-5752-4a5a-ab79-72a176bbdd9a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7fsc"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.136017    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.137430    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c2d975f8-1a1e-4921-aef0-3c4652992a02-etcd-ca\") pod \"etcd-operator-b45778765-w5gdn\" (UID: \"c2d975f8-1a1e-4921-aef0-3c4652992a02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5gdn"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.137542    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/afbc673a-2498-49dc-b98e-d7ddc58d2999-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-mzjxx\" (UID: \"afbc673a-2498-49dc-b98e-d7ddc58d2999\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mzjxx"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.137575    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46gxc\" (UniqueName: \"kubernetes.io/projected/7d87adfe-3206-4175-8d8f-5a00015cc61e-kube-api-access-46gxc\") pod \"auto-csr-approver-29567022-wf5nv\" (UID: \"7d87adfe-3206-4175-8d8f-5a00015cc61e\") " pod="openshift-infra/auto-csr-approver-29567022-wf5nv"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.138210    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj2f6\" (UniqueName: \"kubernetes.io/projected/c9f80b42-cff3-48a7-9e09-02ff65e9d9f8-kube-api-access-zj2f6\") pod \"control-plane-machine-set-operator-78cbb6b69f-jkk9s\" (UID: \"c9f80b42-cff3-48a7-9e09-02ff65e9d9f8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jkk9s"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.138281    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d04de14b-8e96-44ab-818f-2b08d78d2e14-metrics-tls\") pod \"dns-default-7ckfm\" (UID: \"d04de14b-8e96-44ab-818f-2b08d78d2e14\") " pod="openshift-dns/dns-default-7ckfm"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.138322    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b82f2ec3-df30-4b45-be3a-9858edb2bb7f-config\") pod \"service-ca-operator-777779d784-c2cgf\" (UID: \"b82f2ec3-df30-4b45-be3a-9858edb2bb7f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c2cgf"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.138350    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4e2a7090-33b8-4137-be83-5c2e5ab1ccc7-profile-collector-cert\") pod \"catalog-operator-68c6474976-m7cfz\" (UID: \"4e2a7090-33b8-4137-be83-5c2e5ab1ccc7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m7cfz"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.138374    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9a5496e-57aa-4f42-b53d-590fb534d26e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-sk8hp\" (UID: \"d9a5496e-57aa-4f42-b53d-590fb534d26e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sk8hp"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.138397    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c2d975f8-1a1e-4921-aef0-3c4652992a02-etcd-service-ca\") pod \"etcd-operator-b45778765-w5gdn\" (UID: \"c2d975f8-1a1e-4921-aef0-3c4652992a02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5gdn"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.138460    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df4fa0ea-abb1-49ea-8d74-2992c71c1a0e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vqxff\" (UID: \"df4fa0ea-abb1-49ea-8d74-2992c71c1a0e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vqxff"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.138486    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7wjq\" (UniqueName: \"kubernetes.io/projected/d9a5496e-57aa-4f42-b53d-590fb534d26e-kube-api-access-r7wjq\") pod \"kube-storage-version-migrator-operator-b67b599dd-sk8hp\" (UID: \"d9a5496e-57aa-4f42-b53d-590fb534d26e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sk8hp"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.138513    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-klbh8\" (UID: \"e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3\") " pod="openshift-marketplace/marketplace-operator-79b997595-klbh8"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.138543    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-registry-certificates\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.138647    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ac46477-04bc-4d0a-b28e-b687c690dd5a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qwghv\" (UID: \"9ac46477-04bc-4d0a-b28e-b687c690dd5a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwghv"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.138677    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c2d975f8-1a1e-4921-aef0-3c4652992a02-etcd-client\") pod \"etcd-operator-b45778765-w5gdn\" (UID: \"c2d975f8-1a1e-4921-aef0-3c4652992a02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5gdn"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.138705    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2083343b-2ec0-4306-a0a5-f74dd0f63746-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-m4xlq\" (UID: \"2083343b-2ec0-4306-a0a5-f74dd0f63746\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m4xlq"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.138731    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxflj\" (UniqueName: \"kubernetes.io/projected/e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3-kube-api-access-jxflj\") pod \"marketplace-operator-79b997595-klbh8\" (UID: \"e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3\") " pod="openshift-marketplace/marketplace-operator-79b997595-klbh8"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.140612    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzvgx\" (UniqueName: \"kubernetes.io/projected/428fa435-b92e-4363-82bb-40316d3e0a26-kube-api-access-kzvgx\") pod \"csi-hostpathplugin-jgzlv\" (UID: \"428fa435-b92e-4363-82bb-40316d3e0a26\") " pod="hostpath-provisioner/csi-hostpathplugin-jgzlv"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.140656    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afbc673a-2498-49dc-b98e-d7ddc58d2999-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-mzjxx\" (UID: \"afbc673a-2498-49dc-b98e-d7ddc58d2999\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mzjxx"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.140887    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-trusted-ca\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.140965    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbnh9\" (UniqueName: \"kubernetes.io/projected/be19fb65-a04f-42df-9b96-e620b58754bb-kube-api-access-wbnh9\") pod \"collect-profiles-29567010-d69bc\" (UID: \"be19fb65-a04f-42df-9b96-e620b58754bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-d69bc"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.141107    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njkjz\" (UniqueName: \"kubernetes.io/projected/c2d975f8-1a1e-4921-aef0-3c4652992a02-kube-api-access-njkjz\") pod \"etcd-operator-b45778765-w5gdn\" (UID: \"c2d975f8-1a1e-4921-aef0-3c4652992a02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5gdn"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.141231    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-registry-certificates\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.141934    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4e2a7090-33b8-4137-be83-5c2e5ab1ccc7-srv-cert\") pod \"catalog-operator-68c6474976-m7cfz\" (UID: \"4e2a7090-33b8-4137-be83-5c2e5ab1ccc7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m7cfz"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.141958    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6cmt\" (UniqueName: \"kubernetes.io/projected/4e2a7090-33b8-4137-be83-5c2e5ab1ccc7-kube-api-access-b6cmt\") pod \"catalog-operator-68c6474976-m7cfz\" (UID: \"4e2a7090-33b8-4137-be83-5c2e5ab1ccc7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m7cfz"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.141974    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8jhd\" (UniqueName: \"kubernetes.io/projected/2083343b-2ec0-4306-a0a5-f74dd0f63746-kube-api-access-n8jhd\") pod \"package-server-manager-789f6589d5-m4xlq\" (UID: \"2083343b-2ec0-4306-a0a5-f74dd0f63746\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m4xlq"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.141989    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df4fa0ea-abb1-49ea-8d74-2992c71c1a0e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vqxff\" (UID: \"df4fa0ea-abb1-49ea-8d74-2992c71c1a0e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vqxff"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.142019    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/428fa435-b92e-4363-82bb-40316d3e0a26-plugins-dir\") pod \"csi-hostpathplugin-jgzlv\" (UID: \"428fa435-b92e-4363-82bb-40316d3e0a26\") " pod="hostpath-provisioner/csi-hostpathplugin-jgzlv"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.142052    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.146897    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-trusted-ca\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.147589    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.147828    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/428fa435-b92e-4363-82bb-40316d3e0a26-csi-data-dir\") pod \"csi-hostpathplugin-jgzlv\" (UID: \"428fa435-b92e-4363-82bb-40316d3e0a26\") " pod="hostpath-provisioner/csi-hostpathplugin-jgzlv"
Mar 20 15:42:39 crc kubenswrapper[4730]: E0320 15:42:39.148231    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:39.648210148 +0000 UTC m=+218.861581507 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.148313    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxgbn\" (UniqueName: \"kubernetes.io/projected/49896a92-a6b0-45ea-a736-09a368d90be4-kube-api-access-xxgbn\") pod \"migrator-59844c95c7-qxnn6\" (UID: \"49896a92-a6b0-45ea-a736-09a368d90be4\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qxnn6"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.149190    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be19fb65-a04f-42df-9b96-e620b58754bb-config-volume\") pod \"collect-profiles-29567010-d69bc\" (UID: \"be19fb65-a04f-42df-9b96-e620b58754bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-d69bc"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.149535    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9ac46477-04bc-4d0a-b28e-b687c690dd5a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qwghv\" (UID: \"9ac46477-04bc-4d0a-b28e-b687c690dd5a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwghv"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.149778    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e213a906-8ad6-45c1-b832-a42d58fd91c6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-84pdq\" (UID: \"e213a906-8ad6-45c1-b832-a42d58fd91c6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-84pdq"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.149804    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5612cc7f-9299-43b4-b97c-cf579a416e84-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4dp6q\" (UID: \"5612cc7f-9299-43b4-b97c-cf579a416e84\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dp6q"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.149834    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1010c304-4912-42b2-aa8c-17d44c4bf6cb-signing-cabundle\") pod \"service-ca-9c57cc56f-sxbnn\" (UID: \"1010c304-4912-42b2-aa8c-17d44c4bf6cb\") " pod="openshift-service-ca/service-ca-9c57cc56f-sxbnn"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.149865    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d04de14b-8e96-44ab-818f-2b08d78d2e14-config-volume\") pod \"dns-default-7ckfm\" (UID: \"d04de14b-8e96-44ab-818f-2b08d78d2e14\") " pod="openshift-dns/dns-default-7ckfm"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.150662    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/428fa435-b92e-4363-82bb-40316d3e0a26-socket-dir\") pod \"csi-hostpathplugin-jgzlv\" (UID: \"428fa435-b92e-4363-82bb-40316d3e0a26\") " pod="hostpath-provisioner/csi-hostpathplugin-jgzlv"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.150716    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/428fa435-b92e-4363-82bb-40316d3e0a26-mountpoint-dir\") pod \"csi-hostpathplugin-jgzlv\" (UID: \"428fa435-b92e-4363-82bb-40316d3e0a26\") " pod="hostpath-provisioner/csi-hostpathplugin-jgzlv"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.150996    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz2n6\" (UniqueName: \"kubernetes.io/projected/0a030e24-2337-49a2-a5e2-118714cd7ff9-kube-api-access-wz2n6\") pod \"machine-config-server-8n5gl\" (UID: \"0a030e24-2337-49a2-a5e2-118714cd7ff9\") " pod="openshift-machine-config-operator/machine-config-server-8n5gl"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.151433    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dbb6ff6b-d521-408a-831c-a6a9c524a671-bound-sa-token\") pod \"ingress-operator-5b745b69d9-fjhxb\" (UID: \"dbb6ff6b-d521-408a-831c-a6a9c524a671\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fjhxb"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.158400    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1010c304-4912-42b2-aa8c-17d44c4bf6cb-signing-key\") pod \"service-ca-9c57cc56f-sxbnn\" (UID: \"1010c304-4912-42b2-aa8c-17d44c4bf6cb\") " pod="openshift-service-ca/service-ca-9c57cc56f-sxbnn"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.158478    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r75nz\" (UniqueName: \"kubernetes.io/projected/d04de14b-8e96-44ab-818f-2b08d78d2e14-kube-api-access-r75nz\") pod \"dns-default-7ckfm\" (UID: \"d04de14b-8e96-44ab-818f-2b08d78d2e14\") " pod="openshift-dns/dns-default-7ckfm"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.158514    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2d975f8-1a1e-4921-aef0-3c4652992a02-config\") pod \"etcd-operator-b45778765-w5gdn\" (UID: \"c2d975f8-1a1e-4921-aef0-3c4652992a02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5gdn"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.158581    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b82f2ec3-df30-4b45-be3a-9858edb2bb7f-serving-cert\") pod \"service-ca-operator-777779d784-c2cgf\" (UID: \"b82f2ec3-df30-4b45-be3a-9858edb2bb7f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c2cgf"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.158728    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-g7hdt"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.158904    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c9f80b42-cff3-48a7-9e09-02ff65e9d9f8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jkk9s\" (UID: \"c9f80b42-cff3-48a7-9e09-02ff65e9d9f8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jkk9s"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.158941    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dbb6ff6b-d521-408a-831c-a6a9c524a671-metrics-tls\") pod \"ingress-operator-5b745b69d9-fjhxb\" (UID: \"dbb6ff6b-d521-408a-831c-a6a9c524a671\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fjhxb"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.159049    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkbfb\" (UniqueName: \"kubernetes.io/projected/e213a906-8ad6-45c1-b832-a42d58fd91c6-kube-api-access-bkbfb\") pod \"multus-admission-controller-857f4d67dd-84pdq\" (UID: \"e213a906-8ad6-45c1-b832-a42d58fd91c6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-84pdq"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.160198    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-registry-tls\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.161507    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9a5496e-57aa-4f42-b53d-590fb534d26e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-sk8hp\" (UID: \"d9a5496e-57aa-4f42-b53d-590fb534d26e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sk8hp"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.161634    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-mkxg7"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.161945    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5612cc7f-9299-43b4-b97c-cf579a416e84-images\") pod \"machine-config-operator-74547568cd-4dp6q\" (UID: \"5612cc7f-9299-43b4-b97c-cf579a416e84\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dp6q"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.162498    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbshk\" (UniqueName: \"kubernetes.io/projected/5612cc7f-9299-43b4-b97c-cf579a416e84-kube-api-access-fbshk\") pod \"machine-config-operator-74547568cd-4dp6q\" (UID: \"5612cc7f-9299-43b4-b97c-cf579a416e84\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dp6q"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.163074    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8e224294-495e-4d65-96f2-8e0d2a444ef1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-7hzc8\" (UID: \"8e224294-495e-4d65-96f2-8e0d2a444ef1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7hzc8"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.163108    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0a030e24-2337-49a2-a5e2-118714cd7ff9-certs\") pod \"machine-config-server-8n5gl\" (UID: \"0a030e24-2337-49a2-a5e2-118714cd7ff9\") " pod="openshift-machine-config-operator/machine-config-server-8n5gl"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.163171    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27wgn\" (UniqueName: \"kubernetes.io/projected/b82f2ec3-df30-4b45-be3a-9858edb2bb7f-kube-api-access-27wgn\") pod \"service-ca-operator-777779d784-c2cgf\" (UID: \"b82f2ec3-df30-4b45-be3a-9858edb2bb7f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c2cgf"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.163231    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-bound-sa-token\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.174227    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-registry-tls\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.176631    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pncxq"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.177479    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l78n7\" (UniqueName: \"kubernetes.io/projected/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-kube-api-access-l78n7\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.236660    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ldb64" event={"ID":"ad1f04f2-f7c4-4bc6-9daf-0db7a0809206","Type":"ContainerStarted","Data":"cadb9ce1c6e0e9f7fc23aed07ca259c9113f4c4927998e0ff33fed07306ab4ff"}
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.242199    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jzx77"]
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.245259    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-csmvr" event={"ID":"d784e9cd-d5af-496e-abca-ce30096bb0d0","Type":"ContainerStarted","Data":"ca69c531049adbce31f8306da31b6663856abafbfe3b5b42a37934246712933c"}
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.246895    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7" event={"ID":"7662a0cc-faaa-47da-90f9-f3a8907a0401","Type":"ContainerStarted","Data":"aca6a436c7afd8408d9c66f89c190d3570325670190e36d191782e17a273a429"}
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.248320    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-92dt7" event={"ID":"18214bd2-9c3a-4737-885b-2b5c905311d8","Type":"ContainerStarted","Data":"abcdbf5a3476ecc0ddcea4614869525f19124592633fbe93a76d20aee50f11ac"}
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.248350    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-92dt7" event={"ID":"18214bd2-9c3a-4737-885b-2b5c905311d8","Type":"ContainerStarted","Data":"d2f20db5f347dbd8ba3429cb3c60dcc2ce2ee189be84df39412513494c4307a4"}
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.264504    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:39 crc kubenswrapper[4730]: E0320 15:42:39.264635    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:39.764615772 +0000 UTC m=+218.977987141 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.264780    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7wjq\" (UniqueName: \"kubernetes.io/projected/d9a5496e-57aa-4f42-b53d-590fb534d26e-kube-api-access-r7wjq\") pod \"kube-storage-version-migrator-operator-b67b599dd-sk8hp\" (UID: \"d9a5496e-57aa-4f42-b53d-590fb534d26e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sk8hp"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.264812    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-klbh8\" (UID: \"e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3\") " pod="openshift-marketplace/marketplace-operator-79b997595-klbh8"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.264840    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ac46477-04bc-4d0a-b28e-b687c690dd5a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qwghv\" (UID: \"9ac46477-04bc-4d0a-b28e-b687c690dd5a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwghv"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.264865    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c2d975f8-1a1e-4921-aef0-3c4652992a02-etcd-client\") pod \"etcd-operator-b45778765-w5gdn\" (UID: \"c2d975f8-1a1e-4921-aef0-3c4652992a02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5gdn"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.264888    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzvgx\" (UniqueName: \"kubernetes.io/projected/428fa435-b92e-4363-82bb-40316d3e0a26-kube-api-access-kzvgx\") pod \"csi-hostpathplugin-jgzlv\" (UID: \"428fa435-b92e-4363-82bb-40316d3e0a26\") " pod="hostpath-provisioner/csi-hostpathplugin-jgzlv"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.264909    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afbc673a-2498-49dc-b98e-d7ddc58d2999-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-mzjxx\" (UID: \"afbc673a-2498-49dc-b98e-d7ddc58d2999\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mzjxx"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.264932    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2083343b-2ec0-4306-a0a5-f74dd0f63746-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-m4xlq\" (UID: \"2083343b-2ec0-4306-a0a5-f74dd0f63746\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m4xlq"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.264954    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxflj\" (UniqueName: \"kubernetes.io/projected/e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3-kube-api-access-jxflj\") pod \"marketplace-operator-79b997595-klbh8\" (UID: \"e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3\") " pod="openshift-marketplace/marketplace-operator-79b997595-klbh8"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.264980    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njkjz\" (UniqueName: \"kubernetes.io/projected/c2d975f8-1a1e-4921-aef0-3c4652992a02-kube-api-access-njkjz\") pod \"etcd-operator-b45778765-w5gdn\" (UID: \"c2d975f8-1a1e-4921-aef0-3c4652992a02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5gdn"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265005    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbnh9\" (UniqueName: \"kubernetes.io/projected/be19fb65-a04f-42df-9b96-e620b58754bb-kube-api-access-wbnh9\") pod \"collect-profiles-29567010-d69bc\" (UID: \"be19fb65-a04f-42df-9b96-e620b58754bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-d69bc"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265029    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4e2a7090-33b8-4137-be83-5c2e5ab1ccc7-srv-cert\") pod \"catalog-operator-68c6474976-m7cfz\" (UID: \"4e2a7090-33b8-4137-be83-5c2e5ab1ccc7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m7cfz"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265048    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6cmt\" (UniqueName: \"kubernetes.io/projected/4e2a7090-33b8-4137-be83-5c2e5ab1ccc7-kube-api-access-b6cmt\") pod \"catalog-operator-68c6474976-m7cfz\" (UID: \"4e2a7090-33b8-4137-be83-5c2e5ab1ccc7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m7cfz"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265071    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8jhd\" (UniqueName: \"kubernetes.io/projected/2083343b-2ec0-4306-a0a5-f74dd0f63746-kube-api-access-n8jhd\") pod \"package-server-manager-789f6589d5-m4xlq\" (UID: \"2083343b-2ec0-4306-a0a5-f74dd0f63746\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m4xlq"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265092    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/428fa435-b92e-4363-82bb-40316d3e0a26-plugins-dir\") pod \"csi-hostpathplugin-jgzlv\" (UID: \"428fa435-b92e-4363-82bb-40316d3e0a26\") " pod="hostpath-provisioner/csi-hostpathplugin-jgzlv"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265115    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df4fa0ea-abb1-49ea-8d74-2992c71c1a0e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vqxff\" (UID: \"df4fa0ea-abb1-49ea-8d74-2992c71c1a0e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vqxff"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265141    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265163    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/428fa435-b92e-4363-82bb-40316d3e0a26-csi-data-dir\") pod \"csi-hostpathplugin-jgzlv\" (UID: \"428fa435-b92e-4363-82bb-40316d3e0a26\") " pod="hostpath-provisioner/csi-hostpathplugin-jgzlv"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265186    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxgbn\" (UniqueName: \"kubernetes.io/projected/49896a92-a6b0-45ea-a736-09a368d90be4-kube-api-access-xxgbn\") pod \"migrator-59844c95c7-qxnn6\" (UID: \"49896a92-a6b0-45ea-a736-09a368d90be4\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qxnn6"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265209    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be19fb65-a04f-42df-9b96-e620b58754bb-config-volume\") pod \"collect-profiles-29567010-d69bc\" (UID: \"be19fb65-a04f-42df-9b96-e620b58754bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-d69bc"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265231    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9ac46477-04bc-4d0a-b28e-b687c690dd5a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qwghv\" (UID: \"9ac46477-04bc-4d0a-b28e-b687c690dd5a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwghv"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265277    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e213a906-8ad6-45c1-b832-a42d58fd91c6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-84pdq\" (UID: \"e213a906-8ad6-45c1-b832-a42d58fd91c6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-84pdq"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265301    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5612cc7f-9299-43b4-b97c-cf579a416e84-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4dp6q\" (UID: \"5612cc7f-9299-43b4-b97c-cf579a416e84\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dp6q"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265324    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/428fa435-b92e-4363-82bb-40316d3e0a26-socket-dir\") pod \"csi-hostpathplugin-jgzlv\" (UID: \"428fa435-b92e-4363-82bb-40316d3e0a26\") " pod="hostpath-provisioner/csi-hostpathplugin-jgzlv"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265344    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/428fa435-b92e-4363-82bb-40316d3e0a26-mountpoint-dir\") pod \"csi-hostpathplugin-jgzlv\" (UID: \"428fa435-b92e-4363-82bb-40316d3e0a26\") " pod="hostpath-provisioner/csi-hostpathplugin-jgzlv"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265367    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1010c304-4912-42b2-aa8c-17d44c4bf6cb-signing-cabundle\") pod \"service-ca-9c57cc56f-sxbnn\" (UID: \"1010c304-4912-42b2-aa8c-17d44c4bf6cb\") " pod="openshift-service-ca/service-ca-9c57cc56f-sxbnn"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265390    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d04de14b-8e96-44ab-818f-2b08d78d2e14-config-volume\") pod \"dns-default-7ckfm\" (UID: \"d04de14b-8e96-44ab-818f-2b08d78d2e14\") " pod="openshift-dns/dns-default-7ckfm"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265416    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dbb6ff6b-d521-408a-831c-a6a9c524a671-bound-sa-token\") pod \"ingress-operator-5b745b69d9-fjhxb\" (UID: \"dbb6ff6b-d521-408a-831c-a6a9c524a671\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fjhxb"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265437    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz2n6\" (UniqueName: \"kubernetes.io/projected/0a030e24-2337-49a2-a5e2-118714cd7ff9-kube-api-access-wz2n6\") pod \"machine-config-server-8n5gl\" (UID: \"0a030e24-2337-49a2-a5e2-118714cd7ff9\") " pod="openshift-machine-config-operator/machine-config-server-8n5gl"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265474    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1010c304-4912-42b2-aa8c-17d44c4bf6cb-signing-key\") pod \"service-ca-9c57cc56f-sxbnn\" (UID: \"1010c304-4912-42b2-aa8c-17d44c4bf6cb\") " pod="openshift-service-ca/service-ca-9c57cc56f-sxbnn"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265496    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r75nz\" (UniqueName: \"kubernetes.io/projected/d04de14b-8e96-44ab-818f-2b08d78d2e14-kube-api-access-r75nz\") pod \"dns-default-7ckfm\" (UID: \"d04de14b-8e96-44ab-818f-2b08d78d2e14\") " pod="openshift-dns/dns-default-7ckfm"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265519    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2d975f8-1a1e-4921-aef0-3c4652992a02-config\") pod \"etcd-operator-b45778765-w5gdn\" (UID: \"c2d975f8-1a1e-4921-aef0-3c4652992a02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5gdn"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265551    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b82f2ec3-df30-4b45-be3a-9858edb2bb7f-serving-cert\") pod \"service-ca-operator-777779d784-c2cgf\" (UID: \"b82f2ec3-df30-4b45-be3a-9858edb2bb7f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c2cgf"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265576    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c9f80b42-cff3-48a7-9e09-02ff65e9d9f8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jkk9s\" (UID: \"c9f80b42-cff3-48a7-9e09-02ff65e9d9f8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jkk9s"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265600    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dbb6ff6b-d521-408a-831c-a6a9c524a671-metrics-tls\") pod \"ingress-operator-5b745b69d9-fjhxb\" (UID: \"dbb6ff6b-d521-408a-831c-a6a9c524a671\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fjhxb"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265621    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkbfb\" (UniqueName: \"kubernetes.io/projected/e213a906-8ad6-45c1-b832-a42d58fd91c6-kube-api-access-bkbfb\") pod \"multus-admission-controller-857f4d67dd-84pdq\" (UID: \"e213a906-8ad6-45c1-b832-a42d58fd91c6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-84pdq"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265643    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9a5496e-57aa-4f42-b53d-590fb534d26e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-sk8hp\" (UID: \"d9a5496e-57aa-4f42-b53d-590fb534d26e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sk8hp"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265662    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5612cc7f-9299-43b4-b97c-cf579a416e84-images\") pod \"machine-config-operator-74547568cd-4dp6q\" (UID: \"5612cc7f-9299-43b4-b97c-cf579a416e84\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dp6q"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265664    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/428fa435-b92e-4363-82bb-40316d3e0a26-plugins-dir\") pod \"csi-hostpathplugin-jgzlv\" (UID: \"428fa435-b92e-4363-82bb-40316d3e0a26\") " pod="hostpath-provisioner/csi-hostpathplugin-jgzlv"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265681    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbshk\" (UniqueName: \"kubernetes.io/projected/5612cc7f-9299-43b4-b97c-cf579a416e84-kube-api-access-fbshk\") pod \"machine-config-operator-74547568cd-4dp6q\" (UID: \"5612cc7f-9299-43b4-b97c-cf579a416e84\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dp6q"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265711    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8e224294-495e-4d65-96f2-8e0d2a444ef1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-7hzc8\" (UID: \"8e224294-495e-4d65-96f2-8e0d2a444ef1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7hzc8"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265730    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0a030e24-2337-49a2-a5e2-118714cd7ff9-certs\") pod \"machine-config-server-8n5gl\" (UID: \"0a030e24-2337-49a2-a5e2-118714cd7ff9\") " pod="openshift-machine-config-operator/machine-config-server-8n5gl"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265751    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27wgn\" (UniqueName: \"kubernetes.io/projected/b82f2ec3-df30-4b45-be3a-9858edb2bb7f-kube-api-access-27wgn\") pod \"service-ca-operator-777779d784-c2cgf\" (UID: \"b82f2ec3-df30-4b45-be3a-9858edb2bb7f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c2cgf"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265781    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ac46477-04bc-4d0a-b28e-b687c690dd5a-config\") pod \"kube-apiserver-operator-766d6c64bb-qwghv\" (UID: \"9ac46477-04bc-4d0a-b28e-b687c690dd5a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwghv"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265802    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afbc673a-2498-49dc-b98e-d7ddc58d2999-config\") pod \"kube-controller-manager-operator-78b949d7b-mzjxx\" (UID: \"afbc673a-2498-49dc-b98e-d7ddc58d2999\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mzjxx"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265821    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a268a97-bf49-4ed6-b239-1a088c3c4e4f-cert\") pod \"ingress-canary-6w7m9\" (UID: \"8a268a97-bf49-4ed6-b239-1a088c3c4e4f\") " pod="openshift-ingress-canary/ingress-canary-6w7m9"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265844    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzkcx\" (UniqueName: \"kubernetes.io/projected/8e224294-495e-4d65-96f2-8e0d2a444ef1-kube-api-access-pzkcx\") pod \"machine-config-controller-84d6567774-7hzc8\" (UID: \"8e224294-495e-4d65-96f2-8e0d2a444ef1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7hzc8"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265866    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0a030e24-2337-49a2-a5e2-118714cd7ff9-node-bootstrap-token\") pod \"machine-config-server-8n5gl\" (UID: \"0a030e24-2337-49a2-a5e2-118714cd7ff9\") " pod="openshift-machine-config-operator/machine-config-server-8n5gl"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265887    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2d975f8-1a1e-4921-aef0-3c4652992a02-serving-cert\") pod \"etcd-operator-b45778765-w5gdn\" (UID: \"c2d975f8-1a1e-4921-aef0-3c4652992a02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5gdn"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265909    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5612cc7f-9299-43b4-b97c-cf579a416e84-proxy-tls\") pod \"machine-config-operator-74547568cd-4dp6q\" (UID: \"5612cc7f-9299-43b4-b97c-cf579a416e84\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dp6q"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265929    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e224294-495e-4d65-96f2-8e0d2a444ef1-proxy-tls\") pod \"machine-config-controller-84d6567774-7hzc8\" (UID: \"8e224294-495e-4d65-96f2-8e0d2a444ef1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7hzc8"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265968    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-klbh8\" (UID: \"e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3\") " pod="openshift-marketplace/marketplace-operator-79b997595-klbh8"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.265995    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs4lb\" (UniqueName: \"kubernetes.io/projected/8a268a97-bf49-4ed6-b239-1a088c3c4e4f-kube-api-access-gs4lb\") pod \"ingress-canary-6w7m9\" (UID: \"8a268a97-bf49-4ed6-b239-1a088c3c4e4f\") " pod="openshift-ingress-canary/ingress-canary-6w7m9"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.266019    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be19fb65-a04f-42df-9b96-e620b58754bb-secret-volume\") pod \"collect-profiles-29567010-d69bc\" (UID: \"be19fb65-a04f-42df-9b96-e620b58754bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-d69bc"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.266042    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt52k\" (UniqueName: \"kubernetes.io/projected/f6f09179-5752-4a5a-ab79-72a176bbdd9a-kube-api-access-rt52k\") pod \"olm-operator-6b444d44fb-m7fsc\" (UID: \"f6f09179-5752-4a5a-ab79-72a176bbdd9a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7fsc"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.266064    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/428fa435-b92e-4363-82bb-40316d3e0a26-registration-dir\") pod \"csi-hostpathplugin-jgzlv\" (UID: \"428fa435-b92e-4363-82bb-40316d3e0a26\") " pod="hostpath-provisioner/csi-hostpathplugin-jgzlv"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.266085    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dbb6ff6b-d521-408a-831c-a6a9c524a671-trusted-ca\") pod \"ingress-operator-5b745b69d9-fjhxb\" (UID: \"dbb6ff6b-d521-408a-831c-a6a9c524a671\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fjhxb"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.266107    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8nxk\" (UniqueName: \"kubernetes.io/projected/dbb6ff6b-d521-408a-831c-a6a9c524a671-kube-api-access-c8nxk\") pod \"ingress-operator-5b745b69d9-fjhxb\" (UID: \"dbb6ff6b-d521-408a-831c-a6a9c524a671\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fjhxb"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.266129    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6bfj\" (UniqueName: \"kubernetes.io/projected/1010c304-4912-42b2-aa8c-17d44c4bf6cb-kube-api-access-g6bfj\") pod \"service-ca-9c57cc56f-sxbnn\" (UID: \"1010c304-4912-42b2-aa8c-17d44c4bf6cb\") " pod="openshift-service-ca/service-ca-9c57cc56f-sxbnn"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.266152    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df4fa0ea-abb1-49ea-8d74-2992c71c1a0e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vqxff\" (UID: \"df4fa0ea-abb1-49ea-8d74-2992c71c1a0e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vqxff"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.266176    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f6f09179-5752-4a5a-ab79-72a176bbdd9a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-m7fsc\" (UID: \"f6f09179-5752-4a5a-ab79-72a176bbdd9a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7fsc"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.266197    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f6f09179-5752-4a5a-ab79-72a176bbdd9a-srv-cert\") pod \"olm-operator-6b444d44fb-m7fsc\" (UID: \"f6f09179-5752-4a5a-ab79-72a176bbdd9a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7fsc"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.266218    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/afbc673a-2498-49dc-b98e-d7ddc58d2999-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-mzjxx\" (UID: \"afbc673a-2498-49dc-b98e-d7ddc58d2999\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mzjxx"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.266239    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46gxc\" (UniqueName: \"kubernetes.io/projected/7d87adfe-3206-4175-8d8f-5a00015cc61e-kube-api-access-46gxc\") pod \"auto-csr-approver-29567022-wf5nv\" (UID: \"7d87adfe-3206-4175-8d8f-5a00015cc61e\") " pod="openshift-infra/auto-csr-approver-29567022-wf5nv"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.266297    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c2d975f8-1a1e-4921-aef0-3c4652992a02-etcd-ca\") pod \"etcd-operator-b45778765-w5gdn\" (UID: \"c2d975f8-1a1e-4921-aef0-3c4652992a02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5gdn"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.266333    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj2f6\" (UniqueName: \"kubernetes.io/projected/c9f80b42-cff3-48a7-9e09-02ff65e9d9f8-kube-api-access-zj2f6\") pod \"control-plane-machine-set-operator-78cbb6b69f-jkk9s\" (UID: \"c9f80b42-cff3-48a7-9e09-02ff65e9d9f8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jkk9s"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.266355    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d04de14b-8e96-44ab-818f-2b08d78d2e14-metrics-tls\") pod \"dns-default-7ckfm\" (UID: \"d04de14b-8e96-44ab-818f-2b08d78d2e14\") " pod="openshift-dns/dns-default-7ckfm"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.266381    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b82f2ec3-df30-4b45-be3a-9858edb2bb7f-config\") pod \"service-ca-operator-777779d784-c2cgf\" (UID: \"b82f2ec3-df30-4b45-be3a-9858edb2bb7f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c2cgf"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.266402    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4e2a7090-33b8-4137-be83-5c2e5ab1ccc7-profile-collector-cert\") pod \"catalog-operator-68c6474976-m7cfz\" (UID: \"4e2a7090-33b8-4137-be83-5c2e5ab1ccc7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m7cfz"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.266422    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9a5496e-57aa-4f42-b53d-590fb534d26e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-sk8hp\" (UID: \"d9a5496e-57aa-4f42-b53d-590fb534d26e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sk8hp"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.266464    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c2d975f8-1a1e-4921-aef0-3c4652992a02-etcd-service-ca\") pod \"etcd-operator-b45778765-w5gdn\" (UID: \"c2d975f8-1a1e-4921-aef0-3c4652992a02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5gdn"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.266491    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df4fa0ea-abb1-49ea-8d74-2992c71c1a0e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vqxff\" (UID: \"df4fa0ea-abb1-49ea-8d74-2992c71c1a0e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vqxff"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.267187    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5612cc7f-9299-43b4-b97c-cf579a416e84-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4dp6q\" (UID: \"5612cc7f-9299-43b4-b97c-cf579a416e84\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dp6q"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.267424    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/428fa435-b92e-4363-82bb-40316d3e0a26-socket-dir\") pod \"csi-hostpathplugin-jgzlv\" (UID: \"428fa435-b92e-4363-82bb-40316d3e0a26\") " pod="hostpath-provisioner/csi-hostpathplugin-jgzlv"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.267482    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/428fa435-b92e-4363-82bb-40316d3e0a26-mountpoint-dir\") pod \"csi-hostpathplugin-jgzlv\" (UID: \"428fa435-b92e-4363-82bb-40316d3e0a26\") " pod="hostpath-provisioner/csi-hostpathplugin-jgzlv"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.268333    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/428fa435-b92e-4363-82bb-40316d3e0a26-csi-data-dir\") pod \"csi-hostpathplugin-jgzlv\" (UID: \"428fa435-b92e-4363-82bb-40316d3e0a26\") " pod="hostpath-provisioner/csi-hostpathplugin-jgzlv"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.268890    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df4fa0ea-abb1-49ea-8d74-2992c71c1a0e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vqxff\" (UID: \"df4fa0ea-abb1-49ea-8d74-2992c71c1a0e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vqxff"
Mar 20 15:42:39 crc kubenswrapper[4730]: E0320 15:42:39.269153    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:39.769139231 +0000 UTC m=+218.982510640 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.269962    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1010c304-4912-42b2-aa8c-17d44c4bf6cb-signing-cabundle\") pod \"service-ca-9c57cc56f-sxbnn\" (UID: \"1010c304-4912-42b2-aa8c-17d44c4bf6cb\") " pod="openshift-service-ca/service-ca-9c57cc56f-sxbnn"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.270632    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d04de14b-8e96-44ab-818f-2b08d78d2e14-config-volume\") pod \"dns-default-7ckfm\" (UID: \"d04de14b-8e96-44ab-818f-2b08d78d2e14\") " pod="openshift-dns/dns-default-7ckfm"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.271386    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be19fb65-a04f-42df-9b96-e620b58754bb-config-volume\") pod \"collect-profiles-29567010-d69bc\" (UID: \"be19fb65-a04f-42df-9b96-e620b58754bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-d69bc"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.273191    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c2d975f8-1a1e-4921-aef0-3c4652992a02-etcd-client\") pod \"etcd-operator-b45778765-w5gdn\" (UID: \"c2d975f8-1a1e-4921-aef0-3c4652992a02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5gdn"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.273292    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afbc673a-2498-49dc-b98e-d7ddc58d2999-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-mzjxx\" (UID: \"afbc673a-2498-49dc-b98e-d7ddc58d2999\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mzjxx"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.273320    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/428fa435-b92e-4363-82bb-40316d3e0a26-registration-dir\") pod \"csi-hostpathplugin-jgzlv\" (UID: \"428fa435-b92e-4363-82bb-40316d3e0a26\") " pod="hostpath-provisioner/csi-hostpathplugin-jgzlv"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.275398    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e224294-495e-4d65-96f2-8e0d2a444ef1-proxy-tls\") pod \"machine-config-controller-84d6567774-7hzc8\" (UID: \"8e224294-495e-4d65-96f2-8e0d2a444ef1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7hzc8"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.276151    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2083343b-2ec0-4306-a0a5-f74dd0f63746-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-m4xlq\" (UID: \"2083343b-2ec0-4306-a0a5-f74dd0f63746\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m4xlq"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.276869    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1010c304-4912-42b2-aa8c-17d44c4bf6cb-signing-key\") pod \"service-ca-9c57cc56f-sxbnn\" (UID: \"1010c304-4912-42b2-aa8c-17d44c4bf6cb\") " pod="openshift-service-ca/service-ca-9c57cc56f-sxbnn"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.277147    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2d975f8-1a1e-4921-aef0-3c4652992a02-config\") pod \"etcd-operator-b45778765-w5gdn\" (UID: \"c2d975f8-1a1e-4921-aef0-3c4652992a02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5gdn"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.277937    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f6f09179-5752-4a5a-ab79-72a176bbdd9a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-m7fsc\" (UID: \"f6f09179-5752-4a5a-ab79-72a176bbdd9a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7fsc"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.278745    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e213a906-8ad6-45c1-b832-a42d58fd91c6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-84pdq\" (UID: \"e213a906-8ad6-45c1-b832-a42d58fd91c6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-84pdq"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.278773    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f6f09179-5752-4a5a-ab79-72a176bbdd9a-srv-cert\") pod \"olm-operator-6b444d44fb-m7fsc\" (UID: \"f6f09179-5752-4a5a-ab79-72a176bbdd9a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7fsc"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.279386    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dbb6ff6b-d521-408a-831c-a6a9c524a671-trusted-ca\") pod \"ingress-operator-5b745b69d9-fjhxb\" (UID: \"dbb6ff6b-d521-408a-831c-a6a9c524a671\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fjhxb"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.279520    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c2d975f8-1a1e-4921-aef0-3c4652992a02-etcd-ca\") pod \"etcd-operator-b45778765-w5gdn\" (UID: \"c2d975f8-1a1e-4921-aef0-3c4652992a02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5gdn"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.279704    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be19fb65-a04f-42df-9b96-e620b58754bb-secret-volume\") pod \"collect-profiles-29567010-d69bc\" (UID: \"be19fb65-a04f-42df-9b96-e620b58754bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-d69bc"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.280623    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9a5496e-57aa-4f42-b53d-590fb534d26e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-sk8hp\" (UID: \"d9a5496e-57aa-4f42-b53d-590fb534d26e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sk8hp"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.281404    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4e2a7090-33b8-4137-be83-5c2e5ab1ccc7-srv-cert\") pod \"catalog-operator-68c6474976-m7cfz\" (UID: \"4e2a7090-33b8-4137-be83-5c2e5ab1ccc7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m7cfz"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.282056    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b82f2ec3-df30-4b45-be3a-9858edb2bb7f-serving-cert\") pod \"service-ca-operator-777779d784-c2cgf\" (UID: \"b82f2ec3-df30-4b45-be3a-9858edb2bb7f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c2cgf"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.285114    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ac46477-04bc-4d0a-b28e-b687c690dd5a-config\") pod \"kube-apiserver-operator-766d6c64bb-qwghv\" (UID: \"9ac46477-04bc-4d0a-b28e-b687c690dd5a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwghv"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.286008    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c9f80b42-cff3-48a7-9e09-02ff65e9d9f8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jkk9s\" (UID: \"c9f80b42-cff3-48a7-9e09-02ff65e9d9f8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jkk9s"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.286619    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5612cc7f-9299-43b4-b97c-cf579a416e84-images\") pod \"machine-config-operator-74547568cd-4dp6q\" (UID: \"5612cc7f-9299-43b4-b97c-cf579a416e84\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dp6q"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.286744    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-klbh8\" (UID: \"e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3\") " pod="openshift-marketplace/marketplace-operator-79b997595-klbh8"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.287421    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8e224294-495e-4d65-96f2-8e0d2a444ef1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-7hzc8\" (UID: \"8e224294-495e-4d65-96f2-8e0d2a444ef1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7hzc8"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.289901    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4e2a7090-33b8-4137-be83-5c2e5ab1ccc7-profile-collector-cert\") pod \"catalog-operator-68c6474976-m7cfz\" (UID: \"4e2a7090-33b8-4137-be83-5c2e5ab1ccc7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m7cfz"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.290205    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b82f2ec3-df30-4b45-be3a-9858edb2bb7f-config\") pod \"service-ca-operator-777779d784-c2cgf\" (UID: \"b82f2ec3-df30-4b45-be3a-9858edb2bb7f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c2cgf"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.290716    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0a030e24-2337-49a2-a5e2-118714cd7ff9-certs\") pod \"machine-config-server-8n5gl\" (UID: \"0a030e24-2337-49a2-a5e2-118714cd7ff9\") " pod="openshift-machine-config-operator/machine-config-server-8n5gl"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.291644    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0a030e24-2337-49a2-a5e2-118714cd7ff9-node-bootstrap-token\") pod \"machine-config-server-8n5gl\" (UID: \"0a030e24-2337-49a2-a5e2-118714cd7ff9\") " pod="openshift-machine-config-operator/machine-config-server-8n5gl"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.295334    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9a5496e-57aa-4f42-b53d-590fb534d26e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-sk8hp\" (UID: \"d9a5496e-57aa-4f42-b53d-590fb534d26e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sk8hp"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.296105    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c2d975f8-1a1e-4921-aef0-3c4652992a02-etcd-service-ca\") pod \"etcd-operator-b45778765-w5gdn\" (UID: \"c2d975f8-1a1e-4921-aef0-3c4652992a02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5gdn"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.296581    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d04de14b-8e96-44ab-818f-2b08d78d2e14-metrics-tls\") pod \"dns-default-7ckfm\" (UID: \"d04de14b-8e96-44ab-818f-2b08d78d2e14\") " pod="openshift-dns/dns-default-7ckfm"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.296878    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df4fa0ea-abb1-49ea-8d74-2992c71c1a0e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vqxff\" (UID: \"df4fa0ea-abb1-49ea-8d74-2992c71c1a0e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vqxff"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.296924    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afbc673a-2498-49dc-b98e-d7ddc58d2999-config\") pod \"kube-controller-manager-operator-78b949d7b-mzjxx\" (UID: \"afbc673a-2498-49dc-b98e-d7ddc58d2999\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mzjxx"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.297112    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-klbh8\" (UID: \"e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3\") " pod="openshift-marketplace/marketplace-operator-79b997595-klbh8"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.298407    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ac46477-04bc-4d0a-b28e-b687c690dd5a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qwghv\" (UID: \"9ac46477-04bc-4d0a-b28e-b687c690dd5a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwghv"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.298586    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2d975f8-1a1e-4921-aef0-3c4652992a02-serving-cert\") pod \"etcd-operator-b45778765-w5gdn\" (UID: \"c2d975f8-1a1e-4921-aef0-3c4652992a02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5gdn"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.299211    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a268a97-bf49-4ed6-b239-1a088c3c4e4f-cert\") pod \"ingress-canary-6w7m9\" (UID: \"8a268a97-bf49-4ed6-b239-1a088c3c4e4f\") " pod="openshift-ingress-canary/ingress-canary-6w7m9"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.310544    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dbb6ff6b-d521-408a-831c-a6a9c524a671-metrics-tls\") pod \"ingress-operator-5b745b69d9-fjhxb\" (UID: \"dbb6ff6b-d521-408a-831c-a6a9c524a671\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fjhxb"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.310905    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5612cc7f-9299-43b4-b97c-cf579a416e84-proxy-tls\") pod \"machine-config-operator-74547568cd-4dp6q\" (UID: \"5612cc7f-9299-43b4-b97c-cf579a416e84\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dp6q"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.320224    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7wjq\" (UniqueName: \"kubernetes.io/projected/d9a5496e-57aa-4f42-b53d-590fb534d26e-kube-api-access-r7wjq\") pod \"kube-storage-version-migrator-operator-b67b599dd-sk8hp\" (UID: \"d9a5496e-57aa-4f42-b53d-590fb534d26e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sk8hp"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.333811    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sk8hp"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.335111    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kfjm5"]
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.340280    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzvgx\" (UniqueName: \"kubernetes.io/projected/428fa435-b92e-4363-82bb-40316d3e0a26-kube-api-access-kzvgx\") pod \"csi-hostpathplugin-jgzlv\" (UID: \"428fa435-b92e-4363-82bb-40316d3e0a26\") " pod="hostpath-provisioner/csi-hostpathplugin-jgzlv"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.346829    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-9kgl8"]
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.353539    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df4fa0ea-abb1-49ea-8d74-2992c71c1a0e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vqxff\" (UID: \"df4fa0ea-abb1-49ea-8d74-2992c71c1a0e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vqxff"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.356723    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vqxff"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.370296    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:39 crc kubenswrapper[4730]: E0320 15:42:39.371314    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:39.871278066 +0000 UTC m=+219.084649435 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.379217    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dbb6ff6b-d521-408a-831c-a6a9c524a671-bound-sa-token\") pod \"ingress-operator-5b745b69d9-fjhxb\" (UID: \"dbb6ff6b-d521-408a-831c-a6a9c524a671\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fjhxb"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.423690    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbnh9\" (UniqueName: \"kubernetes.io/projected/be19fb65-a04f-42df-9b96-e620b58754bb-kube-api-access-wbnh9\") pod \"collect-profiles-29567010-d69bc\" (UID: \"be19fb65-a04f-42df-9b96-e620b58754bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-d69bc"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.430391    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz2n6\" (UniqueName: \"kubernetes.io/projected/0a030e24-2337-49a2-a5e2-118714cd7ff9-kube-api-access-wz2n6\") pod \"machine-config-server-8n5gl\" (UID: \"0a030e24-2337-49a2-a5e2-118714cd7ff9\") " pod="openshift-machine-config-operator/machine-config-server-8n5gl"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.439948    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xspkm"]
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.445341    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9ac46477-04bc-4d0a-b28e-b687c690dd5a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qwghv\" (UID: \"9ac46477-04bc-4d0a-b28e-b687c690dd5a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwghv"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.458609    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxgbn\" (UniqueName: \"kubernetes.io/projected/49896a92-a6b0-45ea-a736-09a368d90be4-kube-api-access-xxgbn\") pod \"migrator-59844c95c7-qxnn6\" (UID: \"49896a92-a6b0-45ea-a736-09a368d90be4\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qxnn6"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.474122    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:39 crc kubenswrapper[4730]: E0320 15:42:39.474562    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:39.974547205 +0000 UTC m=+219.187918574 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.476233    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj2f6\" (UniqueName: \"kubernetes.io/projected/c9f80b42-cff3-48a7-9e09-02ff65e9d9f8-kube-api-access-zj2f6\") pod \"control-plane-machine-set-operator-78cbb6b69f-jkk9s\" (UID: \"c9f80b42-cff3-48a7-9e09-02ff65e9d9f8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jkk9s"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.505879    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-jgzlv"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.507053    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r75nz\" (UniqueName: \"kubernetes.io/projected/d04de14b-8e96-44ab-818f-2b08d78d2e14-kube-api-access-r75nz\") pod \"dns-default-7ckfm\" (UID: \"d04de14b-8e96-44ab-818f-2b08d78d2e14\") " pod="openshift-dns/dns-default-7ckfm"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.524500    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxflj\" (UniqueName: \"kubernetes.io/projected/e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3-kube-api-access-jxflj\") pod \"marketplace-operator-79b997595-klbh8\" (UID: \"e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3\") " pod="openshift-marketplace/marketplace-operator-79b997595-klbh8"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.539479    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2qdqs"]
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.539511    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nfww4"]
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.542637    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njkjz\" (UniqueName: \"kubernetes.io/projected/c2d975f8-1a1e-4921-aef0-3c4652992a02-kube-api-access-njkjz\") pod \"etcd-operator-b45778765-w5gdn\" (UID: \"c2d975f8-1a1e-4921-aef0-3c4652992a02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-w5gdn"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.556167    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46gxc\" (UniqueName: \"kubernetes.io/projected/7d87adfe-3206-4175-8d8f-5a00015cc61e-kube-api-access-46gxc\") pod \"auto-csr-approver-29567022-wf5nv\" (UID: \"7d87adfe-3206-4175-8d8f-5a00015cc61e\") " pod="openshift-infra/auto-csr-approver-29567022-wf5nv"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.562295    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-k6z2l"]
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.575110    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:39 crc kubenswrapper[4730]: E0320 15:42:39.576146    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:40.076120533 +0000 UTC m=+219.289491902 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.578236    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/afbc673a-2498-49dc-b98e-d7ddc58d2999-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-mzjxx\" (UID: \"afbc673a-2498-49dc-b98e-d7ddc58d2999\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mzjxx"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.593759    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8nxk\" (UniqueName: \"kubernetes.io/projected/dbb6ff6b-d521-408a-831c-a6a9c524a671-kube-api-access-c8nxk\") pod \"ingress-operator-5b745b69d9-fjhxb\" (UID: \"dbb6ff6b-d521-408a-831c-a6a9c524a671\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fjhxb"
Mar 20 15:42:39 crc kubenswrapper[4730]: W0320 15:42:39.602873    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda250c56d_72fb_473d_98ce_c013e9d15b4a.slice/crio-c78c2f0da29099b0ea2fa0b59fa2d2ed0d533a9a4639f85dd9ae55e1421c9dc4 WatchSource:0}: Error finding container c78c2f0da29099b0ea2fa0b59fa2d2ed0d533a9a4639f85dd9ae55e1421c9dc4: Status 404 returned error can't find the container with id c78c2f0da29099b0ea2fa0b59fa2d2ed0d533a9a4639f85dd9ae55e1421c9dc4
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.603481    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mzjxx"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.608320    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwghv"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.616341    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6bfj\" (UniqueName: \"kubernetes.io/projected/1010c304-4912-42b2-aa8c-17d44c4bf6cb-kube-api-access-g6bfj\") pod \"service-ca-9c57cc56f-sxbnn\" (UID: \"1010c304-4912-42b2-aa8c-17d44c4bf6cb\") " pod="openshift-service-ca/service-ca-9c57cc56f-sxbnn"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.641940    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qxnn6"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.647893    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs4lb\" (UniqueName: \"kubernetes.io/projected/8a268a97-bf49-4ed6-b239-1a088c3c4e4f-kube-api-access-gs4lb\") pod \"ingress-canary-6w7m9\" (UID: \"8a268a97-bf49-4ed6-b239-1a088c3c4e4f\") " pod="openshift-ingress-canary/ingress-canary-6w7m9"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.657837    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6cmt\" (UniqueName: \"kubernetes.io/projected/4e2a7090-33b8-4137-be83-5c2e5ab1ccc7-kube-api-access-b6cmt\") pod \"catalog-operator-68c6474976-m7cfz\" (UID: \"4e2a7090-33b8-4137-be83-5c2e5ab1ccc7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m7cfz"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.664681    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fjhxb"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.672327    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hrm7z"]
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.682854    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8jhd\" (UniqueName: \"kubernetes.io/projected/2083343b-2ec0-4306-a0a5-f74dd0f63746-kube-api-access-n8jhd\") pod \"package-server-manager-789f6589d5-m4xlq\" (UID: \"2083343b-2ec0-4306-a0a5-f74dd0f63746\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m4xlq"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.682899    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:39 crc kubenswrapper[4730]: E0320 15:42:39.683327    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:40.183313643 +0000 UTC m=+219.396685012 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.685761    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m4xlq"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.687602    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tgpgm"]
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.693207    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jkk9s"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.697638    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-92dt7"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.700089    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-w5gdn"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.701964    4730 patch_prober.go:28] interesting pod/router-default-5444994796-92dt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 15:42:39 crc kubenswrapper[4730]: [-]has-synced failed: reason withheld
Mar 20 15:42:39 crc kubenswrapper[4730]: [+]process-running ok
Mar 20 15:42:39 crc kubenswrapper[4730]: healthz check failed
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.702002    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-92dt7" podUID="18214bd2-9c3a-4737-885b-2b5c905311d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.702630    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27wgn\" (UniqueName: \"kubernetes.io/projected/b82f2ec3-df30-4b45-be3a-9858edb2bb7f-kube-api-access-27wgn\") pod \"service-ca-operator-777779d784-c2cgf\" (UID: \"b82f2ec3-df30-4b45-be3a-9858edb2bb7f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c2cgf"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.708045    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-d69bc"
Mar 20 15:42:39 crc kubenswrapper[4730]: W0320 15:42:39.711684    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a38d833_db72_4566_b139_7788730a502a.slice/crio-064c4bf4d5c66296dd98b82862d30c6c5583edf1808fe56690235f116a074683 WatchSource:0}: Error finding container 064c4bf4d5c66296dd98b82862d30c6c5583edf1808fe56690235f116a074683: Status 404 returned error can't find the container with id 064c4bf4d5c66296dd98b82862d30c6c5583edf1808fe56690235f116a074683
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.713986    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt52k\" (UniqueName: \"kubernetes.io/projected/f6f09179-5752-4a5a-ab79-72a176bbdd9a-kube-api-access-rt52k\") pod \"olm-operator-6b444d44fb-m7fsc\" (UID: \"f6f09179-5752-4a5a-ab79-72a176bbdd9a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7fsc"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.714868    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-sxbnn"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.728928    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8n5gl"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.732582    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbshk\" (UniqueName: \"kubernetes.io/projected/5612cc7f-9299-43b4-b97c-cf579a416e84-kube-api-access-fbshk\") pod \"machine-config-operator-74547568cd-4dp6q\" (UID: \"5612cc7f-9299-43b4-b97c-cf579a416e84\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dp6q"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.744891    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-klbh8"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.757165    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkbfb\" (UniqueName: \"kubernetes.io/projected/e213a906-8ad6-45c1-b832-a42d58fd91c6-kube-api-access-bkbfb\") pod \"multus-admission-controller-857f4d67dd-84pdq\" (UID: \"e213a906-8ad6-45c1-b832-a42d58fd91c6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-84pdq"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.772565    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzkcx\" (UniqueName: \"kubernetes.io/projected/8e224294-495e-4d65-96f2-8e0d2a444ef1-kube-api-access-pzkcx\") pod \"machine-config-controller-84d6567774-7hzc8\" (UID: \"8e224294-495e-4d65-96f2-8e0d2a444ef1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7hzc8"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.775741    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567022-wf5nv"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.781717    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-c2cgf"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.784575    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:39 crc kubenswrapper[4730]: E0320 15:42:39.784999    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:40.284979443 +0000 UTC m=+219.498350812 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.792706    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7ckfm"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.796819    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6w7m9"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.855650    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-g7hdt"]
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.870422    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-st79s"]
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.871686    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6rbg9"]
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.887527    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:39 crc kubenswrapper[4730]: E0320 15:42:39.888159    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:40.388147709 +0000 UTC m=+219.601519078 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.913926    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-mkxg7"]
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.917731    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m7cfz"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.925324    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7fsc"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.930406    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zr8dk"]
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.949072    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-84pdq"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.968550    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pncxq"]
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.976114    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dp6q"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.982697    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7hzc8"
Mar 20 15:42:39 crc kubenswrapper[4730]: I0320 15:42:39.993371    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:39 crc kubenswrapper[4730]: E0320 15:42:39.993743    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:40.49372511 +0000 UTC m=+219.707096479 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.028781    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mzjxx"]
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.028859    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sk8hp"]
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.096376    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:40 crc kubenswrapper[4730]: E0320 15:42:40.096971    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:40.596958928 +0000 UTC m=+219.810330297 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.106775    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vqxff"]
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.108119    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jgzlv"]
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.197201    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:40 crc kubenswrapper[4730]: E0320 15:42:40.197698    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:40.697677389 +0000 UTC m=+219.911048768 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.198799    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:40 crc kubenswrapper[4730]: E0320 15:42:40.199113    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:40.699102223 +0000 UTC m=+219.912473592 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:40 crc kubenswrapper[4730]: W0320 15:42:40.223943    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bc0c5b5_55bb_4339_8162_bb647b833006.slice/crio-50492069065d98dc384151c573905cb8667e0fb3fbd14c80994e16f59e459f10 WatchSource:0}: Error finding container 50492069065d98dc384151c573905cb8667e0fb3fbd14c80994e16f59e459f10: Status 404 returned error can't find the container with id 50492069065d98dc384151c573905cb8667e0fb3fbd14c80994e16f59e459f10
Mar 20 15:42:40 crc kubenswrapper[4730]: W0320 15:42:40.224630    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafbc673a_2498_49dc_b98e_d7ddc58d2999.slice/crio-a094a3d0363445cd7d6b98232c158d8cd5f73dfb606055f9c0cd72f2222dd487 WatchSource:0}: Error finding container a094a3d0363445cd7d6b98232c158d8cd5f73dfb606055f9c0cd72f2222dd487: Status 404 returned error can't find the container with id a094a3d0363445cd7d6b98232c158d8cd5f73dfb606055f9c0cd72f2222dd487
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.267919    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hrm7z" event={"ID":"9a38d833-db72-4566-b139-7788730a502a","Type":"ContainerStarted","Data":"064c4bf4d5c66296dd98b82862d30c6c5583edf1808fe56690235f116a074683"}
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.269934    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nfww4" event={"ID":"a250c56d-72fb-473d-98ce-c013e9d15b4a","Type":"ContainerStarted","Data":"c78c2f0da29099b0ea2fa0b59fa2d2ed0d533a9a4639f85dd9ae55e1421c9dc4"}
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.273586    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-k6z2l" event={"ID":"5c0e41b3-aa2d-4083-acb2-f0f68a29fcce","Type":"ContainerStarted","Data":"2b4e8f50729275ee8c790e0f8088c598e61fb52dff6ce33a0d19bea3fb8ac220"}
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.276833    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zr8dk" event={"ID":"ca0985ab-94f6-4f4e-b8b4-0ee710e01fcf","Type":"ContainerStarted","Data":"23b2fc6e3a3fdc1cf917a4f3f5e3a91ea9c0b75307f62e707b985d0eff061698"}
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.297678    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2qdqs" event={"ID":"e190e098-9bc8-492f-9657-f6ccfb836f23","Type":"ContainerStarted","Data":"bd274a929342b144978e537e6ea2d49492c4e82f55a2b4b84204f310ad918116"}
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.297731    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2qdqs" event={"ID":"e190e098-9bc8-492f-9657-f6ccfb836f23","Type":"ContainerStarted","Data":"a039a5bbde0f3bd6a992953622be34641890cf92f110f9d769f5ee8e166ecf29"}
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.298703    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2qdqs"
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.299739    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:40 crc kubenswrapper[4730]: E0320 15:42:40.300134    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:40.800120883 +0000 UTC m=+220.013492252 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.301562    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9kgl8" event={"ID":"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee","Type":"ContainerStarted","Data":"dd7ad8497736491ff86002b539ccd73a88dfd723919819049a39b445ea55904f"}
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.301600    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9kgl8" event={"ID":"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee","Type":"ContainerStarted","Data":"e011dbdf40941c9f2e1edba06bd23dad1736901c7815ace4b7b103d548c5c8d5"}
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.302597    4730 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-2qdqs container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:5443/healthz\": dial tcp 10.217.0.22:5443: connect: connection refused" start-of-body=
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.302635    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2qdqs" podUID="e190e098-9bc8-492f-9657-f6ccfb836f23" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.22:5443/healthz\": dial tcp 10.217.0.22:5443: connect: connection refused"
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.305975    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mzjxx" event={"ID":"afbc673a-2498-49dc-b98e-d7ddc58d2999","Type":"ContainerStarted","Data":"a094a3d0363445cd7d6b98232c158d8cd5f73dfb606055f9c0cd72f2222dd487"}
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.308038    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sk8hp" event={"ID":"d9a5496e-57aa-4f42-b53d-590fb534d26e","Type":"ContainerStarted","Data":"0bb0e807275a1387ac8676045c734f0052cf50a2b0d54b00ca73af6fab20a832"}
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.309904    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-mkxg7" event={"ID":"2eb5b4cf-be45-4c3d-abbf-9a3e525d5ed3","Type":"ContainerStarted","Data":"1c8da2a9ebf5915ca8d0df7153335b863ed32da9f11f6f81330baf3aae11e179"}
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.312728    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tgpgm" event={"ID":"2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf","Type":"ContainerStarted","Data":"89071ca70267ff55db3a1c4b03f32d342093e047d26dbfb562535cb4096d8fec"}
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.313760    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8n5gl" event={"ID":"0a030e24-2337-49a2-a5e2-118714cd7ff9","Type":"ContainerStarted","Data":"32d15dc22f29032c9176bff78f122e31694be2d1343519d8b5a6ad0a4ba97919"}
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.315731    4730 generic.go:334] "Generic (PLEG): container finished" podID="a835e0ec-4721-4824-8846-fcc7e12db3f9" containerID="d8162848222abf2d634eaafe2c6d69c1712537bbdeceea6939fe1d66ff352273" exitCode=0
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.315908    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jzx77" event={"ID":"a835e0ec-4721-4824-8846-fcc7e12db3f9","Type":"ContainerDied","Data":"d8162848222abf2d634eaafe2c6d69c1712537bbdeceea6939fe1d66ff352273"}
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.315934    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jzx77" event={"ID":"a835e0ec-4721-4824-8846-fcc7e12db3f9","Type":"ContainerStarted","Data":"51c31820868f583075835a067e29cabb504158f08c52200ba4cb3b15dee730bd"}
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.317027    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vqxff" event={"ID":"df4fa0ea-abb1-49ea-8d74-2992c71c1a0e","Type":"ContainerStarted","Data":"9214030830d3037e2a4950e8f2c2a31b6d9704ba2fb21749bb3e069ed799d152"}
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.318712    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xspkm" event={"ID":"7406e06c-cbde-48f9-b5e7-57a2a86b5a4d","Type":"ContainerStarted","Data":"8b08d94e6836d69f6385db916f7474305cf7b5dfd56a9f620596f4b3d653dbba"}
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.318738    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xspkm" event={"ID":"7406e06c-cbde-48f9-b5e7-57a2a86b5a4d","Type":"ContainerStarted","Data":"a2fc941d75512533be44af5bf426a971f659780ec57dca9c624e2dc271a196e0"}
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.319870    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-csmvr" event={"ID":"d784e9cd-d5af-496e-abca-ce30096bb0d0","Type":"ContainerStarted","Data":"13ef7c50b79282096a44f992739c37f752f93e158e231b26d98ea7e454bf1246"}
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.322307    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kfjm5" event={"ID":"7d581333-2d6e-44d6-a6fc-b90c3b16baad","Type":"ContainerStarted","Data":"cb4d8a879d3b88e600163a2dbb81d34e13e7958d84c3d8cd636301338a9068d2"}
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.322338    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kfjm5" event={"ID":"7d581333-2d6e-44d6-a6fc-b90c3b16baad","Type":"ContainerStarted","Data":"5c3bf7e7eb16ae676aea972ba72436827e140b23357ea8a289eaf53fce4605d2"}
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.323213    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jgzlv" event={"ID":"428fa435-b92e-4363-82bb-40316d3e0a26","Type":"ContainerStarted","Data":"0c13d15232b4262b7ee9383dc49e152107617f19e0a83f3352de9a1be82e3bfd"}
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.326354    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6rbg9" event={"ID":"d4e38bce-6ae6-451b-aa9f-7a98dfa4d974","Type":"ContainerStarted","Data":"6753c2d08477eedd2a2f74ec7d764d675404bd8646157af35dce36f94613063f"}
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.327369    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pncxq" event={"ID":"0bc0c5b5-55bb-4339-8162-bb647b833006","Type":"ContainerStarted","Data":"50492069065d98dc384151c573905cb8667e0fb3fbd14c80994e16f59e459f10"}
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.330780    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-g7hdt" event={"ID":"d32c9cec-9f6c-4304-8bc9-d2e52128470a","Type":"ContainerStarted","Data":"4d56ab2bb10646020e5ffdd67001fda8bd9f2edaa6c93e2c743eb071781e1c75"}
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.338988    4730 generic.go:334] "Generic (PLEG): container finished" podID="7662a0cc-faaa-47da-90f9-f3a8907a0401" containerID="d79826854e3b3b91a0bb5b5b6da3d993958db7ba9916a9865bfac03bf08f6219" exitCode=0
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.339062    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7" event={"ID":"7662a0cc-faaa-47da-90f9-f3a8907a0401","Type":"ContainerDied","Data":"d79826854e3b3b91a0bb5b5b6da3d993958db7ba9916a9865bfac03bf08f6219"}
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.343370    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-st79s" event={"ID":"2499559b-b31f-4dab-89a0-964964dc596e","Type":"ContainerStarted","Data":"40fd46e178b8ce8c75de09a0315b49f9c04961cf890e7392083f9a7a77124dd2"}
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.347206    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ldb64" event={"ID":"ad1f04f2-f7c4-4bc6-9daf-0db7a0809206","Type":"ContainerStarted","Data":"05f7c615c59cffb48afbdaba593e14531efb4fc81d8ff55c1881a4b37920563e"}
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.405753    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:40 crc kubenswrapper[4730]: E0320 15:42:40.407559    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:40.907546591 +0000 UTC m=+220.120917960 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.467782    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-fjhxb"]
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.480618    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m4xlq"]
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.486531    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-sxbnn"]
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.506110    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwghv"]
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.514451    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:40 crc kubenswrapper[4730]: E0320 15:42:40.515266    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:41.015234006 +0000 UTC m=+220.228605375 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.552971    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-w5gdn"]
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.618505    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:40 crc kubenswrapper[4730]: E0320 15:42:40.618939    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:41.118924039 +0000 UTC m=+220.332295408 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.644814    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6w7m9"]
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.649790    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567010-d69bc"]
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.699732    4730 patch_prober.go:28] interesting pod/router-default-5444994796-92dt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 15:42:40 crc kubenswrapper[4730]: [-]has-synced failed: reason withheld
Mar 20 15:42:40 crc kubenswrapper[4730]: [+]process-running ok
Mar 20 15:42:40 crc kubenswrapper[4730]: healthz check failed
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.700171    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-92dt7" podUID="18214bd2-9c3a-4737-885b-2b5c905311d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.724884    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:40 crc kubenswrapper[4730]: E0320 15:42:40.725559    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:41.225528221 +0000 UTC m=+220.438899600 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.728609    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-klbh8"]
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.736017    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jkk9s"]
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.748765    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qxnn6"]
Mar 20 15:42:40 crc kubenswrapper[4730]: W0320 15:42:40.771965    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9f80b42_cff3_48a7_9e09_02ff65e9d9f8.slice/crio-6a8ec811ee5d2af7d6e0e366cb0349688bcb6f448f7910bdfad56add1b5d5f9a WatchSource:0}: Error finding container 6a8ec811ee5d2af7d6e0e366cb0349688bcb6f448f7910bdfad56add1b5d5f9a: Status 404 returned error can't find the container with id 6a8ec811ee5d2af7d6e0e366cb0349688bcb6f448f7910bdfad56add1b5d5f9a
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.824463    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-92dt7" podStartSLOduration=167.824439236 podStartE2EDuration="2m47.824439236s" podCreationTimestamp="2026-03-20 15:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:40.823124816 +0000 UTC m=+220.036496185" watchObservedRunningTime="2026-03-20 15:42:40.824439236 +0000 UTC m=+220.037810605"
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.827566    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:40 crc kubenswrapper[4730]: E0320 15:42:40.828043    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:41.328028587 +0000 UTC m=+220.541399956 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.840621    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-c2cgf"]
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.917730    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7fsc"]
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.925102    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567022-wf5nv"]
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.932809    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:40 crc kubenswrapper[4730]: E0320 15:42:40.933302    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:41.433282347 +0000 UTC m=+220.646653706 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.945716    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-7hzc8"]
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.963160    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7ckfm"]
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.988834    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m7cfz"]
Mar 20 15:42:40 crc kubenswrapper[4730]: I0320 15:42:40.989963    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-84pdq"]
Mar 20 15:42:40 crc kubenswrapper[4730]: W0320 15:42:40.999380    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb82f2ec3_df30_4b45_be3a_9858edb2bb7f.slice/crio-4c90f2d398800b6f83cd9e72f53236850e7923840c593ca2e3325bc0305bd853 WatchSource:0}: Error finding container 4c90f2d398800b6f83cd9e72f53236850e7923840c593ca2e3325bc0305bd853: Status 404 returned error can't find the container with id 4c90f2d398800b6f83cd9e72f53236850e7923840c593ca2e3325bc0305bd853
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.018769    4730 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.031433    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4dp6q"]
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.034065    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:41 crc kubenswrapper[4730]: E0320 15:42:41.034465    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:41.534448962 +0000 UTC m=+220.747820331 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.135621    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:41 crc kubenswrapper[4730]: E0320 15:42:41.135807    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:41.635779512 +0000 UTC m=+220.849150881 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.135966    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:41 crc kubenswrapper[4730]: E0320 15:42:41.136285    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:41.636268347 +0000 UTC m=+220.849639716 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.216467    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xspkm" podStartSLOduration=168.216437475 podStartE2EDuration="2m48.216437475s" podCreationTimestamp="2026-03-20 15:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:41.215783765 +0000 UTC m=+220.429155154" watchObservedRunningTime="2026-03-20 15:42:41.216437475 +0000 UTC m=+220.429808844"
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.237770    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:41 crc kubenswrapper[4730]: E0320 15:42:41.238055    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:41.738007929 +0000 UTC m=+220.951379468 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.238606    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:41 crc kubenswrapper[4730]: E0320 15:42:41.239154    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:41.739143634 +0000 UTC m=+220.952515023 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.335750    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-csmvr" podStartSLOduration=169.335729638 podStartE2EDuration="2m49.335729638s" podCreationTimestamp="2026-03-20 15:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:41.332953703 +0000 UTC m=+220.546325072" watchObservedRunningTime="2026-03-20 15:42:41.335729638 +0000 UTC m=+220.549101007"
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.342344    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:41 crc kubenswrapper[4730]: E0320 15:42:41.342481    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:41.842460425 +0000 UTC m=+221.055831794 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.342621    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:41 crc kubenswrapper[4730]: E0320 15:42:41.343012    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:41.843002402 +0000 UTC m=+221.056373771 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.373673    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2qdqs" podStartSLOduration=168.373652716 podStartE2EDuration="2m48.373652716s" podCreationTimestamp="2026-03-20 15:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:41.372475219 +0000 UTC m=+220.585846608" watchObservedRunningTime="2026-03-20 15:42:41.373652716 +0000 UTC m=+220.587024085"
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.373920    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-c2cgf" event={"ID":"b82f2ec3-df30-4b45-be3a-9858edb2bb7f","Type":"ContainerStarted","Data":"4c90f2d398800b6f83cd9e72f53236850e7923840c593ca2e3325bc0305bd853"}
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.375938    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-84pdq" event={"ID":"e213a906-8ad6-45c1-b832-a42d58fd91c6","Type":"ContainerStarted","Data":"4cf371ea7b80f4f7f2659d7fa348c114bc4ea58e993d01c8e9a6458dca12d495"}
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.384768    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7ckfm" event={"ID":"d04de14b-8e96-44ab-818f-2b08d78d2e14","Type":"ContainerStarted","Data":"0a47a67ec0dc19ef7601a0ba2ed0577a14eaad0157f2d05a756249d4366eb8d1"}
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.392017    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hrm7z" event={"ID":"9a38d833-db72-4566-b139-7788730a502a","Type":"ContainerStarted","Data":"2bd463f1b26cfe1b9822fd238c03679adad9d834d5b448d944528f6deceb749a"}
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.392495    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-hrm7z"
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.393150    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7fsc" event={"ID":"f6f09179-5752-4a5a-ab79-72a176bbdd9a","Type":"ContainerStarted","Data":"edfc29168003c65e0a93d4afbd0615854285eea28bcaec1f42ae6712ea29871b"}
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.395914    4730 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-hrm7z container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body=
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.395967    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-hrm7z" podUID="9a38d833-db72-4566-b139-7788730a502a" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused"
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.418606    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dp6q" event={"ID":"5612cc7f-9299-43b4-b97c-cf579a416e84","Type":"ContainerStarted","Data":"27c5cbe987f40fd46a08099ce914eec5acaeabef15364c0a1d3d11cb926d6f64"}
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.428770    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nfww4" event={"ID":"a250c56d-72fb-473d-98ce-c013e9d15b4a","Type":"ContainerStarted","Data":"93cfd3cf713293a3037ac8a290383a0c692a549bb4e8644183f233991bbba17a"}
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.430618    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m7cfz" event={"ID":"4e2a7090-33b8-4137-be83-5c2e5ab1ccc7","Type":"ContainerStarted","Data":"69217f465de0b962324e83537fcde48541bfa56121b2fc7b8f8b4deb8dd0ef37"}
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.443629    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:41 crc kubenswrapper[4730]: E0320 15:42:41.444139    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:41.944116645 +0000 UTC m=+221.157488014 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.446291    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ldb64" event={"ID":"ad1f04f2-f7c4-4bc6-9daf-0db7a0809206","Type":"ContainerStarted","Data":"5e9647c566e4a2e0ff1156cd5e7eb1ef5bda3db8c35f2d9613e275de15e9262b"}
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.451055    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-9kgl8" podStartSLOduration=169.451034518 podStartE2EDuration="2m49.451034518s" podCreationTimestamp="2026-03-20 15:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:41.409786658 +0000 UTC m=+220.623158027" watchObservedRunningTime="2026-03-20 15:42:41.451034518 +0000 UTC m=+220.664405887"
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.452993    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-hrm7z" podStartSLOduration=168.452909066 podStartE2EDuration="2m48.452909066s" podCreationTimestamp="2026-03-20 15:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:41.45011712 +0000 UTC m=+220.663488489" watchObservedRunningTime="2026-03-20 15:42:41.452909066 +0000 UTC m=+220.666280435"
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.458350    4730 generic.go:334] "Generic (PLEG): container finished" podID="d4e38bce-6ae6-451b-aa9f-7a98dfa4d974" containerID="d66e9193c3c6c83eb2ebedb3d3324d7da8814c89e9caf33df63870d285cbd22f" exitCode=0
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.458457    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6rbg9" event={"ID":"d4e38bce-6ae6-451b-aa9f-7a98dfa4d974","Type":"ContainerDied","Data":"d66e9193c3c6c83eb2ebedb3d3324d7da8814c89e9caf33df63870d285cbd22f"}
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.474540    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m4xlq" event={"ID":"2083343b-2ec0-4306-a0a5-f74dd0f63746","Type":"ContainerStarted","Data":"e23e2899ad0868b3e771327f197c59545128a235e8aac31bd5b6503ebfd556a0"}
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.490873    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567022-wf5nv" event={"ID":"7d87adfe-3206-4175-8d8f-5a00015cc61e","Type":"ContainerStarted","Data":"94b5cc48ba667f00e1531e3aeaa43c807fc5eafadb1a111b034b9b327635ca47"}
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.494433    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-d69bc" event={"ID":"be19fb65-a04f-42df-9b96-e620b58754bb","Type":"ContainerStarted","Data":"c95af936fdf0b6d7d2255a8837cf5081d1d6b867d8c027bb70bec58e5bed039e"}
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.496821    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ldb64" podStartSLOduration=169.496804867 podStartE2EDuration="2m49.496804867s" podCreationTimestamp="2026-03-20 15:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:41.495876699 +0000 UTC m=+220.709248078" watchObservedRunningTime="2026-03-20 15:42:41.496804867 +0000 UTC m=+220.710176236"
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.501588    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qxnn6" event={"ID":"49896a92-a6b0-45ea-a736-09a368d90be4","Type":"ContainerStarted","Data":"8eac9c0771b67ae50023fe08427b8f566bdc1e222e5bbe39d99f94e939b63049"}
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.506613    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-g7hdt" event={"ID":"d32c9cec-9f6c-4304-8bc9-d2e52128470a","Type":"ContainerStarted","Data":"7fc7d5b4e7ae1372876eb50a11f4bf7d7325f4335144ba6e97cc70c7eddffe35"}
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.507497    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-g7hdt"
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.510324    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-w5gdn" event={"ID":"c2d975f8-1a1e-4921-aef0-3c4652992a02","Type":"ContainerStarted","Data":"f6530cb3d765deb1f0299e16497ef47f75610b802a9660a518ba8c1cd5edba1e"}
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.510976    4730 patch_prober.go:28] interesting pod/downloads-7954f5f757-g7hdt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body=
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.511009    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-g7hdt" podUID="d32c9cec-9f6c-4304-8bc9-d2e52128470a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused"
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.514276    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7hzc8" event={"ID":"8e224294-495e-4d65-96f2-8e0d2a444ef1","Type":"ContainerStarted","Data":"7e206e562f846a228d4fb8ab07357cb1314f3c3a005d6a5daa36de361b7f0bd8"}
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.516466    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tgpgm" event={"ID":"2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf","Type":"ContainerStarted","Data":"c9ee68792188768f2e70a5a006ca33e1d9f850ef2974b3faf0e1e2bdb1cd6989"}
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.516933    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tgpgm"
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.520525    4730 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-tgpgm container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body=
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.520571    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tgpgm" podUID="2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused"
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.521386    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6w7m9" event={"ID":"8a268a97-bf49-4ed6-b239-1a088c3c4e4f","Type":"ContainerStarted","Data":"b903a530879a324b343372dc3be1834fdf4c99fb4ad19202401541967a2445fd"}
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.521455    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6w7m9" event={"ID":"8a268a97-bf49-4ed6-b239-1a088c3c4e4f","Type":"ContainerStarted","Data":"458e89e3b66feb1609fd81aa7a06e1940ced11ee4340c232ebf9dad3fdde7647"}
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.565644    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:41 crc kubenswrapper[4730]: E0320 15:42:41.567413    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:42.067395341 +0000 UTC m=+221.280766790 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.579270    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-mkxg7" event={"ID":"2eb5b4cf-be45-4c3d-abbf-9a3e525d5ed3","Type":"ContainerStarted","Data":"c77b62f07c3e4d65f624ea3f3f47f2219b15cd19caf1b6d64b3a8a3457be9687"}
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.579362    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-mkxg7"
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.582132    4730 patch_prober.go:28] interesting pod/console-operator-58897d9998-mkxg7 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body=
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.582203    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-mkxg7" podUID="2eb5b4cf-be45-4c3d-abbf-9a3e525d5ed3" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused"
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.587878    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6w7m9" podStartSLOduration=5.5878520609999995 podStartE2EDuration="5.587852061s" podCreationTimestamp="2026-03-20 15:42:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:41.582661831 +0000 UTC m=+220.796033210" watchObservedRunningTime="2026-03-20 15:42:41.587852061 +0000 UTC m=+220.801223430"
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.589623    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8n5gl" event={"ID":"0a030e24-2337-49a2-a5e2-118714cd7ff9","Type":"ContainerStarted","Data":"3b6a89a603769cc86462adb2fcdb558fe4a73266e899135349c50f340bcf3fe8"}
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.593457    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pncxq" event={"ID":"0bc0c5b5-55bb-4339-8162-bb647b833006","Type":"ContainerStarted","Data":"b98c1bedf76d89690ddd9448b63471ff8500717beaf329adfb8782ab79c2db62"}
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.612302    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwghv" event={"ID":"9ac46477-04bc-4d0a-b28e-b687c690dd5a","Type":"ContainerStarted","Data":"7debbacc86d6401f66bd6c5e9ce89d016e0a163261c2f379eb398875e46a4673"}
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.613092    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tgpgm" podStartSLOduration=168.613068167 podStartE2EDuration="2m48.613068167s" podCreationTimestamp="2026-03-20 15:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:41.607988491 +0000 UTC m=+220.821359860" watchObservedRunningTime="2026-03-20 15:42:41.613068167 +0000 UTC m=+220.826439536"
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.625025    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-klbh8" event={"ID":"e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3","Type":"ContainerStarted","Data":"759bd2ea27103b987add0fca450b8a256b74867157aa94299acbb52889decc8f"}
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.633600    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zr8dk" event={"ID":"ca0985ab-94f6-4f4e-b8b4-0ee710e01fcf","Type":"ContainerStarted","Data":"2134b158b84d7ec00d4e13bec16838b8d83c57a2b4345e768383e4b2751337be"}
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.635363    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fjhxb" event={"ID":"dbb6ff6b-d521-408a-831c-a6a9c524a671","Type":"ContainerStarted","Data":"c3557571a15f9bef351df1be26a286f7e32e817aabc76197984264844a94269b"}
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.637230    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-sxbnn" event={"ID":"1010c304-4912-42b2-aa8c-17d44c4bf6cb","Type":"ContainerStarted","Data":"59c910422c78ad87a68d5f694fff37b6d90d96395fc4f18b5ca6596e497918bc"}
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.646428    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-k6z2l" event={"ID":"5c0e41b3-aa2d-4083-acb2-f0f68a29fcce","Type":"ContainerStarted","Data":"9832423804f3463aa221d450167df97099db90b4da825e192ef78c2beac04aa6"}
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.650956    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jkk9s" event={"ID":"c9f80b42-cff3-48a7-9e09-02ff65e9d9f8","Type":"ContainerStarted","Data":"6a8ec811ee5d2af7d6e0e366cb0349688bcb6f448f7910bdfad56add1b5d5f9a"}
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.658114    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-g7hdt" podStartSLOduration=168.658090023 podStartE2EDuration="2m48.658090023s" podCreationTimestamp="2026-03-20 15:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:41.658080703 +0000 UTC m=+220.871452072" watchObservedRunningTime="2026-03-20 15:42:41.658090023 +0000 UTC m=+220.871461392"
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.667129    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:41 crc kubenswrapper[4730]: E0320 15:42:41.668616    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:42.168591846 +0000 UTC m=+221.381963215 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.682810    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kfjm5" event={"ID":"7d581333-2d6e-44d6-a6fc-b90c3b16baad","Type":"ContainerStarted","Data":"41ae265ff086079d6c2cd85de5690413ab06cd5907b56c70e3f5802cb055ffdb"}
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.700636    4730 patch_prober.go:28] interesting pod/router-default-5444994796-92dt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 15:42:41 crc kubenswrapper[4730]: [-]has-synced failed: reason withheld
Mar 20 15:42:41 crc kubenswrapper[4730]: [+]process-running ok
Mar 20 15:42:41 crc kubenswrapper[4730]: healthz check failed
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.700695    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-92dt7" podUID="18214bd2-9c3a-4737-885b-2b5c905311d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.774722    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:41 crc kubenswrapper[4730]: E0320 15:42:41.775011    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:42.274999213 +0000 UTC m=+221.488370582 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.786641    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-8n5gl" podStartSLOduration=5.78661807 podStartE2EDuration="5.78661807s" podCreationTimestamp="2026-03-20 15:42:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:41.780893004 +0000 UTC m=+220.994264373" watchObservedRunningTime="2026-03-20 15:42:41.78661807 +0000 UTC m=+220.999989439"
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.860063    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pncxq" podStartSLOduration=168.860045061 podStartE2EDuration="2m48.860045061s" podCreationTimestamp="2026-03-20 15:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:41.857581735 +0000 UTC m=+221.070953104" watchObservedRunningTime="2026-03-20 15:42:41.860045061 +0000 UTC m=+221.073416430"
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.885470    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:41 crc kubenswrapper[4730]: E0320 15:42:41.886864    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:42.386849146 +0000 UTC m=+221.600220515 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:41 crc kubenswrapper[4730]: I0320 15:42:41.987884    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:41 crc kubenswrapper[4730]: E0320 15:42:41.988518    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:42.488505606 +0000 UTC m=+221.701876975 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.103634    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:42 crc kubenswrapper[4730]: E0320 15:42:42.104140    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:42.604081124 +0000 UTC m=+221.817452493 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.112703    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2qdqs"
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.208141    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:42 crc kubenswrapper[4730]: E0320 15:42:42.208516    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:42.708505249 +0000 UTC m=+221.921876618 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.215365    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zr8dk" podStartSLOduration=170.215345 podStartE2EDuration="2m50.215345s" podCreationTimestamp="2026-03-20 15:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:42.191342281 +0000 UTC m=+221.404713650" watchObservedRunningTime="2026-03-20 15:42:42.215345 +0000 UTC m=+221.428716369"
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.263731    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-mkxg7" podStartSLOduration=170.263710689 podStartE2EDuration="2m50.263710689s" podCreationTimestamp="2026-03-20 15:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:42.26180623 +0000 UTC m=+221.475177599" watchObservedRunningTime="2026-03-20 15:42:42.263710689 +0000 UTC m=+221.477082058"
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.312629    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:42 crc kubenswrapper[4730]: E0320 15:42:42.313144    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:42.813128001 +0000 UTC m=+222.026499370 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.416654    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:42 crc kubenswrapper[4730]: E0320 15:42:42.417384    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:42.91737136 +0000 UTC m=+222.130742729 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.517359    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:42 crc kubenswrapper[4730]: E0320 15:42:42.517912    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:43.017898514 +0000 UTC m=+222.231269883 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.619174    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:42 crc kubenswrapper[4730]: E0320 15:42:42.619993    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:43.119973887 +0000 UTC m=+222.333345256 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.695637    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-st79s" event={"ID":"2499559b-b31f-4dab-89a0-964964dc596e","Type":"ContainerStarted","Data":"e1c21b159517c024d2850d533108f097e6b934c21c31f9514baab038d75a1db0"}
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.696227    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.698191    4730 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-st79s container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.16:6443/healthz\": dial tcp 10.217.0.16:6443: connect: connection refused" start-of-body=
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.698267    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-st79s" podUID="2499559b-b31f-4dab-89a0-964964dc596e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.16:6443/healthz\": dial tcp 10.217.0.16:6443: connect: connection refused"
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.698959    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-sxbnn" event={"ID":"1010c304-4912-42b2-aa8c-17d44c4bf6cb","Type":"ContainerStarted","Data":"34a828dd543a65e021600e5744d24e1c4cd7f93116bd1479dc4eaab36971fdec"}
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.701548    4730 patch_prober.go:28] interesting pod/router-default-5444994796-92dt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 15:42:42 crc kubenswrapper[4730]: [-]has-synced failed: reason withheld
Mar 20 15:42:42 crc kubenswrapper[4730]: [+]process-running ok
Mar 20 15:42:42 crc kubenswrapper[4730]: healthz check failed
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.701604    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-92dt7" podUID="18214bd2-9c3a-4737-885b-2b5c905311d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.702838    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mzjxx" event={"ID":"afbc673a-2498-49dc-b98e-d7ddc58d2999","Type":"ContainerStarted","Data":"0f5573acf28dc25f24f5f3fdda2ac538b23f56f1d9a72474b58a86223e9726c6"}
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.719998    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-st79s" podStartSLOduration=170.719979746 podStartE2EDuration="2m50.719979746s" podCreationTimestamp="2026-03-20 15:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:42.71914272 +0000 UTC m=+221.932514089" watchObservedRunningTime="2026-03-20 15:42:42.719979746 +0000 UTC m=+221.933351115"
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.720345    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:42 crc kubenswrapper[4730]: E0320 15:42:42.720818    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:43.220803971 +0000 UTC m=+222.434175340 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.722044    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kfjm5" podStartSLOduration=170.722034539 podStartE2EDuration="2m50.722034539s" podCreationTimestamp="2026-03-20 15:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:42.436303742 +0000 UTC m=+221.649675111" watchObservedRunningTime="2026-03-20 15:42:42.722034539 +0000 UTC m=+221.935405908"
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.736972    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jzx77" event={"ID":"a835e0ec-4721-4824-8846-fcc7e12db3f9","Type":"ContainerStarted","Data":"4e940161de372f92b1b6ae2112926192c8e5439ab340f756511447e1e2aa051b"}
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.737382    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jzx77" event={"ID":"a835e0ec-4721-4824-8846-fcc7e12db3f9","Type":"ContainerStarted","Data":"431e494bdf53c4458c9d14bc444a03cea0ae1b35843b728952d5f1c5ec48d73c"}
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.749790    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7fsc" event={"ID":"f6f09179-5752-4a5a-ab79-72a176bbdd9a","Type":"ContainerStarted","Data":"7dffa997caec06e934ce49de06db4bfa2ff15d9b634cdef5fd96299b33964d8c"}
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.750841    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7fsc"
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.752358    4730 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-m7fsc container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body=
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.752536    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7fsc" podUID="f6f09179-5752-4a5a-ab79-72a176bbdd9a" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused"
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.754994    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-sxbnn" podStartSLOduration=169.754978443 podStartE2EDuration="2m49.754978443s" podCreationTimestamp="2026-03-20 15:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:42.754154848 +0000 UTC m=+221.967526217" watchObservedRunningTime="2026-03-20 15:42:42.754978443 +0000 UTC m=+221.968349812"
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.775971    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jkk9s" event={"ID":"c9f80b42-cff3-48a7-9e09-02ff65e9d9f8","Type":"ContainerStarted","Data":"1c642122eaed210a012524280043fba3a122d1a17fa250fb9e9bbff43d7c0d99"}
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.786340    4730 ???:1] "http: TLS handshake error from 192.168.126.11:34230: no serving certificate available for the kubelet"
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.787202    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mzjxx" podStartSLOduration=169.787180425 podStartE2EDuration="2m49.787180425s" podCreationTimestamp="2026-03-20 15:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:42.784890794 +0000 UTC m=+221.998262163" watchObservedRunningTime="2026-03-20 15:42:42.787180425 +0000 UTC m=+222.000551794"
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.810097    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwghv" event={"ID":"9ac46477-04bc-4d0a-b28e-b687c690dd5a","Type":"ContainerStarted","Data":"95661cbf74baa8e09ad74eae6de240cfa9311adc277f9664c79b9746f96a534d"}
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.821520    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-klbh8" event={"ID":"e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3","Type":"ContainerStarted","Data":"53a23df451c80721daf3b80414ff05d019adbd298f30cf30f417f8af1c2bafc2"}
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.822475    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.847320    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-klbh8"
Mar 20 15:42:42 crc kubenswrapper[4730]: E0320 15:42:42.853040    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:43.353014132 +0000 UTC m=+222.566385501 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.862890    4730 ???:1] "http: TLS handshake error from 192.168.126.11:34234: no serving certificate available for the kubelet"
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.877458    4730 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-klbh8 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" start-of-body=
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.877577    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-klbh8" podUID="e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused"
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.879655    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7fsc" podStartSLOduration=169.879629411 podStartE2EDuration="2m49.879629411s" podCreationTimestamp="2026-03-20 15:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:42.878411224 +0000 UTC m=+222.091782603" watchObservedRunningTime="2026-03-20 15:42:42.879629411 +0000 UTC m=+222.093000780"
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.890633    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jkk9s" podStartSLOduration=169.890606429 podStartE2EDuration="2m49.890606429s" podCreationTimestamp="2026-03-20 15:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:42.820912353 +0000 UTC m=+222.034283722" watchObservedRunningTime="2026-03-20 15:42:42.890606429 +0000 UTC m=+222.103977808"
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.905774    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-c2cgf" event={"ID":"b82f2ec3-df30-4b45-be3a-9858edb2bb7f","Type":"ContainerStarted","Data":"00bfad27e02bae0f610ea7ac00033db490360b11939b34bebce351221d4b2059"}
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.921362    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vqxff" event={"ID":"df4fa0ea-abb1-49ea-8d74-2992c71c1a0e","Type":"ContainerStarted","Data":"10181f70353f4226095b25118e92db2f30271e6da8103cbb61364f2175bbb612"}
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.924815    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:42 crc kubenswrapper[4730]: E0320 15:42:42.926180    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:43.426164054 +0000 UTC m=+222.639535423 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.945034    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qwghv" podStartSLOduration=169.945014184 podStartE2EDuration="2m49.945014184s" podCreationTimestamp="2026-03-20 15:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:42.912842724 +0000 UTC m=+222.126214093" watchObservedRunningTime="2026-03-20 15:42:42.945014184 +0000 UTC m=+222.158385553"
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.964719    4730 ???:1] "http: TLS handshake error from 192.168.126.11:34250: no serving certificate available for the kubelet"
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.968979    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-klbh8" podStartSLOduration=169.968956882 podStartE2EDuration="2m49.968956882s" podCreationTimestamp="2026-03-20 15:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:42.949224854 +0000 UTC m=+222.162596233" watchObservedRunningTime="2026-03-20 15:42:42.968956882 +0000 UTC m=+222.182328251"
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.973587    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vqxff" podStartSLOduration=169.973563984 podStartE2EDuration="2m49.973563984s" podCreationTimestamp="2026-03-20 15:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:42.965881367 +0000 UTC m=+222.179252736" watchObservedRunningTime="2026-03-20 15:42:42.973563984 +0000 UTC m=+222.186935353"
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.991908    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m7cfz" event={"ID":"4e2a7090-33b8-4137-be83-5c2e5ab1ccc7","Type":"ContainerStarted","Data":"f244d7380b63cf2232e2621e28ee1118acdc16f21fb149431cc044383ad81627"}
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.992604    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m7cfz"
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.997418    4730 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-m7cfz container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body=
Mar 20 15:42:42 crc kubenswrapper[4730]: I0320 15:42:42.997497    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m7cfz" podUID="4e2a7090-33b8-4137-be83-5c2e5ab1ccc7" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused"
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:42.999399    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-w5gdn" event={"ID":"c2d975f8-1a1e-4921-aef0-3c4652992a02","Type":"ContainerStarted","Data":"54cdbc914b6362b1ea7a8bc305eb5891b8cdc477d3fd9433625eb7cbb6fd09b1"}
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:42.999572    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-c2cgf" podStartSLOduration=169.999551934 podStartE2EDuration="2m49.999551934s" podCreationTimestamp="2026-03-20 15:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:42.997382127 +0000 UTC m=+222.210753496" watchObservedRunningTime="2026-03-20 15:42:42.999551934 +0000 UTC m=+222.212923313"
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.000979    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.026676    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:43 crc kubenswrapper[4730]: E0320 15:42:43.032540    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:43.532524279 +0000 UTC m=+222.745895648 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.040043    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7ckfm" event={"ID":"d04de14b-8e96-44ab-818f-2b08d78d2e14","Type":"ContainerStarted","Data":"125158b546ebdfa884a38ad81603b50a265f167bb28865172900b1954094a057"}
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.075982    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m7cfz" podStartSLOduration=170.075961246 podStartE2EDuration="2m50.075961246s" podCreationTimestamp="2026-03-20 15:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:43.02769085 +0000 UTC m=+222.241062209" watchObservedRunningTime="2026-03-20 15:42:43.075961246 +0000 UTC m=+222.289332615"
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.095746    4730 ???:1] "http: TLS handshake error from 192.168.126.11:34258: no serving certificate available for the kubelet"
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.098453    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-w5gdn" podStartSLOduration=170.098432338 podStartE2EDuration="2m50.098432338s" podCreationTimestamp="2026-03-20 15:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:43.096616352 +0000 UTC m=+222.309987721" watchObservedRunningTime="2026-03-20 15:42:43.098432338 +0000 UTC m=+222.311803707"
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.101911    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6rbg9" event={"ID":"d4e38bce-6ae6-451b-aa9f-7a98dfa4d974","Type":"ContainerStarted","Data":"17c1379d3c8019412ef2eeb677c3ea1322a4e563c714928d4f30b53ddef6736c"}
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.103066    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6rbg9"
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.127840    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:43 crc kubenswrapper[4730]: E0320 15:42:43.129360    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:43.62934441 +0000 UTC m=+222.842715779 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.156138    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dp6q" event={"ID":"5612cc7f-9299-43b4-b97c-cf579a416e84","Type":"ContainerStarted","Data":"c4dc87db1fe52d3f1e3497425d0229f7e1b66a568f93bb998a46373961c33bd7"}
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.167811    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qxnn6" event={"ID":"49896a92-a6b0-45ea-a736-09a368d90be4","Type":"ContainerStarted","Data":"db9154d5dd76ee3ecdb3c61eccfc8d1ee9c6bf92c29bb92fa63ea79dcea49104"}
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.183569    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m4xlq" event={"ID":"2083343b-2ec0-4306-a0a5-f74dd0f63746","Type":"ContainerStarted","Data":"4f7a2cb9dd6a91450eb10016bed75c95e0ee2f5ef2c55cd2382b1349dde38159"}
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.183617    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m4xlq" event={"ID":"2083343b-2ec0-4306-a0a5-f74dd0f63746","Type":"ContainerStarted","Data":"6900231cf56665b242f0c9c0d06301ddbd1f4646047717504ae334d7da77ec58"}
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.184206    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m4xlq"
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.188704    4730 ???:1] "http: TLS handshake error from 192.168.126.11:34264: no serving certificate available for the kubelet"
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.207185    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6rbg9" podStartSLOduration=171.207164116 podStartE2EDuration="2m51.207164116s" podCreationTimestamp="2026-03-20 15:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:43.145343512 +0000 UTC m=+222.358714881" watchObservedRunningTime="2026-03-20 15:42:43.207164116 +0000 UTC m=+222.420535495"
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.207971    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dp6q" podStartSLOduration=170.20796343 podStartE2EDuration="2m50.20796343s" podCreationTimestamp="2026-03-20 15:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:43.205607728 +0000 UTC m=+222.418979097" watchObservedRunningTime="2026-03-20 15:42:43.20796343 +0000 UTC m=+222.421334819"
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.224542    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7" event={"ID":"7662a0cc-faaa-47da-90f9-f3a8907a0401","Type":"ContainerStarted","Data":"7984f06d3ad268ae2f1beedd5ac29788c74cabc76e9bb68fead3a69c2cd21f8e"}
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.225962    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qxnn6" podStartSLOduration=170.225933154 podStartE2EDuration="2m50.225933154s" podCreationTimestamp="2026-03-20 15:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:43.225415248 +0000 UTC m=+222.438786617" watchObservedRunningTime="2026-03-20 15:42:43.225933154 +0000 UTC m=+222.439304513"
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.229841    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:43 crc kubenswrapper[4730]: E0320 15:42:43.230497    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:43.730466863 +0000 UTC m=+222.943838402 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.242053    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7hzc8" event={"ID":"8e224294-495e-4d65-96f2-8e0d2a444ef1","Type":"ContainerStarted","Data":"997a70846ad90a701274abde3d57150a6fa780b3ae6f7e74441f6d1aa47eafbd"}
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.264556    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-d69bc" event={"ID":"be19fb65-a04f-42df-9b96-e620b58754bb","Type":"ContainerStarted","Data":"647092b460bb07570b06908ca4f98239d0470ba3df7bb23adf207cb830d51de7"}
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.269184    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m4xlq" podStartSLOduration=170.269153994 podStartE2EDuration="2m50.269153994s" podCreationTimestamp="2026-03-20 15:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:43.26478794 +0000 UTC m=+222.478159309" watchObservedRunningTime="2026-03-20 15:42:43.269153994 +0000 UTC m=+222.482525363"
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.279702    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fjhxb" event={"ID":"dbb6ff6b-d521-408a-831c-a6a9c524a671","Type":"ContainerStarted","Data":"41f529b830bb386d3cb661870fe6a08af01d1525783d2c646e82019d8345f0a5"}
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.295224    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nfww4" event={"ID":"a250c56d-72fb-473d-98ce-c013e9d15b4a","Type":"ContainerStarted","Data":"3dda7ec3cc48a8d70f976fb14be79ca042ce7984f02f652763bdc83faf96249f"}
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.304216    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7hzc8" podStartSLOduration=170.304190873 podStartE2EDuration="2m50.304190873s" podCreationTimestamp="2026-03-20 15:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:43.299421696 +0000 UTC m=+222.512793055" watchObservedRunningTime="2026-03-20 15:42:43.304190873 +0000 UTC m=+222.517562242"
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.309539    4730 ???:1] "http: TLS handshake error from 192.168.126.11:34268: no serving certificate available for the kubelet"
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.329625    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-k6z2l" event={"ID":"5c0e41b3-aa2d-4083-acb2-f0f68a29fcce","Type":"ContainerStarted","Data":"7f508926108469d183ad53fc1cb87f9de68c82f3dedad0d66461e40806ddf1e0"}
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.330718    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:43 crc kubenswrapper[4730]: E0320 15:42:43.330996    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:43.830976788 +0000 UTC m=+223.044348157 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.332162    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:43 crc kubenswrapper[4730]: E0320 15:42:43.333079    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:43.833062992 +0000 UTC m=+223.046434361 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.366628    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-d69bc" podStartSLOduration=171.366609125 podStartE2EDuration="2m51.366609125s" podCreationTimestamp="2026-03-20 15:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:43.330093231 +0000 UTC m=+222.543464600" watchObservedRunningTime="2026-03-20 15:42:43.366609125 +0000 UTC m=+222.579980494"
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.369032    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sk8hp" event={"ID":"d9a5496e-57aa-4f42-b53d-590fb534d26e","Type":"ContainerStarted","Data":"c99c44110fd7accbd0cb3909619d6731f09534a84d37946809f5159851ac44a6"}
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.370602    4730 patch_prober.go:28] interesting pod/downloads-7954f5f757-g7hdt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body=
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.370678    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-g7hdt" podUID="d32c9cec-9f6c-4304-8bc9-d2e52128470a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused"
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.378552    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tgpgm"
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.392753    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7" podStartSLOduration=170.392733729 podStartE2EDuration="2m50.392733729s" podCreationTimestamp="2026-03-20 15:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:43.370982339 +0000 UTC m=+222.584353718" watchObservedRunningTime="2026-03-20 15:42:43.392733729 +0000 UTC m=+222.606105098"
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.394134    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-nfww4" podStartSLOduration=170.394129132 podStartE2EDuration="2m50.394129132s" podCreationTimestamp="2026-03-20 15:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:43.391898063 +0000 UTC m=+222.605269432" watchObservedRunningTime="2026-03-20 15:42:43.394129132 +0000 UTC m=+222.607500501"
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.400408    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-hrm7z"
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.400618    4730 ???:1] "http: TLS handshake error from 192.168.126.11:34270: no serving certificate available for the kubelet"
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.431674    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-k6z2l" podStartSLOduration=170.431656097 podStartE2EDuration="2m50.431656097s" podCreationTimestamp="2026-03-20 15:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:43.430719279 +0000 UTC m=+222.644090638" watchObservedRunningTime="2026-03-20 15:42:43.431656097 +0000 UTC m=+222.645027456"
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.433088    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sk8hp" podStartSLOduration=170.433083821 podStartE2EDuration="2m50.433083821s" podCreationTimestamp="2026-03-20 15:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:43.412231379 +0000 UTC m=+222.625602748" watchObservedRunningTime="2026-03-20 15:42:43.433083821 +0000 UTC m=+222.646455190"
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.434064    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:43 crc kubenswrapper[4730]: E0320 15:42:43.435224    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:43.935206717 +0000 UTC m=+223.148578086 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.463562    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fjhxb" podStartSLOduration=170.463539039 podStartE2EDuration="2m50.463539039s" podCreationTimestamp="2026-03-20 15:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:43.450408535 +0000 UTC m=+222.663779904" watchObservedRunningTime="2026-03-20 15:42:43.463539039 +0000 UTC m=+222.676910408"
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.518113    4730 ???:1] "http: TLS handshake error from 192.168.126.11:34278: no serving certificate available for the kubelet"
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.542170    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:43 crc kubenswrapper[4730]: E0320 15:42:43.542666    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:44.042647055 +0000 UTC m=+223.256018424 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.643568    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:43 crc kubenswrapper[4730]: E0320 15:42:43.643786    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:44.143754218 +0000 UTC m=+223.357125587 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.644073    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:43 crc kubenswrapper[4730]: E0320 15:42:43.644434    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:44.144422668 +0000 UTC m=+223.357794037 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.700008    4730 patch_prober.go:28] interesting pod/router-default-5444994796-92dt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 15:42:43 crc kubenswrapper[4730]: [-]has-synced failed: reason withheld
Mar 20 15:42:43 crc kubenswrapper[4730]: [+]process-running ok
Mar 20 15:42:43 crc kubenswrapper[4730]: healthz check failed
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.700066    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-92dt7" podUID="18214bd2-9c3a-4737-885b-2b5c905311d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.745414    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:43 crc kubenswrapper[4730]: E0320 15:42:43.745538    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:44.245513401 +0000 UTC m=+223.458884770 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.745622    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:43 crc kubenswrapper[4730]: E0320 15:42:43.745926    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:44.245915673 +0000 UTC m=+223.459287042 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.792672    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.792735    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.794855    4730 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-nsdw7 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body=
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.794907    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7" podUID="7662a0cc-faaa-47da-90f9-f3a8907a0401" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused"
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.847373    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:43 crc kubenswrapper[4730]: E0320 15:42:43.847715    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:44.347701197 +0000 UTC m=+223.561072566 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:43 crc kubenswrapper[4730]: I0320 15:42:43.948812    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:43 crc kubenswrapper[4730]: E0320 15:42:43.949237    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:44.449222313 +0000 UTC m=+223.662593682 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.039577    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-mkxg7"
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.050408    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:44 crc kubenswrapper[4730]: E0320 15:42:44.050882    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:44.550859522 +0000 UTC m=+223.764230891 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.152566    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:44 crc kubenswrapper[4730]: E0320 15:42:44.152943    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:44.652929484 +0000 UTC m=+223.866300863 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.200964    4730 ???:1] "http: TLS handshake error from 192.168.126.11:34054: no serving certificate available for the kubelet"
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.253841    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:44 crc kubenswrapper[4730]: E0320 15:42:44.253938    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:44.753920704 +0000 UTC m=+223.967292073 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.254166    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:44 crc kubenswrapper[4730]: E0320 15:42:44.254444    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:44.75443727 +0000 UTC m=+223.967808639 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.355258    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:44 crc kubenswrapper[4730]: E0320 15:42:44.355444    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:44.855416669 +0000 UTC m=+224.068788038 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.355608    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:44 crc kubenswrapper[4730]: E0320 15:42:44.355974    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:44.855917684 +0000 UTC m=+224.069289053 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.384453    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qxnn6" event={"ID":"49896a92-a6b0-45ea-a736-09a368d90be4","Type":"ContainerStarted","Data":"1d9d66ac03273f78a9f6a3ad1aae37d09c47a1b7f56e03ec96fee85dd6a4bb1e"}
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.392829    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7hzc8" event={"ID":"8e224294-495e-4d65-96f2-8e0d2a444ef1","Type":"ContainerStarted","Data":"44b34c7b3becd463763d417a566e9d007f7574f3829f6b84082984b14e6b178f"}
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.401807    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-84pdq" event={"ID":"e213a906-8ad6-45c1-b832-a42d58fd91c6","Type":"ContainerStarted","Data":"c12b4267a4d26c1887f06a65d0c0eef036471fefc0bd5f1566fbd06748d28a0b"}
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.401853    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-84pdq" event={"ID":"e213a906-8ad6-45c1-b832-a42d58fd91c6","Type":"ContainerStarted","Data":"978fae8413ee06ec2c75f391646b1b65e597ee8ba09b566039d9c24d7be1cc57"}
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.404713    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jgzlv" event={"ID":"428fa435-b92e-4363-82bb-40316d3e0a26","Type":"ContainerStarted","Data":"ea678b6938b223ad3eab964671c3e3289406901e88fa28062e8548fa322ccea9"}
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.407213    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7ckfm" event={"ID":"d04de14b-8e96-44ab-818f-2b08d78d2e14","Type":"ContainerStarted","Data":"dd47e8acc78b6239c0aa5d972a2690f7f8a6960538a611883344038e4c3fafb4"}
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.407926    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-7ckfm"
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.412584    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fjhxb" event={"ID":"dbb6ff6b-d521-408a-831c-a6a9c524a671","Type":"ContainerStarted","Data":"31836fab2b7c4a2c586a016e06173aaf758a2ad27b4f9986477e91cc46024853"}
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.425557    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-84pdq" podStartSLOduration=171.425541608 podStartE2EDuration="2m51.425541608s" podCreationTimestamp="2026-03-20 15:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:44.424057332 +0000 UTC m=+223.637428701" watchObservedRunningTime="2026-03-20 15:42:44.425541608 +0000 UTC m=+223.638912967"
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.425869    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dp6q" event={"ID":"5612cc7f-9299-43b4-b97c-cf579a416e84","Type":"ContainerStarted","Data":"bb882482d0e3a4b547690fe73d4699abed241d6c74988eb9d9229503d849a173"}
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.426467    4730 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-klbh8 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" start-of-body=
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.426505    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-klbh8" podUID="e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused"
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.426579    4730 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-m7cfz container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body=
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.426609    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m7cfz" podUID="4e2a7090-33b8-4137-be83-5c2e5ab1ccc7" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused"
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.426859    4730 patch_prober.go:28] interesting pod/downloads-7954f5f757-g7hdt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body=
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.426886    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-g7hdt" podUID="d32c9cec-9f6c-4304-8bc9-d2e52128470a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused"
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.433690    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.439373    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7fsc"
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.458928    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:44 crc kubenswrapper[4730]: E0320 15:42:44.459904    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:44.959887955 +0000 UTC m=+224.173259324 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.476974    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-7ckfm" podStartSLOduration=8.476957941 podStartE2EDuration="8.476957941s" podCreationTimestamp="2026-03-20 15:42:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:44.467819599 +0000 UTC m=+223.681190968" watchObservedRunningTime="2026-03-20 15:42:44.476957941 +0000 UTC m=+223.690329310"
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.554771    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-jzx77" podStartSLOduration=172.554754646 podStartE2EDuration="2m52.554754646s" podCreationTimestamp="2026-03-20 15:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:44.554120336 +0000 UTC m=+223.767491705" watchObservedRunningTime="2026-03-20 15:42:44.554754646 +0000 UTC m=+223.768126015"
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.566197    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:44 crc kubenswrapper[4730]: E0320 15:42:44.568654    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:45.068639093 +0000 UTC m=+224.282010462 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.667789    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:44 crc kubenswrapper[4730]: E0320 15:42:44.668327    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:45.168310492 +0000 UTC m=+224.381681861 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.704141    4730 patch_prober.go:28] interesting pod/router-default-5444994796-92dt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 15:42:44 crc kubenswrapper[4730]: [-]has-synced failed: reason withheld
Mar 20 15:42:44 crc kubenswrapper[4730]: [+]process-running ok
Mar 20 15:42:44 crc kubenswrapper[4730]: healthz check failed
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.704221    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-92dt7" podUID="18214bd2-9c3a-4737-885b-2b5c905311d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.769947    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:44 crc kubenswrapper[4730]: E0320 15:42:44.770361    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:45.270346174 +0000 UTC m=+224.483717543 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.872588    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:44 crc kubenswrapper[4730]: E0320 15:42:44.872893    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:45.37287777 +0000 UTC m=+224.586249139 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:44 crc kubenswrapper[4730]: I0320 15:42:44.973938    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:44 crc kubenswrapper[4730]: E0320 15:42:44.974303    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:45.474286783 +0000 UTC m=+224.687658152 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:45 crc kubenswrapper[4730]: I0320 15:42:45.010776    4730 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-6rbg9 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body=
Mar 20 15:42:45 crc kubenswrapper[4730]: I0320 15:42:45.010830    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6rbg9" podUID="d4e38bce-6ae6-451b-aa9f-7a98dfa4d974" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused"
Mar 20 15:42:45 crc kubenswrapper[4730]: I0320 15:42:45.011137    4730 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-6rbg9 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body=
Mar 20 15:42:45 crc kubenswrapper[4730]: I0320 15:42:45.011170    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6rbg9" podUID="d4e38bce-6ae6-451b-aa9f-7a98dfa4d974" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused"
Mar 20 15:42:45 crc kubenswrapper[4730]: I0320 15:42:45.075306    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:45 crc kubenswrapper[4730]: E0320 15:42:45.075688    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:45.575674374 +0000 UTC m=+224.789045743 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:45 crc kubenswrapper[4730]: I0320 15:42:45.123918    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hrm7z"]
Mar 20 15:42:45 crc kubenswrapper[4730]: I0320 15:42:45.166458    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tgpgm"]
Mar 20 15:42:45 crc kubenswrapper[4730]: I0320 15:42:45.177288    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:45 crc kubenswrapper[4730]: E0320 15:42:45.177641    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:45.677628753 +0000 UTC m=+224.891000122 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:45 crc kubenswrapper[4730]: I0320 15:42:45.278213    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:45 crc kubenswrapper[4730]: E0320 15:42:45.278339    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:45.778321103 +0000 UTC m=+224.991692472 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:45 crc kubenswrapper[4730]: I0320 15:42:45.278432    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:45 crc kubenswrapper[4730]: E0320 15:42:45.278739    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:45.778732196 +0000 UTC m=+224.992103565 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:45 crc kubenswrapper[4730]: I0320 15:42:45.379556    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:45 crc kubenswrapper[4730]: E0320 15:42:45.380053    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:45.880035605 +0000 UTC m=+225.093406974 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:45 crc kubenswrapper[4730]: I0320 15:42:45.430962    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-hrm7z" podUID="9a38d833-db72-4566-b139-7788730a502a" containerName="controller-manager" containerID="cri-o://2bd463f1b26cfe1b9822fd238c03679adad9d834d5b448d944528f6deceb749a" gracePeriod=30
Mar 20 15:42:45 crc kubenswrapper[4730]: I0320 15:42:45.432615    4730 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-klbh8 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" start-of-body=
Mar 20 15:42:45 crc kubenswrapper[4730]: I0320 15:42:45.432669    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-klbh8" podUID="e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused"
Mar 20 15:42:45 crc kubenswrapper[4730]: I0320 15:42:45.433312    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tgpgm" podUID="2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf" containerName="route-controller-manager" containerID="cri-o://c9ee68792188768f2e70a5a006ca33e1d9f850ef2974b3faf0e1e2bdb1cd6989" gracePeriod=30
Mar 20 15:42:45 crc kubenswrapper[4730]: I0320 15:42:45.449744    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m7cfz"
Mar 20 15:42:45 crc kubenswrapper[4730]: I0320 15:42:45.480783    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:45 crc kubenswrapper[4730]: E0320 15:42:45.481124    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:45.981108127 +0000 UTC m=+225.194479506 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:45 crc kubenswrapper[4730]: I0320 15:42:45.580278    4730 ???:1] "http: TLS handshake error from 192.168.126.11:34066: no serving certificate available for the kubelet"
Mar 20 15:42:45 crc kubenswrapper[4730]: I0320 15:42:45.581309    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:45 crc kubenswrapper[4730]: E0320 15:42:45.582794    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:46.082766287 +0000 UTC m=+225.296137656 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:45 crc kubenswrapper[4730]: I0320 15:42:45.683748    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:45 crc kubenswrapper[4730]: E0320 15:42:45.684268    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:46.184239461 +0000 UTC m=+225.397610830 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:45 crc kubenswrapper[4730]: I0320 15:42:45.712721    4730 patch_prober.go:28] interesting pod/router-default-5444994796-92dt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 15:42:45 crc kubenswrapper[4730]: [-]has-synced failed: reason withheld
Mar 20 15:42:45 crc kubenswrapper[4730]: [+]process-running ok
Mar 20 15:42:45 crc kubenswrapper[4730]: healthz check failed
Mar 20 15:42:45 crc kubenswrapper[4730]: I0320 15:42:45.712780    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-92dt7" podUID="18214bd2-9c3a-4737-885b-2b5c905311d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 15:42:45 crc kubenswrapper[4730]: I0320 15:42:45.785948    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:45 crc kubenswrapper[4730]: E0320 15:42:45.786314    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:46.286226071 +0000 UTC m=+225.499597460 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:45 crc kubenswrapper[4730]: I0320 15:42:45.786477    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:45 crc kubenswrapper[4730]: E0320 15:42:45.787001    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:46.286993755 +0000 UTC m=+225.500365124 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:45 crc kubenswrapper[4730]: I0320 15:42:45.891020    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:45 crc kubenswrapper[4730]: E0320 15:42:45.891589    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:46.391570614 +0000 UTC m=+225.604941983 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:45 crc kubenswrapper[4730]: I0320 15:42:45.993116    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:45 crc kubenswrapper[4730]: E0320 15:42:45.993408    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:46.493396879 +0000 UTC m=+225.706768248 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.069115    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tgpgm"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.094966    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:46 crc kubenswrapper[4730]: E0320 15:42:46.095385    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:46.595369818 +0000 UTC m=+225.808741187 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.197991    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf-serving-cert\") pod \"2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf\" (UID: \"2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf\") "
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.198403    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wb6sw\" (UniqueName: \"kubernetes.io/projected/2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf-kube-api-access-wb6sw\") pod \"2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf\" (UID: \"2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf\") "
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.198429    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf-client-ca\") pod \"2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf\" (UID: \"2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf\") "
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.198670    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf-config\") pod \"2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf\" (UID: \"2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf\") "
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.198843    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:46 crc kubenswrapper[4730]: E0320 15:42:46.199162    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:46.699150803 +0000 UTC m=+225.912522172 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.199965    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf-client-ca" (OuterVolumeSpecName: "client-ca") pod "2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf" (UID: "2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.200761    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf-config" (OuterVolumeSpecName: "config") pod "2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf" (UID: "2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.207591    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf" (UID: "2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.220721    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf-kube-api-access-wb6sw" (OuterVolumeSpecName: "kube-api-access-wb6sw") pod "2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf" (UID: "2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf"). InnerVolumeSpecName "kube-api-access-wb6sw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.299867    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:46 crc kubenswrapper[4730]: E0320 15:42:46.300022    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:46.799992658 +0000 UTC m=+226.013364027 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.300108    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.300208    4730 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.300343    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wb6sw\" (UniqueName: \"kubernetes.io/projected/2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf-kube-api-access-wb6sw\") on node \"crc\" DevicePath \"\""
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.300363    4730 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.300374    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:42:46 crc kubenswrapper[4730]: E0320 15:42:46.300417    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:46.800403541 +0000 UTC m=+226.013774970 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.406152    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:46 crc kubenswrapper[4730]: E0320 15:42:46.406470    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:46.906405874 +0000 UTC m=+226.119777243 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.408521    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:46 crc kubenswrapper[4730]: E0320 15:42:46.408883    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:46.90887469 +0000 UTC m=+226.122246059 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.432967    4730 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-6rbg9 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.433028    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6rbg9" podUID="d4e38bce-6ae6-451b-aa9f-7a98dfa4d974" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.445514    4730 generic.go:334] "Generic (PLEG): container finished" podID="2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf" containerID="c9ee68792188768f2e70a5a006ca33e1d9f850ef2974b3faf0e1e2bdb1cd6989" exitCode=0
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.445585    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tgpgm" event={"ID":"2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf","Type":"ContainerDied","Data":"c9ee68792188768f2e70a5a006ca33e1d9f850ef2974b3faf0e1e2bdb1cd6989"}
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.445620    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tgpgm" event={"ID":"2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf","Type":"ContainerDied","Data":"89071ca70267ff55db3a1c4b03f32d342093e047d26dbfb562535cb4096d8fec"}
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.445638    4730 scope.go:117] "RemoveContainer" containerID="c9ee68792188768f2e70a5a006ca33e1d9f850ef2974b3faf0e1e2bdb1cd6989"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.445784    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tgpgm"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.450738    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jgzlv" event={"ID":"428fa435-b92e-4363-82bb-40316d3e0a26","Type":"ContainerStarted","Data":"45d3b2fe2f625381a2a942dae09aff957cf42cdead6169d29b84cbb202701aaa"}
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.462298    4730 generic.go:334] "Generic (PLEG): container finished" podID="9a38d833-db72-4566-b139-7788730a502a" containerID="2bd463f1b26cfe1b9822fd238c03679adad9d834d5b448d944528f6deceb749a" exitCode=0
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.463946    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hrm7z" event={"ID":"9a38d833-db72-4566-b139-7788730a502a","Type":"ContainerDied","Data":"2bd463f1b26cfe1b9822fd238c03679adad9d834d5b448d944528f6deceb749a"}
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.488048    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tgpgm"]
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.490476    4730 scope.go:117] "RemoveContainer" containerID="c9ee68792188768f2e70a5a006ca33e1d9f850ef2974b3faf0e1e2bdb1cd6989"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.493854    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tgpgm"]
Mar 20 15:42:46 crc kubenswrapper[4730]: E0320 15:42:46.495649    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9ee68792188768f2e70a5a006ca33e1d9f850ef2974b3faf0e1e2bdb1cd6989\": container with ID starting with c9ee68792188768f2e70a5a006ca33e1d9f850ef2974b3faf0e1e2bdb1cd6989 not found: ID does not exist" containerID="c9ee68792188768f2e70a5a006ca33e1d9f850ef2974b3faf0e1e2bdb1cd6989"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.495687    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9ee68792188768f2e70a5a006ca33e1d9f850ef2974b3faf0e1e2bdb1cd6989"} err="failed to get container status \"c9ee68792188768f2e70a5a006ca33e1d9f850ef2974b3faf0e1e2bdb1cd6989\": rpc error: code = NotFound desc = could not find container \"c9ee68792188768f2e70a5a006ca33e1d9f850ef2974b3faf0e1e2bdb1cd6989\": container with ID starting with c9ee68792188768f2e70a5a006ca33e1d9f850ef2974b3faf0e1e2bdb1cd6989 not found: ID does not exist"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.511007    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:46 crc kubenswrapper[4730]: E0320 15:42:46.511239    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:47.0111898 +0000 UTC m=+226.224561169 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.511309    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:46 crc kubenswrapper[4730]: E0320 15:42:46.514454    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:47.01442296 +0000 UTC m=+226.227794329 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.567563    4730 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.613031    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:46 crc kubenswrapper[4730]: E0320 15:42:46.633397    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 15:42:47.133359652 +0000 UTC m=+226.346731021 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.700058    4730 patch_prober.go:28] interesting pod/router-default-5444994796-92dt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 15:42:46 crc kubenswrapper[4730]: [-]has-synced failed: reason withheld
Mar 20 15:42:46 crc kubenswrapper[4730]: [+]process-running ok
Mar 20 15:42:46 crc kubenswrapper[4730]: healthz check failed
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.700120    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-92dt7" podUID="18214bd2-9c3a-4737-885b-2b5c905311d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.735012    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:46 crc kubenswrapper[4730]: E0320 15:42:46.735579    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 15:42:47.235565769 +0000 UTC m=+226.448937138 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bdpg6" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.754020    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rlnqc"]
Mar 20 15:42:46 crc kubenswrapper[4730]: E0320 15:42:46.754790    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf" containerName="route-controller-manager"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.754817    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf" containerName="route-controller-manager"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.755120    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf" containerName="route-controller-manager"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.757964    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rlnqc"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.758925    4730 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-20T15:42:46.567595777Z","Handler":null,"Name":""}
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.762017    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.769132    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rlnqc"]
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.792682    4730 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.792745    4730 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.836185    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.836455    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98-utilities\") pod \"certified-operators-rlnqc\" (UID: \"e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98\") " pod="openshift-marketplace/certified-operators-rlnqc"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.836549    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98-catalog-content\") pod \"certified-operators-rlnqc\" (UID: \"e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98\") " pod="openshift-marketplace/certified-operators-rlnqc"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.836596    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgmhp\" (UniqueName: \"kubernetes.io/projected/e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98-kube-api-access-rgmhp\") pod \"certified-operators-rlnqc\" (UID: \"e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98\") " pod="openshift-marketplace/certified-operators-rlnqc"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.843758    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.860695    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hrm7z"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.910936    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-59447fbd49-wdtl4"]
Mar 20 15:42:46 crc kubenswrapper[4730]: E0320 15:42:46.911189    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a38d833-db72-4566-b139-7788730a502a" containerName="controller-manager"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.911207    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a38d833-db72-4566-b139-7788730a502a" containerName="controller-manager"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.911356    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a38d833-db72-4566-b139-7788730a502a" containerName="controller-manager"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.911796    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-59447fbd49-wdtl4"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.913645    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58875cfd6f-xthh9"]
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.914196    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58875cfd6f-xthh9"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.922775    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.923059    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.923201    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.923380    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.923502    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.925973    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.931377    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58875cfd6f-xthh9"]
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.935848    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-59447fbd49-wdtl4"]
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.937102    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tq5zm\" (UniqueName: \"kubernetes.io/projected/9a38d833-db72-4566-b139-7788730a502a-kube-api-access-tq5zm\") pod \"9a38d833-db72-4566-b139-7788730a502a\" (UID: \"9a38d833-db72-4566-b139-7788730a502a\") "
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.937239    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a38d833-db72-4566-b139-7788730a502a-serving-cert\") pod \"9a38d833-db72-4566-b139-7788730a502a\" (UID: \"9a38d833-db72-4566-b139-7788730a502a\") "
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.937292    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a38d833-db72-4566-b139-7788730a502a-config\") pod \"9a38d833-db72-4566-b139-7788730a502a\" (UID: \"9a38d833-db72-4566-b139-7788730a502a\") "
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.937330    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a38d833-db72-4566-b139-7788730a502a-client-ca\") pod \"9a38d833-db72-4566-b139-7788730a502a\" (UID: \"9a38d833-db72-4566-b139-7788730a502a\") "
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.937378    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9a38d833-db72-4566-b139-7788730a502a-proxy-ca-bundles\") pod \"9a38d833-db72-4566-b139-7788730a502a\" (UID: \"9a38d833-db72-4566-b139-7788730a502a\") "
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.937534    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgmhp\" (UniqueName: \"kubernetes.io/projected/e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98-kube-api-access-rgmhp\") pod \"certified-operators-rlnqc\" (UID: \"e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98\") " pod="openshift-marketplace/certified-operators-rlnqc"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.937562    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.937587    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98-utilities\") pod \"certified-operators-rlnqc\" (UID: \"e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98\") " pod="openshift-marketplace/certified-operators-rlnqc"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.937684    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98-catalog-content\") pod \"certified-operators-rlnqc\" (UID: \"e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98\") " pod="openshift-marketplace/certified-operators-rlnqc"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.938358    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98-catalog-content\") pod \"certified-operators-rlnqc\" (UID: \"e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98\") " pod="openshift-marketplace/certified-operators-rlnqc"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.938733    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98-utilities\") pod \"certified-operators-rlnqc\" (UID: \"e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98\") " pod="openshift-marketplace/certified-operators-rlnqc"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.939802    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a38d833-db72-4566-b139-7788730a502a-client-ca" (OuterVolumeSpecName: "client-ca") pod "9a38d833-db72-4566-b139-7788730a502a" (UID: "9a38d833-db72-4566-b139-7788730a502a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.940000    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a38d833-db72-4566-b139-7788730a502a-config" (OuterVolumeSpecName: "config") pod "9a38d833-db72-4566-b139-7788730a502a" (UID: "9a38d833-db72-4566-b139-7788730a502a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.940380    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a38d833-db72-4566-b139-7788730a502a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9a38d833-db72-4566-b139-7788730a502a" (UID: "9a38d833-db72-4566-b139-7788730a502a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.953239    4730 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.953653    4730 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.953688    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a38d833-db72-4566-b139-7788730a502a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9a38d833-db72-4566-b139-7788730a502a" (UID: "9a38d833-db72-4566-b139-7788730a502a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.957569    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a38d833-db72-4566-b139-7788730a502a-kube-api-access-tq5zm" (OuterVolumeSpecName: "kube-api-access-tq5zm") pod "9a38d833-db72-4566-b139-7788730a502a" (UID: "9a38d833-db72-4566-b139-7788730a502a"). InnerVolumeSpecName "kube-api-access-tq5zm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.960230    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgmhp\" (UniqueName: \"kubernetes.io/projected/e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98-kube-api-access-rgmhp\") pod \"certified-operators-rlnqc\" (UID: \"e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98\") " pod="openshift-marketplace/certified-operators-rlnqc"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.968602    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mbtfk"]
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.969667    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mbtfk"
Mar 20 15:42:46 crc kubenswrapper[4730]: I0320 15:42:46.987043    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.014289    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mbtfk"]
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.038894    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bdpg6\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") " pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.040616    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgkk5\" (UniqueName: \"kubernetes.io/projected/be49b904-0667-4d74-ac81-e84600f0835e-kube-api-access-dgkk5\") pod \"route-controller-manager-58875cfd6f-xthh9\" (UID: \"be49b904-0667-4d74-ac81-e84600f0835e\") " pod="openshift-route-controller-manager/route-controller-manager-58875cfd6f-xthh9"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.041057    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7422509-bd52-437b-9459-9c715c66fc33-proxy-ca-bundles\") pod \"controller-manager-59447fbd49-wdtl4\" (UID: \"d7422509-bd52-437b-9459-9c715c66fc33\") " pod="openshift-controller-manager/controller-manager-59447fbd49-wdtl4"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.041242    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2755g\" (UniqueName: \"kubernetes.io/projected/d7422509-bd52-437b-9459-9c715c66fc33-kube-api-access-2755g\") pod \"controller-manager-59447fbd49-wdtl4\" (UID: \"d7422509-bd52-437b-9459-9c715c66fc33\") " pod="openshift-controller-manager/controller-manager-59447fbd49-wdtl4"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.041634    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd6js\" (UniqueName: \"kubernetes.io/projected/d5addb8e-1dbc-41a2-8330-8a97251bd52f-kube-api-access-rd6js\") pod \"community-operators-mbtfk\" (UID: \"d5addb8e-1dbc-41a2-8330-8a97251bd52f\") " pod="openshift-marketplace/community-operators-mbtfk"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.041693    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be49b904-0667-4d74-ac81-e84600f0835e-serving-cert\") pod \"route-controller-manager-58875cfd6f-xthh9\" (UID: \"be49b904-0667-4d74-ac81-e84600f0835e\") " pod="openshift-route-controller-manager/route-controller-manager-58875cfd6f-xthh9"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.041809    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7422509-bd52-437b-9459-9c715c66fc33-client-ca\") pod \"controller-manager-59447fbd49-wdtl4\" (UID: \"d7422509-bd52-437b-9459-9c715c66fc33\") " pod="openshift-controller-manager/controller-manager-59447fbd49-wdtl4"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.041834    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/be49b904-0667-4d74-ac81-e84600f0835e-client-ca\") pod \"route-controller-manager-58875cfd6f-xthh9\" (UID: \"be49b904-0667-4d74-ac81-e84600f0835e\") " pod="openshift-route-controller-manager/route-controller-manager-58875cfd6f-xthh9"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.041884    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7422509-bd52-437b-9459-9c715c66fc33-serving-cert\") pod \"controller-manager-59447fbd49-wdtl4\" (UID: \"d7422509-bd52-437b-9459-9c715c66fc33\") " pod="openshift-controller-manager/controller-manager-59447fbd49-wdtl4"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.041978    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5addb8e-1dbc-41a2-8330-8a97251bd52f-catalog-content\") pod \"community-operators-mbtfk\" (UID: \"d5addb8e-1dbc-41a2-8330-8a97251bd52f\") " pod="openshift-marketplace/community-operators-mbtfk"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.042023    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5addb8e-1dbc-41a2-8330-8a97251bd52f-utilities\") pod \"community-operators-mbtfk\" (UID: \"d5addb8e-1dbc-41a2-8330-8a97251bd52f\") " pod="openshift-marketplace/community-operators-mbtfk"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.042039    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7422509-bd52-437b-9459-9c715c66fc33-config\") pod \"controller-manager-59447fbd49-wdtl4\" (UID: \"d7422509-bd52-437b-9459-9c715c66fc33\") " pod="openshift-controller-manager/controller-manager-59447fbd49-wdtl4"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.042122    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be49b904-0667-4d74-ac81-e84600f0835e-config\") pod \"route-controller-manager-58875cfd6f-xthh9\" (UID: \"be49b904-0667-4d74-ac81-e84600f0835e\") " pod="openshift-route-controller-manager/route-controller-manager-58875cfd6f-xthh9"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.042184    4730 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a38d833-db72-4566-b139-7788730a502a-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.042195    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a38d833-db72-4566-b139-7788730a502a-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.042203    4730 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a38d833-db72-4566-b139-7788730a502a-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.042212    4730 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9a38d833-db72-4566-b139-7788730a502a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.042221    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tq5zm\" (UniqueName: \"kubernetes.io/projected/9a38d833-db72-4566-b139-7788730a502a-kube-api-access-tq5zm\") on node \"crc\" DevicePath \"\""
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.052857    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.145953    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgkk5\" (UniqueName: \"kubernetes.io/projected/be49b904-0667-4d74-ac81-e84600f0835e-kube-api-access-dgkk5\") pod \"route-controller-manager-58875cfd6f-xthh9\" (UID: \"be49b904-0667-4d74-ac81-e84600f0835e\") " pod="openshift-route-controller-manager/route-controller-manager-58875cfd6f-xthh9"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.146007    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7422509-bd52-437b-9459-9c715c66fc33-proxy-ca-bundles\") pod \"controller-manager-59447fbd49-wdtl4\" (UID: \"d7422509-bd52-437b-9459-9c715c66fc33\") " pod="openshift-controller-manager/controller-manager-59447fbd49-wdtl4"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.146052    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2755g\" (UniqueName: \"kubernetes.io/projected/d7422509-bd52-437b-9459-9c715c66fc33-kube-api-access-2755g\") pod \"controller-manager-59447fbd49-wdtl4\" (UID: \"d7422509-bd52-437b-9459-9c715c66fc33\") " pod="openshift-controller-manager/controller-manager-59447fbd49-wdtl4"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.146077    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd6js\" (UniqueName: \"kubernetes.io/projected/d5addb8e-1dbc-41a2-8330-8a97251bd52f-kube-api-access-rd6js\") pod \"community-operators-mbtfk\" (UID: \"d5addb8e-1dbc-41a2-8330-8a97251bd52f\") " pod="openshift-marketplace/community-operators-mbtfk"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.146127    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be49b904-0667-4d74-ac81-e84600f0835e-serving-cert\") pod \"route-controller-manager-58875cfd6f-xthh9\" (UID: \"be49b904-0667-4d74-ac81-e84600f0835e\") " pod="openshift-route-controller-manager/route-controller-manager-58875cfd6f-xthh9"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.146166    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7422509-bd52-437b-9459-9c715c66fc33-client-ca\") pod \"controller-manager-59447fbd49-wdtl4\" (UID: \"d7422509-bd52-437b-9459-9c715c66fc33\") " pod="openshift-controller-manager/controller-manager-59447fbd49-wdtl4"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.146191    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/be49b904-0667-4d74-ac81-e84600f0835e-client-ca\") pod \"route-controller-manager-58875cfd6f-xthh9\" (UID: \"be49b904-0667-4d74-ac81-e84600f0835e\") " pod="openshift-route-controller-manager/route-controller-manager-58875cfd6f-xthh9"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.146232    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7422509-bd52-437b-9459-9c715c66fc33-serving-cert\") pod \"controller-manager-59447fbd49-wdtl4\" (UID: \"d7422509-bd52-437b-9459-9c715c66fc33\") " pod="openshift-controller-manager/controller-manager-59447fbd49-wdtl4"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.146275    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5addb8e-1dbc-41a2-8330-8a97251bd52f-catalog-content\") pod \"community-operators-mbtfk\" (UID: \"d5addb8e-1dbc-41a2-8330-8a97251bd52f\") " pod="openshift-marketplace/community-operators-mbtfk"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.146315    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5addb8e-1dbc-41a2-8330-8a97251bd52f-utilities\") pod \"community-operators-mbtfk\" (UID: \"d5addb8e-1dbc-41a2-8330-8a97251bd52f\") " pod="openshift-marketplace/community-operators-mbtfk"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.146339    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7422509-bd52-437b-9459-9c715c66fc33-config\") pod \"controller-manager-59447fbd49-wdtl4\" (UID: \"d7422509-bd52-437b-9459-9c715c66fc33\") " pod="openshift-controller-manager/controller-manager-59447fbd49-wdtl4"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.146361    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be49b904-0667-4d74-ac81-e84600f0835e-config\") pod \"route-controller-manager-58875cfd6f-xthh9\" (UID: \"be49b904-0667-4d74-ac81-e84600f0835e\") " pod="openshift-route-controller-manager/route-controller-manager-58875cfd6f-xthh9"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.148070    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5addb8e-1dbc-41a2-8330-8a97251bd52f-catalog-content\") pod \"community-operators-mbtfk\" (UID: \"d5addb8e-1dbc-41a2-8330-8a97251bd52f\") " pod="openshift-marketplace/community-operators-mbtfk"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.148146    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7422509-bd52-437b-9459-9c715c66fc33-client-ca\") pod \"controller-manager-59447fbd49-wdtl4\" (UID: \"d7422509-bd52-437b-9459-9c715c66fc33\") " pod="openshift-controller-manager/controller-manager-59447fbd49-wdtl4"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.148395    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be49b904-0667-4d74-ac81-e84600f0835e-config\") pod \"route-controller-manager-58875cfd6f-xthh9\" (UID: \"be49b904-0667-4d74-ac81-e84600f0835e\") " pod="openshift-route-controller-manager/route-controller-manager-58875cfd6f-xthh9"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.148444    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5addb8e-1dbc-41a2-8330-8a97251bd52f-utilities\") pod \"community-operators-mbtfk\" (UID: \"d5addb8e-1dbc-41a2-8330-8a97251bd52f\") " pod="openshift-marketplace/community-operators-mbtfk"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.148766    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/be49b904-0667-4d74-ac81-e84600f0835e-client-ca\") pod \"route-controller-manager-58875cfd6f-xthh9\" (UID: \"be49b904-0667-4d74-ac81-e84600f0835e\") " pod="openshift-route-controller-manager/route-controller-manager-58875cfd6f-xthh9"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.150284    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7422509-bd52-437b-9459-9c715c66fc33-config\") pod \"controller-manager-59447fbd49-wdtl4\" (UID: \"d7422509-bd52-437b-9459-9c715c66fc33\") " pod="openshift-controller-manager/controller-manager-59447fbd49-wdtl4"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.150435    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7422509-bd52-437b-9459-9c715c66fc33-proxy-ca-bundles\") pod \"controller-manager-59447fbd49-wdtl4\" (UID: \"d7422509-bd52-437b-9459-9c715c66fc33\") " pod="openshift-controller-manager/controller-manager-59447fbd49-wdtl4"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.152564    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rlnqc"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.166661    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7422509-bd52-437b-9459-9c715c66fc33-serving-cert\") pod \"controller-manager-59447fbd49-wdtl4\" (UID: \"d7422509-bd52-437b-9459-9c715c66fc33\") " pod="openshift-controller-manager/controller-manager-59447fbd49-wdtl4"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.166661    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be49b904-0667-4d74-ac81-e84600f0835e-serving-cert\") pod \"route-controller-manager-58875cfd6f-xthh9\" (UID: \"be49b904-0667-4d74-ac81-e84600f0835e\") " pod="openshift-route-controller-manager/route-controller-manager-58875cfd6f-xthh9"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.169356    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6mppz"]
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.170367    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6mppz"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.171221    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgkk5\" (UniqueName: \"kubernetes.io/projected/be49b904-0667-4d74-ac81-e84600f0835e-kube-api-access-dgkk5\") pod \"route-controller-manager-58875cfd6f-xthh9\" (UID: \"be49b904-0667-4d74-ac81-e84600f0835e\") " pod="openshift-route-controller-manager/route-controller-manager-58875cfd6f-xthh9"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.180265    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2755g\" (UniqueName: \"kubernetes.io/projected/d7422509-bd52-437b-9459-9c715c66fc33-kube-api-access-2755g\") pod \"controller-manager-59447fbd49-wdtl4\" (UID: \"d7422509-bd52-437b-9459-9c715c66fc33\") " pod="openshift-controller-manager/controller-manager-59447fbd49-wdtl4"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.181114    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd6js\" (UniqueName: \"kubernetes.io/projected/d5addb8e-1dbc-41a2-8330-8a97251bd52f-kube-api-access-rd6js\") pod \"community-operators-mbtfk\" (UID: \"d5addb8e-1dbc-41a2-8330-8a97251bd52f\") " pod="openshift-marketplace/community-operators-mbtfk"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.186530    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6mppz"]
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.247631    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/168c4cbd-3a44-48a5-be95-0eb4ea01d6c8-catalog-content\") pod \"certified-operators-6mppz\" (UID: \"168c4cbd-3a44-48a5-be95-0eb4ea01d6c8\") " pod="openshift-marketplace/certified-operators-6mppz"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.247685    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwqvn\" (UniqueName: \"kubernetes.io/projected/168c4cbd-3a44-48a5-be95-0eb4ea01d6c8-kube-api-access-vwqvn\") pod \"certified-operators-6mppz\" (UID: \"168c4cbd-3a44-48a5-be95-0eb4ea01d6c8\") " pod="openshift-marketplace/certified-operators-6mppz"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.247718    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/168c4cbd-3a44-48a5-be95-0eb4ea01d6c8-utilities\") pod \"certified-operators-6mppz\" (UID: \"168c4cbd-3a44-48a5-be95-0eb4ea01d6c8\") " pod="openshift-marketplace/certified-operators-6mppz"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.311986    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58875cfd6f-xthh9"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.312006    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-59447fbd49-wdtl4"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.313227    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bdpg6"]
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.360865    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/168c4cbd-3a44-48a5-be95-0eb4ea01d6c8-catalog-content\") pod \"certified-operators-6mppz\" (UID: \"168c4cbd-3a44-48a5-be95-0eb4ea01d6c8\") " pod="openshift-marketplace/certified-operators-6mppz"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.360975    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwqvn\" (UniqueName: \"kubernetes.io/projected/168c4cbd-3a44-48a5-be95-0eb4ea01d6c8-kube-api-access-vwqvn\") pod \"certified-operators-6mppz\" (UID: \"168c4cbd-3a44-48a5-be95-0eb4ea01d6c8\") " pod="openshift-marketplace/certified-operators-6mppz"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.361028    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/168c4cbd-3a44-48a5-be95-0eb4ea01d6c8-utilities\") pod \"certified-operators-6mppz\" (UID: \"168c4cbd-3a44-48a5-be95-0eb4ea01d6c8\") " pod="openshift-marketplace/certified-operators-6mppz"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.363960    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mbtfk"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.364915    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/168c4cbd-3a44-48a5-be95-0eb4ea01d6c8-utilities\") pod \"certified-operators-6mppz\" (UID: \"168c4cbd-3a44-48a5-be95-0eb4ea01d6c8\") " pod="openshift-marketplace/certified-operators-6mppz"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.365151    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/168c4cbd-3a44-48a5-be95-0eb4ea01d6c8-catalog-content\") pod \"certified-operators-6mppz\" (UID: \"168c4cbd-3a44-48a5-be95-0eb4ea01d6c8\") " pod="openshift-marketplace/certified-operators-6mppz"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.375177    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cx74p"]
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.376206    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cx74p"
Mar 20 15:42:47 crc kubenswrapper[4730]: W0320 15:42:47.384797    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54319e7c_f09f_4f3d_80cd_8d6dcd4ef88e.slice/crio-2585ec9b7501bfbd7ee8fe496b92574cacdd42248f1dc5fb5a3a27e92aa35714 WatchSource:0}: Error finding container 2585ec9b7501bfbd7ee8fe496b92574cacdd42248f1dc5fb5a3a27e92aa35714: Status 404 returned error can't find the container with id 2585ec9b7501bfbd7ee8fe496b92574cacdd42248f1dc5fb5a3a27e92aa35714
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.400307    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cx74p"]
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.417445    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwqvn\" (UniqueName: \"kubernetes.io/projected/168c4cbd-3a44-48a5-be95-0eb4ea01d6c8-kube-api-access-vwqvn\") pod \"certified-operators-6mppz\" (UID: \"168c4cbd-3a44-48a5-be95-0eb4ea01d6c8\") " pod="openshift-marketplace/certified-operators-6mppz"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.462243    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a118148-49cc-4b61-bb43-44e3ef2c3048-catalog-content\") pod \"community-operators-cx74p\" (UID: \"7a118148-49cc-4b61-bb43-44e3ef2c3048\") " pod="openshift-marketplace/community-operators-cx74p"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.462342    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5sxl\" (UniqueName: \"kubernetes.io/projected/7a118148-49cc-4b61-bb43-44e3ef2c3048-kube-api-access-v5sxl\") pod \"community-operators-cx74p\" (UID: \"7a118148-49cc-4b61-bb43-44e3ef2c3048\") " pod="openshift-marketplace/community-operators-cx74p"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.462623    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a118148-49cc-4b61-bb43-44e3ef2c3048-utilities\") pod \"community-operators-cx74p\" (UID: \"7a118148-49cc-4b61-bb43-44e3ef2c3048\") " pod="openshift-marketplace/community-operators-cx74p"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.470276    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6" event={"ID":"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e","Type":"ContainerStarted","Data":"2585ec9b7501bfbd7ee8fe496b92574cacdd42248f1dc5fb5a3a27e92aa35714"}
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.476424    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hrm7z" event={"ID":"9a38d833-db72-4566-b139-7788730a502a","Type":"ContainerDied","Data":"064c4bf4d5c66296dd98b82862d30c6c5583edf1808fe56690235f116a074683"}
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.476465    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hrm7z"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.476488    4730 scope.go:117] "RemoveContainer" containerID="2bd463f1b26cfe1b9822fd238c03679adad9d834d5b448d944528f6deceb749a"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.481267    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rlnqc"]
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.497302    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6mppz"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.497534    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jgzlv" event={"ID":"428fa435-b92e-4363-82bb-40316d3e0a26","Type":"ContainerStarted","Data":"67cb28a2ee07231be998e2f13dedddf5bf377cdc97da819cac1747f44545d3be"}
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.497585    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jgzlv" event={"ID":"428fa435-b92e-4363-82bb-40316d3e0a26","Type":"ContainerStarted","Data":"79a845f4d5200a2feec2b08c9e542e7fa3735695ac0ac59d83362c07fa8ae895"}
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.527215    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-jgzlv" podStartSLOduration=11.527194332 podStartE2EDuration="11.527194332s" podCreationTimestamp="2026-03-20 15:42:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:47.515580784 +0000 UTC m=+226.728952153" watchObservedRunningTime="2026-03-20 15:42:47.527194332 +0000 UTC m=+226.740565711"
Mar 20 15:42:47 crc kubenswrapper[4730]: W0320 15:42:47.539784    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1e9bea0_2eab_4ac3_ae73_6ad7bf4d7a98.slice/crio-3040ee2962d5bbb55e0748bc4e87b0b5953d222ba1e8e8b464bf1b4d1408cbfd WatchSource:0}: Error finding container 3040ee2962d5bbb55e0748bc4e87b0b5953d222ba1e8e8b464bf1b4d1408cbfd: Status 404 returned error can't find the container with id 3040ee2962d5bbb55e0748bc4e87b0b5953d222ba1e8e8b464bf1b4d1408cbfd
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.561743    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf" path="/var/lib/kubelet/pods/2b0a6f23-3cb9-4313-b5c0-78f7c0ffafcf/volumes"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.563051    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.563673    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hrm7z"]
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.563704    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hrm7z"]
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.566995    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a118148-49cc-4b61-bb43-44e3ef2c3048-utilities\") pod \"community-operators-cx74p\" (UID: \"7a118148-49cc-4b61-bb43-44e3ef2c3048\") " pod="openshift-marketplace/community-operators-cx74p"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.567051    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a118148-49cc-4b61-bb43-44e3ef2c3048-catalog-content\") pod \"community-operators-cx74p\" (UID: \"7a118148-49cc-4b61-bb43-44e3ef2c3048\") " pod="openshift-marketplace/community-operators-cx74p"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.567113    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5sxl\" (UniqueName: \"kubernetes.io/projected/7a118148-49cc-4b61-bb43-44e3ef2c3048-kube-api-access-v5sxl\") pod \"community-operators-cx74p\" (UID: \"7a118148-49cc-4b61-bb43-44e3ef2c3048\") " pod="openshift-marketplace/community-operators-cx74p"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.568870    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a118148-49cc-4b61-bb43-44e3ef2c3048-catalog-content\") pod \"community-operators-cx74p\" (UID: \"7a118148-49cc-4b61-bb43-44e3ef2c3048\") " pod="openshift-marketplace/community-operators-cx74p"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.570764    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a118148-49cc-4b61-bb43-44e3ef2c3048-utilities\") pod \"community-operators-cx74p\" (UID: \"7a118148-49cc-4b61-bb43-44e3ef2c3048\") " pod="openshift-marketplace/community-operators-cx74p"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.604789    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5sxl\" (UniqueName: \"kubernetes.io/projected/7a118148-49cc-4b61-bb43-44e3ef2c3048-kube-api-access-v5sxl\") pod \"community-operators-cx74p\" (UID: \"7a118148-49cc-4b61-bb43-44e3ef2c3048\") " pod="openshift-marketplace/community-operators-cx74p"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.656470    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58875cfd6f-xthh9"]
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.716464    4730 patch_prober.go:28] interesting pod/router-default-5444994796-92dt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 15:42:47 crc kubenswrapper[4730]: [-]has-synced failed: reason withheld
Mar 20 15:42:47 crc kubenswrapper[4730]: [+]process-running ok
Mar 20 15:42:47 crc kubenswrapper[4730]: healthz check failed
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.716516    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-92dt7" podUID="18214bd2-9c3a-4737-885b-2b5c905311d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.726519    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cx74p"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.835315    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mbtfk"]
Mar 20 15:42:47 crc kubenswrapper[4730]: W0320 15:42:47.849327    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5addb8e_1dbc_41a2_8330_8a97251bd52f.slice/crio-542af9b157419c81e78bae4e6b0035cde152a688b5cf204d5b3fe35157527e94 WatchSource:0}: Error finding container 542af9b157419c81e78bae4e6b0035cde152a688b5cf204d5b3fe35157527e94: Status 404 returned error can't find the container with id 542af9b157419c81e78bae4e6b0035cde152a688b5cf204d5b3fe35157527e94
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.991159    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.991934    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.996392    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Mar 20 15:42:47 crc kubenswrapper[4730]: I0320 15:42:47.997156    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.038333    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-59447fbd49-wdtl4"]
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.038810    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6rbg9"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.043380    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.075476    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cx74p"]
Mar 20 15:42:48 crc kubenswrapper[4730]: W0320 15:42:48.076167    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7422509_bd52_437b_9459_9c715c66fc33.slice/crio-c34933180d93ae0c413fe4e3ba797162b90bde253f6503dcbf0ea4405517bfca WatchSource:0}: Error finding container c34933180d93ae0c413fe4e3ba797162b90bde253f6503dcbf0ea4405517bfca: Status 404 returned error can't find the container with id c34933180d93ae0c413fe4e3ba797162b90bde253f6503dcbf0ea4405517bfca
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.079677    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ed812b57-bdba-4cd0-be71-859fe5d52eba-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ed812b57-bdba-4cd0-be71-859fe5d52eba\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.079851    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed812b57-bdba-4cd0-be71-859fe5d52eba-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ed812b57-bdba-4cd0-be71-859fe5d52eba\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.163655    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6mppz"]
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.179564    4730 ???:1] "http: TLS handshake error from 192.168.126.11:34080: no serving certificate available for the kubelet"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.181348    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed812b57-bdba-4cd0-be71-859fe5d52eba-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ed812b57-bdba-4cd0-be71-859fe5d52eba\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.181448    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ed812b57-bdba-4cd0-be71-859fe5d52eba-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ed812b57-bdba-4cd0-be71-859fe5d52eba\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.182225    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ed812b57-bdba-4cd0-be71-859fe5d52eba-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ed812b57-bdba-4cd0-be71-859fe5d52eba\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.203658    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed812b57-bdba-4cd0-be71-859fe5d52eba-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ed812b57-bdba-4cd0-be71-859fe5d52eba\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.294462    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.295224    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.297381    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.297512    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.307854    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.319569    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.384619    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/23069a50-0f37-4d67-8cfd-e7a569cc6c92-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"23069a50-0f37-4d67-8cfd-e7a569cc6c92\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.384799    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/23069a50-0f37-4d67-8cfd-e7a569cc6c92-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"23069a50-0f37-4d67-8cfd-e7a569cc6c92\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.490132    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/23069a50-0f37-4d67-8cfd-e7a569cc6c92-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"23069a50-0f37-4d67-8cfd-e7a569cc6c92\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.490858    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/23069a50-0f37-4d67-8cfd-e7a569cc6c92-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"23069a50-0f37-4d67-8cfd-e7a569cc6c92\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.497335    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/23069a50-0f37-4d67-8cfd-e7a569cc6c92-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"23069a50-0f37-4d67-8cfd-e7a569cc6c92\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.511651    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/23069a50-0f37-4d67-8cfd-e7a569cc6c92-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"23069a50-0f37-4d67-8cfd-e7a569cc6c92\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.526518    4730 generic.go:334] "Generic (PLEG): container finished" podID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" containerID="0936786d2af592781681255288d0bc8bfa0e6ea172412747ed1615c053176e9a" exitCode=0
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.526605    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlnqc" event={"ID":"e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98","Type":"ContainerDied","Data":"0936786d2af592781681255288d0bc8bfa0e6ea172412747ed1615c053176e9a"}
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.526633    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlnqc" event={"ID":"e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98","Type":"ContainerStarted","Data":"3040ee2962d5bbb55e0748bc4e87b0b5953d222ba1e8e8b464bf1b4d1408cbfd"}
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.532774    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59447fbd49-wdtl4" event={"ID":"d7422509-bd52-437b-9459-9c715c66fc33","Type":"ContainerStarted","Data":"97c7c0bbae68297bff265d2fc691b5c0a2a1e2f15cfbe251cb407d9a746b7e48"}
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.532841    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59447fbd49-wdtl4" event={"ID":"d7422509-bd52-437b-9459-9c715c66fc33","Type":"ContainerStarted","Data":"c34933180d93ae0c413fe4e3ba797162b90bde253f6503dcbf0ea4405517bfca"}
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.533965    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-59447fbd49-wdtl4"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.554575    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58875cfd6f-xthh9" event={"ID":"be49b904-0667-4d74-ac81-e84600f0835e","Type":"ContainerStarted","Data":"067f0501bb28ee46c4dab9d0b265af305bd5580b555aabc3f22ccc19201445e0"}
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.554621    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58875cfd6f-xthh9" event={"ID":"be49b904-0667-4d74-ac81-e84600f0835e","Type":"ContainerStarted","Data":"e460ec37f646594e2bfbe40ea47d0b74f2dd0eede1593eeb04f2a323e4f35cf9"}
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.554636    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-58875cfd6f-xthh9"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.571104    4730 generic.go:334] "Generic (PLEG): container finished" podID="7a118148-49cc-4b61-bb43-44e3ef2c3048" containerID="18e38e92f40a21b89290233a7ffe301a018e759b353ad6883c83ed52a1e47762" exitCode=0
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.571225    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cx74p" event={"ID":"7a118148-49cc-4b61-bb43-44e3ef2c3048","Type":"ContainerDied","Data":"18e38e92f40a21b89290233a7ffe301a018e759b353ad6883c83ed52a1e47762"}
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.571277    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cx74p" event={"ID":"7a118148-49cc-4b61-bb43-44e3ef2c3048","Type":"ContainerStarted","Data":"189336796f3983b3d071ba1d85e4dc4b864a1692cfe884a4465fbc035616d986"}
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.597521    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-59447fbd49-wdtl4"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.597745    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-59447fbd49-wdtl4" podStartSLOduration=3.597721972 podStartE2EDuration="3.597721972s" podCreationTimestamp="2026-03-20 15:42:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:48.596660949 +0000 UTC m=+227.810032318" watchObservedRunningTime="2026-03-20 15:42:48.597721972 +0000 UTC m=+227.811093341"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.598687    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-58875cfd6f-xthh9"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.606850    4730 generic.go:334] "Generic (PLEG): container finished" podID="d5addb8e-1dbc-41a2-8330-8a97251bd52f" containerID="1ce763ed176ec4f4dede58163ade6fda497d7444c6c6f195c24a524a711de167" exitCode=0
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.606969    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbtfk" event={"ID":"d5addb8e-1dbc-41a2-8330-8a97251bd52f","Type":"ContainerDied","Data":"1ce763ed176ec4f4dede58163ade6fda497d7444c6c6f195c24a524a711de167"}
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.607021    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbtfk" event={"ID":"d5addb8e-1dbc-41a2-8330-8a97251bd52f","Type":"ContainerStarted","Data":"542af9b157419c81e78bae4e6b0035cde152a688b5cf204d5b3fe35157527e94"}
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.609865    4730 generic.go:334] "Generic (PLEG): container finished" podID="be19fb65-a04f-42df-9b96-e620b58754bb" containerID="647092b460bb07570b06908ca4f98239d0470ba3df7bb23adf207cb830d51de7" exitCode=0
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.609962    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-d69bc" event={"ID":"be19fb65-a04f-42df-9b96-e620b58754bb","Type":"ContainerDied","Data":"647092b460bb07570b06908ca4f98239d0470ba3df7bb23adf207cb830d51de7"}
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.628669    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6" event={"ID":"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e","Type":"ContainerStarted","Data":"deb22e7f4b1084e112c47fe5c83d1b16157bc080fc85d2ff74bb28b439a9502d"}
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.628888    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.631722    4730 generic.go:334] "Generic (PLEG): container finished" podID="168c4cbd-3a44-48a5-be95-0eb4ea01d6c8" containerID="f14d75be5024f0d9fd4c3cf59a10c4fbb452ecc1d6a3188f6fd40ab5bbb8ffe9" exitCode=0
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.632660    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.632999    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6mppz" event={"ID":"168c4cbd-3a44-48a5-be95-0eb4ea01d6c8","Type":"ContainerDied","Data":"f14d75be5024f0d9fd4c3cf59a10c4fbb452ecc1d6a3188f6fd40ab5bbb8ffe9"}
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.633042    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6mppz" event={"ID":"168c4cbd-3a44-48a5-be95-0eb4ea01d6c8","Type":"ContainerStarted","Data":"b24679ab1f6c7ce28e2ed00c17a4988d013e4500b53404671b46ef5509b85dc8"}
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.634658    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-58875cfd6f-xthh9" podStartSLOduration=3.634623648 podStartE2EDuration="3.634623648s" podCreationTimestamp="2026-03-20 15:42:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:48.628166489 +0000 UTC m=+227.841537858" watchObservedRunningTime="2026-03-20 15:42:48.634623648 +0000 UTC m=+227.847995017"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.659834    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.697450    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-92dt7"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.709113    4730 patch_prober.go:28] interesting pod/router-default-5444994796-92dt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 15:42:48 crc kubenswrapper[4730]: [-]has-synced failed: reason withheld
Mar 20 15:42:48 crc kubenswrapper[4730]: [+]process-running ok
Mar 20 15:42:48 crc kubenswrapper[4730]: healthz check failed
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.710889    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-92dt7" podUID="18214bd2-9c3a-4737-885b-2b5c905311d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.753027    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-flpw2"]
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.756962    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-flpw2"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.762437    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.779636    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-flpw2"]
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.813273    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.822959    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nsdw7"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.840823    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-9kgl8"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.840942    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-9kgl8"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.844473    4730 patch_prober.go:28] interesting pod/console-f9d7485db-9kgl8 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body=
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.844549    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-9kgl8" podUID="5edbd5a9-6c8b-4ef8-950f-58deaecf36ee" containerName="console" probeResult="failure" output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.856405    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6" podStartSLOduration=175.856385825 podStartE2EDuration="2m55.856385825s" podCreationTimestamp="2026-03-20 15:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:48.815441205 +0000 UTC m=+228.028812584" watchObservedRunningTime="2026-03-20 15:42:48.856385825 +0000 UTC m=+228.069757194"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.897468    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a347883-e4f7-4fcd-8920-59519533cf43-catalog-content\") pod \"redhat-marketplace-flpw2\" (UID: \"5a347883-e4f7-4fcd-8920-59519533cf43\") " pod="openshift-marketplace/redhat-marketplace-flpw2"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.897629    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a347883-e4f7-4fcd-8920-59519533cf43-utilities\") pod \"redhat-marketplace-flpw2\" (UID: \"5a347883-e4f7-4fcd-8920-59519533cf43\") " pod="openshift-marketplace/redhat-marketplace-flpw2"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.897691    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs46z\" (UniqueName: \"kubernetes.io/projected/5a347883-e4f7-4fcd-8920-59519533cf43-kube-api-access-hs46z\") pod \"redhat-marketplace-flpw2\" (UID: \"5a347883-e4f7-4fcd-8920-59519533cf43\") " pod="openshift-marketplace/redhat-marketplace-flpw2"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.943742    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.944200    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.971440    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.999134    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a347883-e4f7-4fcd-8920-59519533cf43-utilities\") pod \"redhat-marketplace-flpw2\" (UID: \"5a347883-e4f7-4fcd-8920-59519533cf43\") " pod="openshift-marketplace/redhat-marketplace-flpw2"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.999267    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs46z\" (UniqueName: \"kubernetes.io/projected/5a347883-e4f7-4fcd-8920-59519533cf43-kube-api-access-hs46z\") pod \"redhat-marketplace-flpw2\" (UID: \"5a347883-e4f7-4fcd-8920-59519533cf43\") " pod="openshift-marketplace/redhat-marketplace-flpw2"
Mar 20 15:42:48 crc kubenswrapper[4730]: I0320 15:42:48.999390    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a347883-e4f7-4fcd-8920-59519533cf43-catalog-content\") pod \"redhat-marketplace-flpw2\" (UID: \"5a347883-e4f7-4fcd-8920-59519533cf43\") " pod="openshift-marketplace/redhat-marketplace-flpw2"
Mar 20 15:42:49 crc kubenswrapper[4730]: I0320 15:42:49.001137    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a347883-e4f7-4fcd-8920-59519533cf43-catalog-content\") pod \"redhat-marketplace-flpw2\" (UID: \"5a347883-e4f7-4fcd-8920-59519533cf43\") " pod="openshift-marketplace/redhat-marketplace-flpw2"
Mar 20 15:42:49 crc kubenswrapper[4730]: I0320 15:42:49.002576    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a347883-e4f7-4fcd-8920-59519533cf43-utilities\") pod \"redhat-marketplace-flpw2\" (UID: \"5a347883-e4f7-4fcd-8920-59519533cf43\") " pod="openshift-marketplace/redhat-marketplace-flpw2"
Mar 20 15:42:49 crc kubenswrapper[4730]: I0320 15:42:49.060338    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs46z\" (UniqueName: \"kubernetes.io/projected/5a347883-e4f7-4fcd-8920-59519533cf43-kube-api-access-hs46z\") pod \"redhat-marketplace-flpw2\" (UID: \"5a347883-e4f7-4fcd-8920-59519533cf43\") " pod="openshift-marketplace/redhat-marketplace-flpw2"
Mar 20 15:42:49 crc kubenswrapper[4730]: I0320 15:42:49.105402    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-flpw2"
Mar 20 15:42:49 crc kubenswrapper[4730]: I0320 15:42:49.159427    4730 patch_prober.go:28] interesting pod/downloads-7954f5f757-g7hdt container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body=
Mar 20 15:42:49 crc kubenswrapper[4730]: I0320 15:42:49.159494    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-g7hdt" podUID="d32c9cec-9f6c-4304-8bc9-d2e52128470a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused"
Mar 20 15:42:49 crc kubenswrapper[4730]: I0320 15:42:49.159427    4730 patch_prober.go:28] interesting pod/downloads-7954f5f757-g7hdt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body=
Mar 20 15:42:49 crc kubenswrapper[4730]: I0320 15:42:49.159758    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-g7hdt" podUID="d32c9cec-9f6c-4304-8bc9-d2e52128470a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused"
Mar 20 15:42:49 crc kubenswrapper[4730]: I0320 15:42:49.178369    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 20 15:42:49 crc kubenswrapper[4730]: I0320 15:42:49.188069    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2z2hv"]
Mar 20 15:42:49 crc kubenswrapper[4730]: I0320 15:42:49.189139    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2z2hv"
Mar 20 15:42:49 crc kubenswrapper[4730]: I0320 15:42:49.201628    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2z2hv"]
Mar 20 15:42:49 crc kubenswrapper[4730]: I0320 15:42:49.306049    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715cbff8-9674-4896-8deb-54a6e9a8899e-utilities\") pod \"redhat-marketplace-2z2hv\" (UID: \"715cbff8-9674-4896-8deb-54a6e9a8899e\") " pod="openshift-marketplace/redhat-marketplace-2z2hv"
Mar 20 15:42:49 crc kubenswrapper[4730]: I0320 15:42:49.306521    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715cbff8-9674-4896-8deb-54a6e9a8899e-catalog-content\") pod \"redhat-marketplace-2z2hv\" (UID: \"715cbff8-9674-4896-8deb-54a6e9a8899e\") " pod="openshift-marketplace/redhat-marketplace-2z2hv"
Mar 20 15:42:49 crc kubenswrapper[4730]: I0320 15:42:49.306585    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmf5w\" (UniqueName: \"kubernetes.io/projected/715cbff8-9674-4896-8deb-54a6e9a8899e-kube-api-access-pmf5w\") pod \"redhat-marketplace-2z2hv\" (UID: \"715cbff8-9674-4896-8deb-54a6e9a8899e\") " pod="openshift-marketplace/redhat-marketplace-2z2hv"
Mar 20 15:42:49 crc kubenswrapper[4730]: I0320 15:42:49.408044    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715cbff8-9674-4896-8deb-54a6e9a8899e-utilities\") pod \"redhat-marketplace-2z2hv\" (UID: \"715cbff8-9674-4896-8deb-54a6e9a8899e\") " pod="openshift-marketplace/redhat-marketplace-2z2hv"
Mar 20 15:42:49 crc kubenswrapper[4730]: I0320 15:42:49.408137    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715cbff8-9674-4896-8deb-54a6e9a8899e-catalog-content\") pod \"redhat-marketplace-2z2hv\" (UID: \"715cbff8-9674-4896-8deb-54a6e9a8899e\") " pod="openshift-marketplace/redhat-marketplace-2z2hv"
Mar 20 15:42:49 crc kubenswrapper[4730]: I0320 15:42:49.408198    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmf5w\" (UniqueName: \"kubernetes.io/projected/715cbff8-9674-4896-8deb-54a6e9a8899e-kube-api-access-pmf5w\") pod \"redhat-marketplace-2z2hv\" (UID: \"715cbff8-9674-4896-8deb-54a6e9a8899e\") " pod="openshift-marketplace/redhat-marketplace-2z2hv"
Mar 20 15:42:49 crc kubenswrapper[4730]: I0320 15:42:49.409110    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715cbff8-9674-4896-8deb-54a6e9a8899e-catalog-content\") pod \"redhat-marketplace-2z2hv\" (UID: \"715cbff8-9674-4896-8deb-54a6e9a8899e\") " pod="openshift-marketplace/redhat-marketplace-2z2hv"
Mar 20 15:42:49 crc kubenswrapper[4730]: I0320 15:42:49.409133    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715cbff8-9674-4896-8deb-54a6e9a8899e-utilities\") pod \"redhat-marketplace-2z2hv\" (UID: \"715cbff8-9674-4896-8deb-54a6e9a8899e\") " pod="openshift-marketplace/redhat-marketplace-2z2hv"
Mar 20 15:42:49 crc kubenswrapper[4730]: I0320 15:42:49.463110    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmf5w\" (UniqueName: \"kubernetes.io/projected/715cbff8-9674-4896-8deb-54a6e9a8899e-kube-api-access-pmf5w\") pod \"redhat-marketplace-2z2hv\" (UID: \"715cbff8-9674-4896-8deb-54a6e9a8899e\") " pod="openshift-marketplace/redhat-marketplace-2z2hv"
Mar 20 15:42:49 crc kubenswrapper[4730]: I0320 15:42:49.564233    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a38d833-db72-4566-b139-7788730a502a" path="/var/lib/kubelet/pods/9a38d833-db72-4566-b139-7788730a502a/volumes"
Mar 20 15:42:49 crc kubenswrapper[4730]: I0320 15:42:49.601377    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2z2hv"
Mar 20 15:42:49 crc kubenswrapper[4730]: I0320 15:42:49.698856    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"23069a50-0f37-4d67-8cfd-e7a569cc6c92","Type":"ContainerStarted","Data":"ada4a56e9a542a91af4dbeb6d4cfcf6cb1bc38e39d46c8e511a8939bc38e8ff7"}
Mar 20 15:42:49 crc kubenswrapper[4730]: I0320 15:42:49.699203    4730 patch_prober.go:28] interesting pod/router-default-5444994796-92dt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 15:42:49 crc kubenswrapper[4730]: [-]has-synced failed: reason withheld
Mar 20 15:42:49 crc kubenswrapper[4730]: [+]process-running ok
Mar 20 15:42:49 crc kubenswrapper[4730]: healthz check failed
Mar 20 15:42:49 crc kubenswrapper[4730]: I0320 15:42:49.699236    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-92dt7" podUID="18214bd2-9c3a-4737-885b-2b5c905311d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 15:42:49 crc kubenswrapper[4730]: I0320 15:42:49.734763    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ed812b57-bdba-4cd0-be71-859fe5d52eba","Type":"ContainerStarted","Data":"519bceba9c186c0dc594c407511694240cba2dedd1d4d79a064d1216c34bbca8"}
Mar 20 15:42:49 crc kubenswrapper[4730]: I0320 15:42:49.734802    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ed812b57-bdba-4cd0-be71-859fe5d52eba","Type":"ContainerStarted","Data":"6a0cd6655c2e9c825b00626c88656cb2f35f4bd32136c526b0ab430d8a3a7e0e"}
Mar 20 15:42:49 crc kubenswrapper[4730]: I0320 15:42:49.751509    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-jzx77"
Mar 20 15:42:49 crc kubenswrapper[4730]: I0320 15:42:49.761559    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-klbh8"
Mar 20 15:42:49 crc kubenswrapper[4730]: I0320 15:42:49.769699    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.769680493 podStartE2EDuration="2.769680493s" podCreationTimestamp="2026-03-20 15:42:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:42:49.76794308 +0000 UTC m=+228.981314449" watchObservedRunningTime="2026-03-20 15:42:49.769680493 +0000 UTC m=+228.983051862"
Mar 20 15:42:49 crc kubenswrapper[4730]: I0320 15:42:49.992862    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-flpw2"]
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.090888    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2z2hv"]
Mar 20 15:42:50 crc kubenswrapper[4730]: W0320 15:42:50.124394    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod715cbff8_9674_4896_8deb_54a6e9a8899e.slice/crio-8cd9a9489d9a0dbe8438d77747b3f3bcf8afc79d1dcd6dcfc2035db6222041b2 WatchSource:0}: Error finding container 8cd9a9489d9a0dbe8438d77747b3f3bcf8afc79d1dcd6dcfc2035db6222041b2: Status 404 returned error can't find the container with id 8cd9a9489d9a0dbe8438d77747b3f3bcf8afc79d1dcd6dcfc2035db6222041b2
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.155623    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8rptq"]
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.156648    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8rptq"
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.159602    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.176276    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8rptq"]
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.227372    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgx8f\" (UniqueName: \"kubernetes.io/projected/558b00fd-2589-4842-8cba-db0cffe8c826-kube-api-access-kgx8f\") pod \"redhat-operators-8rptq\" (UID: \"558b00fd-2589-4842-8cba-db0cffe8c826\") " pod="openshift-marketplace/redhat-operators-8rptq"
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.227456    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/558b00fd-2589-4842-8cba-db0cffe8c826-utilities\") pod \"redhat-operators-8rptq\" (UID: \"558b00fd-2589-4842-8cba-db0cffe8c826\") " pod="openshift-marketplace/redhat-operators-8rptq"
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.227493    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/558b00fd-2589-4842-8cba-db0cffe8c826-catalog-content\") pod \"redhat-operators-8rptq\" (UID: \"558b00fd-2589-4842-8cba-db0cffe8c826\") " pod="openshift-marketplace/redhat-operators-8rptq"
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.329380    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgx8f\" (UniqueName: \"kubernetes.io/projected/558b00fd-2589-4842-8cba-db0cffe8c826-kube-api-access-kgx8f\") pod \"redhat-operators-8rptq\" (UID: \"558b00fd-2589-4842-8cba-db0cffe8c826\") " pod="openshift-marketplace/redhat-operators-8rptq"
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.329490    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/558b00fd-2589-4842-8cba-db0cffe8c826-utilities\") pod \"redhat-operators-8rptq\" (UID: \"558b00fd-2589-4842-8cba-db0cffe8c826\") " pod="openshift-marketplace/redhat-operators-8rptq"
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.329546    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/558b00fd-2589-4842-8cba-db0cffe8c826-catalog-content\") pod \"redhat-operators-8rptq\" (UID: \"558b00fd-2589-4842-8cba-db0cffe8c826\") " pod="openshift-marketplace/redhat-operators-8rptq"
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.330223    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/558b00fd-2589-4842-8cba-db0cffe8c826-catalog-content\") pod \"redhat-operators-8rptq\" (UID: \"558b00fd-2589-4842-8cba-db0cffe8c826\") " pod="openshift-marketplace/redhat-operators-8rptq"
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.330296    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/558b00fd-2589-4842-8cba-db0cffe8c826-utilities\") pod \"redhat-operators-8rptq\" (UID: \"558b00fd-2589-4842-8cba-db0cffe8c826\") " pod="openshift-marketplace/redhat-operators-8rptq"
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.377490    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgx8f\" (UniqueName: \"kubernetes.io/projected/558b00fd-2589-4842-8cba-db0cffe8c826-kube-api-access-kgx8f\") pod \"redhat-operators-8rptq\" (UID: \"558b00fd-2589-4842-8cba-db0cffe8c826\") " pod="openshift-marketplace/redhat-operators-8rptq"
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.494196    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8rptq"
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.546160    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qmxvf"]
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.549216    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qmxvf"
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.559959    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qmxvf"]
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.639756    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c86g2\" (UniqueName: \"kubernetes.io/projected/ab6c90a0-1bc1-476d-8526-d1fe438163e3-kube-api-access-c86g2\") pod \"redhat-operators-qmxvf\" (UID: \"ab6c90a0-1bc1-476d-8526-d1fe438163e3\") " pod="openshift-marketplace/redhat-operators-qmxvf"
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.639839    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab6c90a0-1bc1-476d-8526-d1fe438163e3-catalog-content\") pod \"redhat-operators-qmxvf\" (UID: \"ab6c90a0-1bc1-476d-8526-d1fe438163e3\") " pod="openshift-marketplace/redhat-operators-qmxvf"
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.639879    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab6c90a0-1bc1-476d-8526-d1fe438163e3-utilities\") pod \"redhat-operators-qmxvf\" (UID: \"ab6c90a0-1bc1-476d-8526-d1fe438163e3\") " pod="openshift-marketplace/redhat-operators-qmxvf"
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.660267    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-d69bc"
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.710520    4730 patch_prober.go:28] interesting pod/router-default-5444994796-92dt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 15:42:50 crc kubenswrapper[4730]: [-]has-synced failed: reason withheld
Mar 20 15:42:50 crc kubenswrapper[4730]: [+]process-running ok
Mar 20 15:42:50 crc kubenswrapper[4730]: healthz check failed
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.710594    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-92dt7" podUID="18214bd2-9c3a-4737-885b-2b5c905311d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.744078    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbnh9\" (UniqueName: \"kubernetes.io/projected/be19fb65-a04f-42df-9b96-e620b58754bb-kube-api-access-wbnh9\") pod \"be19fb65-a04f-42df-9b96-e620b58754bb\" (UID: \"be19fb65-a04f-42df-9b96-e620b58754bb\") "
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.744306    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be19fb65-a04f-42df-9b96-e620b58754bb-secret-volume\") pod \"be19fb65-a04f-42df-9b96-e620b58754bb\" (UID: \"be19fb65-a04f-42df-9b96-e620b58754bb\") "
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.744349    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be19fb65-a04f-42df-9b96-e620b58754bb-config-volume\") pod \"be19fb65-a04f-42df-9b96-e620b58754bb\" (UID: \"be19fb65-a04f-42df-9b96-e620b58754bb\") "
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.745020    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c86g2\" (UniqueName: \"kubernetes.io/projected/ab6c90a0-1bc1-476d-8526-d1fe438163e3-kube-api-access-c86g2\") pod \"redhat-operators-qmxvf\" (UID: \"ab6c90a0-1bc1-476d-8526-d1fe438163e3\") " pod="openshift-marketplace/redhat-operators-qmxvf"
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.745126    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab6c90a0-1bc1-476d-8526-d1fe438163e3-catalog-content\") pod \"redhat-operators-qmxvf\" (UID: \"ab6c90a0-1bc1-476d-8526-d1fe438163e3\") " pod="openshift-marketplace/redhat-operators-qmxvf"
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.745165    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab6c90a0-1bc1-476d-8526-d1fe438163e3-utilities\") pod \"redhat-operators-qmxvf\" (UID: \"ab6c90a0-1bc1-476d-8526-d1fe438163e3\") " pod="openshift-marketplace/redhat-operators-qmxvf"
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.745810    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab6c90a0-1bc1-476d-8526-d1fe438163e3-utilities\") pod \"redhat-operators-qmxvf\" (UID: \"ab6c90a0-1bc1-476d-8526-d1fe438163e3\") " pod="openshift-marketplace/redhat-operators-qmxvf"
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.746108    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab6c90a0-1bc1-476d-8526-d1fe438163e3-catalog-content\") pod \"redhat-operators-qmxvf\" (UID: \"ab6c90a0-1bc1-476d-8526-d1fe438163e3\") " pod="openshift-marketplace/redhat-operators-qmxvf"
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.746117    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be19fb65-a04f-42df-9b96-e620b58754bb-config-volume" (OuterVolumeSpecName: "config-volume") pod "be19fb65-a04f-42df-9b96-e620b58754bb" (UID: "be19fb65-a04f-42df-9b96-e620b58754bb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.757806    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be19fb65-a04f-42df-9b96-e620b58754bb-kube-api-access-wbnh9" (OuterVolumeSpecName: "kube-api-access-wbnh9") pod "be19fb65-a04f-42df-9b96-e620b58754bb" (UID: "be19fb65-a04f-42df-9b96-e620b58754bb"). InnerVolumeSpecName "kube-api-access-wbnh9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.759789    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be19fb65-a04f-42df-9b96-e620b58754bb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "be19fb65-a04f-42df-9b96-e620b58754bb" (UID: "be19fb65-a04f-42df-9b96-e620b58754bb"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.796242    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c86g2\" (UniqueName: \"kubernetes.io/projected/ab6c90a0-1bc1-476d-8526-d1fe438163e3-kube-api-access-c86g2\") pod \"redhat-operators-qmxvf\" (UID: \"ab6c90a0-1bc1-476d-8526-d1fe438163e3\") " pod="openshift-marketplace/redhat-operators-qmxvf"
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.811142    4730 generic.go:334] "Generic (PLEG): container finished" podID="5a347883-e4f7-4fcd-8920-59519533cf43" containerID="1e2d0f7b622d4a27e1b76b3b32f61e354d6ed5f7ddeb8e6368356819c35fc74f" exitCode=0
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.811286    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-flpw2" event={"ID":"5a347883-e4f7-4fcd-8920-59519533cf43","Type":"ContainerDied","Data":"1e2d0f7b622d4a27e1b76b3b32f61e354d6ed5f7ddeb8e6368356819c35fc74f"}
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.811333    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-flpw2" event={"ID":"5a347883-e4f7-4fcd-8920-59519533cf43","Type":"ContainerStarted","Data":"9a27ed5cf68d2bc6928d37904a276f94c614d0e58deaf1534a9994ffbccaa224"}
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.823667    4730 generic.go:334] "Generic (PLEG): container finished" podID="715cbff8-9674-4896-8deb-54a6e9a8899e" containerID="3d0d0e86eafcfd3f1d67f4c8dfdc39b982cb1033b59bc5dcda037270619199e3" exitCode=0
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.823939    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2z2hv" event={"ID":"715cbff8-9674-4896-8deb-54a6e9a8899e","Type":"ContainerDied","Data":"3d0d0e86eafcfd3f1d67f4c8dfdc39b982cb1033b59bc5dcda037270619199e3"}
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.824003    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2z2hv" event={"ID":"715cbff8-9674-4896-8deb-54a6e9a8899e","Type":"ContainerStarted","Data":"8cd9a9489d9a0dbe8438d77747b3f3bcf8afc79d1dcd6dcfc2035db6222041b2"}
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.832106    4730 generic.go:334] "Generic (PLEG): container finished" podID="ed812b57-bdba-4cd0-be71-859fe5d52eba" containerID="519bceba9c186c0dc594c407511694240cba2dedd1d4d79a064d1216c34bbca8" exitCode=0
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.832701    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ed812b57-bdba-4cd0-be71-859fe5d52eba","Type":"ContainerDied","Data":"519bceba9c186c0dc594c407511694240cba2dedd1d4d79a064d1216c34bbca8"}
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.846556    4730 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be19fb65-a04f-42df-9b96-e620b58754bb-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.847469    4730 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be19fb65-a04f-42df-9b96-e620b58754bb-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.847493    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbnh9\" (UniqueName: \"kubernetes.io/projected/be19fb65-a04f-42df-9b96-e620b58754bb-kube-api-access-wbnh9\") on node \"crc\" DevicePath \"\""
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.875261    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-d69bc" event={"ID":"be19fb65-a04f-42df-9b96-e620b58754bb","Type":"ContainerDied","Data":"c95af936fdf0b6d7d2255a8837cf5081d1d6b867d8c027bb70bec58e5bed039e"}
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.875298    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c95af936fdf0b6d7d2255a8837cf5081d1d6b867d8c027bb70bec58e5bed039e"
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.875385    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-d69bc"
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.903782    4730 generic.go:334] "Generic (PLEG): container finished" podID="23069a50-0f37-4d67-8cfd-e7a569cc6c92" containerID="1795736ee21f4075505b0da99054eb59518951bfa57d0bc7b621b63bd06d7d99" exitCode=0
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.904310    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"23069a50-0f37-4d67-8cfd-e7a569cc6c92","Type":"ContainerDied","Data":"1795736ee21f4075505b0da99054eb59518951bfa57d0bc7b621b63bd06d7d99"}
Mar 20 15:42:50 crc kubenswrapper[4730]: I0320 15:42:50.972177    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qmxvf"
Mar 20 15:42:51 crc kubenswrapper[4730]: I0320 15:42:51.036372    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8rptq"]
Mar 20 15:42:51 crc kubenswrapper[4730]: I0320 15:42:51.279844    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qmxvf"]
Mar 20 15:42:51 crc kubenswrapper[4730]: W0320 15:42:51.294364    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab6c90a0_1bc1_476d_8526_d1fe438163e3.slice/crio-3663f63ee1b2ff284c18b50ae774d9b77f54d7f929ad2803b36f2f39057f8d54 WatchSource:0}: Error finding container 3663f63ee1b2ff284c18b50ae774d9b77f54d7f929ad2803b36f2f39057f8d54: Status 404 returned error can't find the container with id 3663f63ee1b2ff284c18b50ae774d9b77f54d7f929ad2803b36f2f39057f8d54
Mar 20 15:42:51 crc kubenswrapper[4730]: I0320 15:42:51.698363    4730 patch_prober.go:28] interesting pod/router-default-5444994796-92dt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 15:42:51 crc kubenswrapper[4730]: [-]has-synced failed: reason withheld
Mar 20 15:42:51 crc kubenswrapper[4730]: [+]process-running ok
Mar 20 15:42:51 crc kubenswrapper[4730]: healthz check failed
Mar 20 15:42:51 crc kubenswrapper[4730]: I0320 15:42:51.698627    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-92dt7" podUID="18214bd2-9c3a-4737-885b-2b5c905311d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 15:42:51 crc kubenswrapper[4730]: I0320 15:42:51.914630    4730 generic.go:334] "Generic (PLEG): container finished" podID="558b00fd-2589-4842-8cba-db0cffe8c826" containerID="c2ed9a8424613f58ef80e56f4945d0a2f01a1e4ab20cce8fcafa4c0b7fc30d73" exitCode=0
Mar 20 15:42:51 crc kubenswrapper[4730]: I0320 15:42:51.914712    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8rptq" event={"ID":"558b00fd-2589-4842-8cba-db0cffe8c826","Type":"ContainerDied","Data":"c2ed9a8424613f58ef80e56f4945d0a2f01a1e4ab20cce8fcafa4c0b7fc30d73"}
Mar 20 15:42:51 crc kubenswrapper[4730]: I0320 15:42:51.914741    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8rptq" event={"ID":"558b00fd-2589-4842-8cba-db0cffe8c826","Type":"ContainerStarted","Data":"8b528dc3e6323a70e8b05e1cb0a0d95967e9a6d57d83e5d00d37458aa2621e38"}
Mar 20 15:42:51 crc kubenswrapper[4730]: I0320 15:42:51.923750    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qmxvf" event={"ID":"ab6c90a0-1bc1-476d-8526-d1fe438163e3","Type":"ContainerDied","Data":"6a84b3881231514a341a3dd596d04f1b46d9c8c10a2296046ee0ddb6c55675a3"}
Mar 20 15:42:51 crc kubenswrapper[4730]: I0320 15:42:51.923663    4730 generic.go:334] "Generic (PLEG): container finished" podID="ab6c90a0-1bc1-476d-8526-d1fe438163e3" containerID="6a84b3881231514a341a3dd596d04f1b46d9c8c10a2296046ee0ddb6c55675a3" exitCode=0
Mar 20 15:42:51 crc kubenswrapper[4730]: I0320 15:42:51.924039    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qmxvf" event={"ID":"ab6c90a0-1bc1-476d-8526-d1fe438163e3","Type":"ContainerStarted","Data":"3663f63ee1b2ff284c18b50ae774d9b77f54d7f929ad2803b36f2f39057f8d54"}
Mar 20 15:42:52 crc kubenswrapper[4730]: I0320 15:42:52.325953    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 15:42:52 crc kubenswrapper[4730]: I0320 15:42:52.361448    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 15:42:52 crc kubenswrapper[4730]: I0320 15:42:52.368008    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/23069a50-0f37-4d67-8cfd-e7a569cc6c92-kube-api-access\") pod \"23069a50-0f37-4d67-8cfd-e7a569cc6c92\" (UID: \"23069a50-0f37-4d67-8cfd-e7a569cc6c92\") "
Mar 20 15:42:52 crc kubenswrapper[4730]: I0320 15:42:52.368101    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ed812b57-bdba-4cd0-be71-859fe5d52eba-kubelet-dir\") pod \"ed812b57-bdba-4cd0-be71-859fe5d52eba\" (UID: \"ed812b57-bdba-4cd0-be71-859fe5d52eba\") "
Mar 20 15:42:52 crc kubenswrapper[4730]: I0320 15:42:52.368182    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed812b57-bdba-4cd0-be71-859fe5d52eba-kube-api-access\") pod \"ed812b57-bdba-4cd0-be71-859fe5d52eba\" (UID: \"ed812b57-bdba-4cd0-be71-859fe5d52eba\") "
Mar 20 15:42:52 crc kubenswrapper[4730]: I0320 15:42:52.368204    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ed812b57-bdba-4cd0-be71-859fe5d52eba-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ed812b57-bdba-4cd0-be71-859fe5d52eba" (UID: "ed812b57-bdba-4cd0-be71-859fe5d52eba"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 15:42:52 crc kubenswrapper[4730]: I0320 15:42:52.368216    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/23069a50-0f37-4d67-8cfd-e7a569cc6c92-kubelet-dir\") pod \"23069a50-0f37-4d67-8cfd-e7a569cc6c92\" (UID: \"23069a50-0f37-4d67-8cfd-e7a569cc6c92\") "
Mar 20 15:42:52 crc kubenswrapper[4730]: I0320 15:42:52.368264    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23069a50-0f37-4d67-8cfd-e7a569cc6c92-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "23069a50-0f37-4d67-8cfd-e7a569cc6c92" (UID: "23069a50-0f37-4d67-8cfd-e7a569cc6c92"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 15:42:52 crc kubenswrapper[4730]: I0320 15:42:52.368730    4730 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ed812b57-bdba-4cd0-be71-859fe5d52eba-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 20 15:42:52 crc kubenswrapper[4730]: I0320 15:42:52.368748    4730 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/23069a50-0f37-4d67-8cfd-e7a569cc6c92-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 20 15:42:52 crc kubenswrapper[4730]: I0320 15:42:52.379937    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23069a50-0f37-4d67-8cfd-e7a569cc6c92-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "23069a50-0f37-4d67-8cfd-e7a569cc6c92" (UID: "23069a50-0f37-4d67-8cfd-e7a569cc6c92"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:42:52 crc kubenswrapper[4730]: I0320 15:42:52.380327    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed812b57-bdba-4cd0-be71-859fe5d52eba-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ed812b57-bdba-4cd0-be71-859fe5d52eba" (UID: "ed812b57-bdba-4cd0-be71-859fe5d52eba"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:42:52 crc kubenswrapper[4730]: I0320 15:42:52.470116    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/23069a50-0f37-4d67-8cfd-e7a569cc6c92-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 15:42:52 crc kubenswrapper[4730]: I0320 15:42:52.470158    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed812b57-bdba-4cd0-be71-859fe5d52eba-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 15:42:52 crc kubenswrapper[4730]: I0320 15:42:52.698310    4730 patch_prober.go:28] interesting pod/router-default-5444994796-92dt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 15:42:52 crc kubenswrapper[4730]: [-]has-synced failed: reason withheld
Mar 20 15:42:52 crc kubenswrapper[4730]: [+]process-running ok
Mar 20 15:42:52 crc kubenswrapper[4730]: healthz check failed
Mar 20 15:42:52 crc kubenswrapper[4730]: I0320 15:42:52.698377    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-92dt7" podUID="18214bd2-9c3a-4737-885b-2b5c905311d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 15:42:52 crc kubenswrapper[4730]: I0320 15:42:52.934806    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ed812b57-bdba-4cd0-be71-859fe5d52eba","Type":"ContainerDied","Data":"6a0cd6655c2e9c825b00626c88656cb2f35f4bd32136c526b0ab430d8a3a7e0e"}
Mar 20 15:42:52 crc kubenswrapper[4730]: I0320 15:42:52.934840    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 15:42:52 crc kubenswrapper[4730]: I0320 15:42:52.934857    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a0cd6655c2e9c825b00626c88656cb2f35f4bd32136c526b0ab430d8a3a7e0e"
Mar 20 15:42:52 crc kubenswrapper[4730]: I0320 15:42:52.949429    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"23069a50-0f37-4d67-8cfd-e7a569cc6c92","Type":"ContainerDied","Data":"ada4a56e9a542a91af4dbeb6d4cfcf6cb1bc38e39d46c8e511a8939bc38e8ff7"}
Mar 20 15:42:52 crc kubenswrapper[4730]: I0320 15:42:52.949483    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ada4a56e9a542a91af4dbeb6d4cfcf6cb1bc38e39d46c8e511a8939bc38e8ff7"
Mar 20 15:42:52 crc kubenswrapper[4730]: I0320 15:42:52.949573    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 15:42:53 crc kubenswrapper[4730]: I0320 15:42:53.333280    4730 ???:1] "http: TLS handshake error from 192.168.126.11:34094: no serving certificate available for the kubelet"
Mar 20 15:42:53 crc kubenswrapper[4730]: I0320 15:42:53.699028    4730 patch_prober.go:28] interesting pod/router-default-5444994796-92dt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 15:42:53 crc kubenswrapper[4730]: [-]has-synced failed: reason withheld
Mar 20 15:42:53 crc kubenswrapper[4730]: [+]process-running ok
Mar 20 15:42:53 crc kubenswrapper[4730]: healthz check failed
Mar 20 15:42:53 crc kubenswrapper[4730]: I0320 15:42:53.699323    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-92dt7" podUID="18214bd2-9c3a-4737-885b-2b5c905311d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 15:42:54 crc kubenswrapper[4730]: I0320 15:42:54.372163    4730 ???:1] "http: TLS handshake error from 192.168.126.11:34888: no serving certificate available for the kubelet"
Mar 20 15:42:54 crc kubenswrapper[4730]: I0320 15:42:54.699352    4730 patch_prober.go:28] interesting pod/router-default-5444994796-92dt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 15:42:54 crc kubenswrapper[4730]: [-]has-synced failed: reason withheld
Mar 20 15:42:54 crc kubenswrapper[4730]: [+]process-running ok
Mar 20 15:42:54 crc kubenswrapper[4730]: healthz check failed
Mar 20 15:42:54 crc kubenswrapper[4730]: I0320 15:42:54.699405    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-92dt7" podUID="18214bd2-9c3a-4737-885b-2b5c905311d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 15:42:54 crc kubenswrapper[4730]: I0320 15:42:54.795319    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-7ckfm"
Mar 20 15:42:55 crc kubenswrapper[4730]: I0320 15:42:55.698364    4730 patch_prober.go:28] interesting pod/router-default-5444994796-92dt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 15:42:55 crc kubenswrapper[4730]: [-]has-synced failed: reason withheld
Mar 20 15:42:55 crc kubenswrapper[4730]: [+]process-running ok
Mar 20 15:42:55 crc kubenswrapper[4730]: healthz check failed
Mar 20 15:42:55 crc kubenswrapper[4730]: I0320 15:42:55.698439    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-92dt7" podUID="18214bd2-9c3a-4737-885b-2b5c905311d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 15:42:56 crc kubenswrapper[4730]: I0320 15:42:56.700059    4730 patch_prober.go:28] interesting pod/router-default-5444994796-92dt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 15:42:56 crc kubenswrapper[4730]: [-]has-synced failed: reason withheld
Mar 20 15:42:56 crc kubenswrapper[4730]: [+]process-running ok
Mar 20 15:42:56 crc kubenswrapper[4730]: healthz check failed
Mar 20 15:42:56 crc kubenswrapper[4730]: I0320 15:42:56.700121    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-92dt7" podUID="18214bd2-9c3a-4737-885b-2b5c905311d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 15:42:57 crc kubenswrapper[4730]: I0320 15:42:57.698230    4730 patch_prober.go:28] interesting pod/router-default-5444994796-92dt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 15:42:57 crc kubenswrapper[4730]: [-]has-synced failed: reason withheld
Mar 20 15:42:57 crc kubenswrapper[4730]: [+]process-running ok
Mar 20 15:42:57 crc kubenswrapper[4730]: healthz check failed
Mar 20 15:42:57 crc kubenswrapper[4730]: I0320 15:42:57.698621    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-92dt7" podUID="18214bd2-9c3a-4737-885b-2b5c905311d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 15:42:58 crc kubenswrapper[4730]: I0320 15:42:58.747772    4730 patch_prober.go:28] interesting pod/router-default-5444994796-92dt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 15:42:58 crc kubenswrapper[4730]: [-]has-synced failed: reason withheld
Mar 20 15:42:58 crc kubenswrapper[4730]: [+]process-running ok
Mar 20 15:42:58 crc kubenswrapper[4730]: healthz check failed
Mar 20 15:42:58 crc kubenswrapper[4730]: I0320 15:42:58.747837    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-92dt7" podUID="18214bd2-9c3a-4737-885b-2b5c905311d8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 15:42:58 crc kubenswrapper[4730]: I0320 15:42:58.845742    4730 patch_prober.go:28] interesting pod/console-f9d7485db-9kgl8 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body=
Mar 20 15:42:58 crc kubenswrapper[4730]: I0320 15:42:58.845799    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-9kgl8" podUID="5edbd5a9-6c8b-4ef8-950f-58deaecf36ee" containerName="console" probeResult="failure" output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused"
Mar 20 15:42:59 crc kubenswrapper[4730]: I0320 15:42:59.159722    4730 patch_prober.go:28] interesting pod/downloads-7954f5f757-g7hdt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body=
Mar 20 15:42:59 crc kubenswrapper[4730]: I0320 15:42:59.159782    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-g7hdt" podUID="d32c9cec-9f6c-4304-8bc9-d2e52128470a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused"
Mar 20 15:42:59 crc kubenswrapper[4730]: I0320 15:42:59.160117    4730 patch_prober.go:28] interesting pod/downloads-7954f5f757-g7hdt container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body=
Mar 20 15:42:59 crc kubenswrapper[4730]: I0320 15:42:59.160199    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-g7hdt" podUID="d32c9cec-9f6c-4304-8bc9-d2e52128470a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused"
Mar 20 15:42:59 crc kubenswrapper[4730]: I0320 15:42:59.698527    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-92dt7"
Mar 20 15:42:59 crc kubenswrapper[4730]: I0320 15:42:59.708203    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-92dt7"
Mar 20 15:43:03 crc kubenswrapper[4730]: I0320 15:43:03.601013    4730 ???:1] "http: TLS handshake error from 192.168.126.11:48288: no serving certificate available for the kubelet"
Mar 20 15:43:04 crc kubenswrapper[4730]: I0320 15:43:04.143321    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-59447fbd49-wdtl4"]
Mar 20 15:43:04 crc kubenswrapper[4730]: I0320 15:43:04.151066    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58875cfd6f-xthh9"]
Mar 20 15:43:04 crc kubenswrapper[4730]: I0320 15:43:04.151600    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-58875cfd6f-xthh9" podUID="be49b904-0667-4d74-ac81-e84600f0835e" containerName="route-controller-manager" containerID="cri-o://067f0501bb28ee46c4dab9d0b265af305bd5580b555aabc3f22ccc19201445e0" gracePeriod=30
Mar 20 15:43:04 crc kubenswrapper[4730]: I0320 15:43:04.151807    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-59447fbd49-wdtl4" podUID="d7422509-bd52-437b-9459-9c715c66fc33" containerName="controller-manager" containerID="cri-o://97c7c0bbae68297bff265d2fc691b5c0a2a1e2f15cfbe251cb407d9a746b7e48" gracePeriod=30
Mar 20 15:43:05 crc kubenswrapper[4730]: I0320 15:43:05.017242    4730 generic.go:334] "Generic (PLEG): container finished" podID="d7422509-bd52-437b-9459-9c715c66fc33" containerID="97c7c0bbae68297bff265d2fc691b5c0a2a1e2f15cfbe251cb407d9a746b7e48" exitCode=0
Mar 20 15:43:05 crc kubenswrapper[4730]: I0320 15:43:05.017293    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59447fbd49-wdtl4" event={"ID":"d7422509-bd52-437b-9459-9c715c66fc33","Type":"ContainerDied","Data":"97c7c0bbae68297bff265d2fc691b5c0a2a1e2f15cfbe251cb407d9a746b7e48"}
Mar 20 15:43:05 crc kubenswrapper[4730]: I0320 15:43:05.019790    4730 generic.go:334] "Generic (PLEG): container finished" podID="be49b904-0667-4d74-ac81-e84600f0835e" containerID="067f0501bb28ee46c4dab9d0b265af305bd5580b555aabc3f22ccc19201445e0" exitCode=0
Mar 20 15:43:05 crc kubenswrapper[4730]: I0320 15:43:05.019896    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58875cfd6f-xthh9" event={"ID":"be49b904-0667-4d74-ac81-e84600f0835e","Type":"ContainerDied","Data":"067f0501bb28ee46c4dab9d0b265af305bd5580b555aabc3f22ccc19201445e0"}
Mar 20 15:43:07 crc kubenswrapper[4730]: I0320 15:43:07.059139    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:43:07 crc kubenswrapper[4730]: I0320 15:43:07.313884    4730 patch_prober.go:28] interesting pod/route-controller-manager-58875cfd6f-xthh9 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: connect: connection refused" start-of-body=
Mar 20 15:43:07 crc kubenswrapper[4730]: I0320 15:43:07.313937    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-58875cfd6f-xthh9" podUID="be49b904-0667-4d74-ac81-e84600f0835e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: connect: connection refused"
Mar 20 15:43:07 crc kubenswrapper[4730]: I0320 15:43:07.314419    4730 patch_prober.go:28] interesting pod/controller-manager-59447fbd49-wdtl4 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused" start-of-body=
Mar 20 15:43:07 crc kubenswrapper[4730]: I0320 15:43:07.314471    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-59447fbd49-wdtl4" podUID="d7422509-bd52-437b-9459-9c715c66fc33" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused"
Mar 20 15:43:08 crc kubenswrapper[4730]: I0320 15:43:08.845312    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-9kgl8"
Mar 20 15:43:08 crc kubenswrapper[4730]: I0320 15:43:08.849098    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-9kgl8"
Mar 20 15:43:09 crc kubenswrapper[4730]: I0320 15:43:09.171131    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-g7hdt"
Mar 20 15:43:12 crc kubenswrapper[4730]: I0320 15:43:12.879937    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 15:43:12 crc kubenswrapper[4730]: I0320 15:43:12.880464    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 15:43:16 crc kubenswrapper[4730]: E0320 15:43:16.689072    4730 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest"
Mar 20 15:43:16 crc kubenswrapper[4730]: E0320 15:43:16.689224    4730 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 15:43:16 crc kubenswrapper[4730]:         container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve
Mar 20 15:43:16 crc kubenswrapper[4730]:         ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-46gxc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29567022-wf5nv_openshift-infra(7d87adfe-3206-4175-8d8f-5a00015cc61e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled
Mar 20 15:43:16 crc kubenswrapper[4730]:  > logger="UnhandledError"
Mar 20 15:43:16 crc kubenswrapper[4730]: E0320 15:43:16.690461    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29567022-wf5nv" podUID="7d87adfe-3206-4175-8d8f-5a00015cc61e"
Mar 20 15:43:17 crc kubenswrapper[4730]: E0320 15:43:17.091155    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29567022-wf5nv" podUID="7d87adfe-3206-4175-8d8f-5a00015cc61e"
Mar 20 15:43:18 crc kubenswrapper[4730]: I0320 15:43:18.313361    4730 patch_prober.go:28] interesting pod/controller-manager-59447fbd49-wdtl4 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.46:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 15:43:18 crc kubenswrapper[4730]: I0320 15:43:18.313733    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-59447fbd49-wdtl4" podUID="d7422509-bd52-437b-9459-9c715c66fc33" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.46:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 15:43:18 crc kubenswrapper[4730]: I0320 15:43:18.313388    4730 patch_prober.go:28] interesting pod/route-controller-manager-58875cfd6f-xthh9 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.47:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 15:43:18 crc kubenswrapper[4730]: I0320 15:43:18.313827    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-58875cfd6f-xthh9" podUID="be49b904-0667-4d74-ac81-e84600f0835e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.47:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 15:43:19 crc kubenswrapper[4730]: I0320 15:43:19.690098    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-m4xlq"
Mar 20 15:43:21 crc kubenswrapper[4730]: E0320 15:43:21.387573    4730 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage3399164236/2\": happened during read: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Mar 20 15:43:21 crc kubenswrapper[4730]: E0320 15:43:21.387837    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kgx8f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-8rptq_openshift-marketplace(558b00fd-2589-4842-8cba-db0cffe8c826): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage3399164236/2\": happened during read: context canceled" logger="UnhandledError"
Mar 20 15:43:21 crc kubenswrapper[4730]: E0320 15:43:21.389136    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \\\"/var/tmp/container_images_storage3399164236/2\\\": happened during read: context canceled\"" pod="openshift-marketplace/redhat-operators-8rptq" podUID="558b00fd-2589-4842-8cba-db0cffe8c826"
Mar 20 15:43:21 crc kubenswrapper[4730]: E0320 15:43:21.399331    4730 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage824141395/2\": happened during read: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Mar 20 15:43:21 crc kubenswrapper[4730]: E0320 15:43:21.399525    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c86g2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-qmxvf_openshift-marketplace(ab6c90a0-1bc1-476d-8526-d1fe438163e3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage824141395/2\": happened during read: context canceled" logger="UnhandledError"
Mar 20 15:43:21 crc kubenswrapper[4730]: E0320 15:43:21.400759    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \\\"/var/tmp/container_images_storage824141395/2\\\": happened during read: context canceled\"" pod="openshift-marketplace/redhat-operators-qmxvf" podUID="ab6c90a0-1bc1-476d-8526-d1fe438163e3"
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.426301    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58875cfd6f-xthh9"
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.431432    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-59447fbd49-wdtl4"
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.456365    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-677b48c9fc-4n4h2"]
Mar 20 15:43:21 crc kubenswrapper[4730]: E0320 15:43:21.456571    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23069a50-0f37-4d67-8cfd-e7a569cc6c92" containerName="pruner"
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.456582    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="23069a50-0f37-4d67-8cfd-e7a569cc6c92" containerName="pruner"
Mar 20 15:43:21 crc kubenswrapper[4730]: E0320 15:43:21.456660    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be49b904-0667-4d74-ac81-e84600f0835e" containerName="route-controller-manager"
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.456668    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="be49b904-0667-4d74-ac81-e84600f0835e" containerName="route-controller-manager"
Mar 20 15:43:21 crc kubenswrapper[4730]: E0320 15:43:21.456699    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed812b57-bdba-4cd0-be71-859fe5d52eba" containerName="pruner"
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.456705    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed812b57-bdba-4cd0-be71-859fe5d52eba" containerName="pruner"
Mar 20 15:43:21 crc kubenswrapper[4730]: E0320 15:43:21.456714    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7422509-bd52-437b-9459-9c715c66fc33" containerName="controller-manager"
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.456719    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7422509-bd52-437b-9459-9c715c66fc33" containerName="controller-manager"
Mar 20 15:43:21 crc kubenswrapper[4730]: E0320 15:43:21.456726    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be19fb65-a04f-42df-9b96-e620b58754bb" containerName="collect-profiles"
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.456731    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="be19fb65-a04f-42df-9b96-e620b58754bb" containerName="collect-profiles"
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.456868    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="23069a50-0f37-4d67-8cfd-e7a569cc6c92" containerName="pruner"
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.456881    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="be49b904-0667-4d74-ac81-e84600f0835e" containerName="route-controller-manager"
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.456890    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed812b57-bdba-4cd0-be71-859fe5d52eba" containerName="pruner"
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.456896    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="be19fb65-a04f-42df-9b96-e620b58754bb" containerName="collect-profiles"
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.456903    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7422509-bd52-437b-9459-9c715c66fc33" containerName="controller-manager"
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.457418    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-677b48c9fc-4n4h2"
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.467833    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-677b48c9fc-4n4h2"]
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.517035    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7422509-bd52-437b-9459-9c715c66fc33-config\") pod \"d7422509-bd52-437b-9459-9c715c66fc33\" (UID: \"d7422509-bd52-437b-9459-9c715c66fc33\") "
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.517109    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be49b904-0667-4d74-ac81-e84600f0835e-config\") pod \"be49b904-0667-4d74-ac81-e84600f0835e\" (UID: \"be49b904-0667-4d74-ac81-e84600f0835e\") "
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.517207    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7422509-bd52-437b-9459-9c715c66fc33-proxy-ca-bundles\") pod \"d7422509-bd52-437b-9459-9c715c66fc33\" (UID: \"d7422509-bd52-437b-9459-9c715c66fc33\") "
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.517233    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/be49b904-0667-4d74-ac81-e84600f0835e-client-ca\") pod \"be49b904-0667-4d74-ac81-e84600f0835e\" (UID: \"be49b904-0667-4d74-ac81-e84600f0835e\") "
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.517308    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7422509-bd52-437b-9459-9c715c66fc33-serving-cert\") pod \"d7422509-bd52-437b-9459-9c715c66fc33\" (UID: \"d7422509-bd52-437b-9459-9c715c66fc33\") "
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.517337    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be49b904-0667-4d74-ac81-e84600f0835e-serving-cert\") pod \"be49b904-0667-4d74-ac81-e84600f0835e\" (UID: \"be49b904-0667-4d74-ac81-e84600f0835e\") "
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.517388    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgkk5\" (UniqueName: \"kubernetes.io/projected/be49b904-0667-4d74-ac81-e84600f0835e-kube-api-access-dgkk5\") pod \"be49b904-0667-4d74-ac81-e84600f0835e\" (UID: \"be49b904-0667-4d74-ac81-e84600f0835e\") "
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.517450    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2755g\" (UniqueName: \"kubernetes.io/projected/d7422509-bd52-437b-9459-9c715c66fc33-kube-api-access-2755g\") pod \"d7422509-bd52-437b-9459-9c715c66fc33\" (UID: \"d7422509-bd52-437b-9459-9c715c66fc33\") "
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.518151    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be49b904-0667-4d74-ac81-e84600f0835e-client-ca" (OuterVolumeSpecName: "client-ca") pod "be49b904-0667-4d74-ac81-e84600f0835e" (UID: "be49b904-0667-4d74-ac81-e84600f0835e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.518193    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be49b904-0667-4d74-ac81-e84600f0835e-config" (OuterVolumeSpecName: "config") pod "be49b904-0667-4d74-ac81-e84600f0835e" (UID: "be49b904-0667-4d74-ac81-e84600f0835e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.518612    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7422509-bd52-437b-9459-9c715c66fc33-client-ca\") pod \"d7422509-bd52-437b-9459-9c715c66fc33\" (UID: \"d7422509-bd52-437b-9459-9c715c66fc33\") "
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.519127    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/619056a7-dcfd-4038-a060-219937115302-serving-cert\") pod \"route-controller-manager-677b48c9fc-4n4h2\" (UID: \"619056a7-dcfd-4038-a060-219937115302\") " pod="openshift-route-controller-manager/route-controller-manager-677b48c9fc-4n4h2"
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.519315    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7422509-bd52-437b-9459-9c715c66fc33-client-ca" (OuterVolumeSpecName: "client-ca") pod "d7422509-bd52-437b-9459-9c715c66fc33" (UID: "d7422509-bd52-437b-9459-9c715c66fc33"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.519358    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7422509-bd52-437b-9459-9c715c66fc33-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d7422509-bd52-437b-9459-9c715c66fc33" (UID: "d7422509-bd52-437b-9459-9c715c66fc33"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.519506    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7422509-bd52-437b-9459-9c715c66fc33-config" (OuterVolumeSpecName: "config") pod "d7422509-bd52-437b-9459-9c715c66fc33" (UID: "d7422509-bd52-437b-9459-9c715c66fc33"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.519562    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/619056a7-dcfd-4038-a060-219937115302-config\") pod \"route-controller-manager-677b48c9fc-4n4h2\" (UID: \"619056a7-dcfd-4038-a060-219937115302\") " pod="openshift-route-controller-manager/route-controller-manager-677b48c9fc-4n4h2"
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.519677    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/619056a7-dcfd-4038-a060-219937115302-client-ca\") pod \"route-controller-manager-677b48c9fc-4n4h2\" (UID: \"619056a7-dcfd-4038-a060-219937115302\") " pod="openshift-route-controller-manager/route-controller-manager-677b48c9fc-4n4h2"
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.519718    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckj9c\" (UniqueName: \"kubernetes.io/projected/619056a7-dcfd-4038-a060-219937115302-kube-api-access-ckj9c\") pod \"route-controller-manager-677b48c9fc-4n4h2\" (UID: \"619056a7-dcfd-4038-a060-219937115302\") " pod="openshift-route-controller-manager/route-controller-manager-677b48c9fc-4n4h2"
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.519859    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be49b904-0667-4d74-ac81-e84600f0835e-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.519876    4730 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7422509-bd52-437b-9459-9c715c66fc33-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.520177    4730 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/be49b904-0667-4d74-ac81-e84600f0835e-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.520188    4730 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7422509-bd52-437b-9459-9c715c66fc33-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.520199    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7422509-bd52-437b-9459-9c715c66fc33-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.523665    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be49b904-0667-4d74-ac81-e84600f0835e-kube-api-access-dgkk5" (OuterVolumeSpecName: "kube-api-access-dgkk5") pod "be49b904-0667-4d74-ac81-e84600f0835e" (UID: "be49b904-0667-4d74-ac81-e84600f0835e"). InnerVolumeSpecName "kube-api-access-dgkk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.523705    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7422509-bd52-437b-9459-9c715c66fc33-kube-api-access-2755g" (OuterVolumeSpecName: "kube-api-access-2755g") pod "d7422509-bd52-437b-9459-9c715c66fc33" (UID: "d7422509-bd52-437b-9459-9c715c66fc33"). InnerVolumeSpecName "kube-api-access-2755g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.525657    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7422509-bd52-437b-9459-9c715c66fc33-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d7422509-bd52-437b-9459-9c715c66fc33" (UID: "d7422509-bd52-437b-9459-9c715c66fc33"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.526390    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be49b904-0667-4d74-ac81-e84600f0835e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "be49b904-0667-4d74-ac81-e84600f0835e" (UID: "be49b904-0667-4d74-ac81-e84600f0835e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.621815    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/619056a7-dcfd-4038-a060-219937115302-config\") pod \"route-controller-manager-677b48c9fc-4n4h2\" (UID: \"619056a7-dcfd-4038-a060-219937115302\") " pod="openshift-route-controller-manager/route-controller-manager-677b48c9fc-4n4h2"
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.621877    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/619056a7-dcfd-4038-a060-219937115302-client-ca\") pod \"route-controller-manager-677b48c9fc-4n4h2\" (UID: \"619056a7-dcfd-4038-a060-219937115302\") " pod="openshift-route-controller-manager/route-controller-manager-677b48c9fc-4n4h2"
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.621897    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckj9c\" (UniqueName: \"kubernetes.io/projected/619056a7-dcfd-4038-a060-219937115302-kube-api-access-ckj9c\") pod \"route-controller-manager-677b48c9fc-4n4h2\" (UID: \"619056a7-dcfd-4038-a060-219937115302\") " pod="openshift-route-controller-manager/route-controller-manager-677b48c9fc-4n4h2"
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.621951    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/619056a7-dcfd-4038-a060-219937115302-serving-cert\") pod \"route-controller-manager-677b48c9fc-4n4h2\" (UID: \"619056a7-dcfd-4038-a060-219937115302\") " pod="openshift-route-controller-manager/route-controller-manager-677b48c9fc-4n4h2"
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.622023    4730 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7422509-bd52-437b-9459-9c715c66fc33-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.622033    4730 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be49b904-0667-4d74-ac81-e84600f0835e-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.622043    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgkk5\" (UniqueName: \"kubernetes.io/projected/be49b904-0667-4d74-ac81-e84600f0835e-kube-api-access-dgkk5\") on node \"crc\" DevicePath \"\""
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.622055    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2755g\" (UniqueName: \"kubernetes.io/projected/d7422509-bd52-437b-9459-9c715c66fc33-kube-api-access-2755g\") on node \"crc\" DevicePath \"\""
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.623088    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/619056a7-dcfd-4038-a060-219937115302-config\") pod \"route-controller-manager-677b48c9fc-4n4h2\" (UID: \"619056a7-dcfd-4038-a060-219937115302\") " pod="openshift-route-controller-manager/route-controller-manager-677b48c9fc-4n4h2"
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.623090    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/619056a7-dcfd-4038-a060-219937115302-client-ca\") pod \"route-controller-manager-677b48c9fc-4n4h2\" (UID: \"619056a7-dcfd-4038-a060-219937115302\") " pod="openshift-route-controller-manager/route-controller-manager-677b48c9fc-4n4h2"
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.627046    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/619056a7-dcfd-4038-a060-219937115302-serving-cert\") pod \"route-controller-manager-677b48c9fc-4n4h2\" (UID: \"619056a7-dcfd-4038-a060-219937115302\") " pod="openshift-route-controller-manager/route-controller-manager-677b48c9fc-4n4h2"
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.637503    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckj9c\" (UniqueName: \"kubernetes.io/projected/619056a7-dcfd-4038-a060-219937115302-kube-api-access-ckj9c\") pod \"route-controller-manager-677b48c9fc-4n4h2\" (UID: \"619056a7-dcfd-4038-a060-219937115302\") " pod="openshift-route-controller-manager/route-controller-manager-677b48c9fc-4n4h2"
Mar 20 15:43:21 crc kubenswrapper[4730]: I0320 15:43:21.785660    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-677b48c9fc-4n4h2"
Mar 20 15:43:22 crc kubenswrapper[4730]: I0320 15:43:22.114222    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58875cfd6f-xthh9"
Mar 20 15:43:22 crc kubenswrapper[4730]: I0320 15:43:22.114211    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58875cfd6f-xthh9" event={"ID":"be49b904-0667-4d74-ac81-e84600f0835e","Type":"ContainerDied","Data":"e460ec37f646594e2bfbe40ea47d0b74f2dd0eede1593eeb04f2a323e4f35cf9"}
Mar 20 15:43:22 crc kubenswrapper[4730]: I0320 15:43:22.114369    4730 scope.go:117] "RemoveContainer" containerID="067f0501bb28ee46c4dab9d0b265af305bd5580b555aabc3f22ccc19201445e0"
Mar 20 15:43:22 crc kubenswrapper[4730]: I0320 15:43:22.117734    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59447fbd49-wdtl4" event={"ID":"d7422509-bd52-437b-9459-9c715c66fc33","Type":"ContainerDied","Data":"c34933180d93ae0c413fe4e3ba797162b90bde253f6503dcbf0ea4405517bfca"}
Mar 20 15:43:22 crc kubenswrapper[4730]: I0320 15:43:22.118137    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-59447fbd49-wdtl4"
Mar 20 15:43:22 crc kubenswrapper[4730]: I0320 15:43:22.151954    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-59447fbd49-wdtl4"]
Mar 20 15:43:22 crc kubenswrapper[4730]: I0320 15:43:22.154969    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-59447fbd49-wdtl4"]
Mar 20 15:43:22 crc kubenswrapper[4730]: I0320 15:43:22.182443    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58875cfd6f-xthh9"]
Mar 20 15:43:22 crc kubenswrapper[4730]: I0320 15:43:22.184911    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58875cfd6f-xthh9"]
Mar 20 15:43:22 crc kubenswrapper[4730]: I0320 15:43:22.386405    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 20 15:43:22 crc kubenswrapper[4730]: I0320 15:43:22.387204    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 20 15:43:22 crc kubenswrapper[4730]: I0320 15:43:22.390598    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Mar 20 15:43:22 crc kubenswrapper[4730]: I0320 15:43:22.390979    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 20 15:43:22 crc kubenswrapper[4730]: I0320 15:43:22.393979    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 20 15:43:22 crc kubenswrapper[4730]: I0320 15:43:22.432744    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78bff99a-9296-41fe-ac5d-b41a183e2349-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"78bff99a-9296-41fe-ac5d-b41a183e2349\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 20 15:43:22 crc kubenswrapper[4730]: I0320 15:43:22.432798    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78bff99a-9296-41fe-ac5d-b41a183e2349-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"78bff99a-9296-41fe-ac5d-b41a183e2349\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 20 15:43:22 crc kubenswrapper[4730]: I0320 15:43:22.533888    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78bff99a-9296-41fe-ac5d-b41a183e2349-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"78bff99a-9296-41fe-ac5d-b41a183e2349\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 20 15:43:22 crc kubenswrapper[4730]: I0320 15:43:22.533934    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78bff99a-9296-41fe-ac5d-b41a183e2349-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"78bff99a-9296-41fe-ac5d-b41a183e2349\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 20 15:43:22 crc kubenswrapper[4730]: I0320 15:43:22.534020    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78bff99a-9296-41fe-ac5d-b41a183e2349-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"78bff99a-9296-41fe-ac5d-b41a183e2349\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 20 15:43:22 crc kubenswrapper[4730]: I0320 15:43:22.549603    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78bff99a-9296-41fe-ac5d-b41a183e2349-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"78bff99a-9296-41fe-ac5d-b41a183e2349\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 20 15:43:22 crc kubenswrapper[4730]: I0320 15:43:22.718223    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 20 15:43:23 crc kubenswrapper[4730]: I0320 15:43:23.542795    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be49b904-0667-4d74-ac81-e84600f0835e" path="/var/lib/kubelet/pods/be49b904-0667-4d74-ac81-e84600f0835e/volumes"
Mar 20 15:43:23 crc kubenswrapper[4730]: I0320 15:43:23.544143    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7422509-bd52-437b-9459-9c715c66fc33" path="/var/lib/kubelet/pods/d7422509-bd52-437b-9459-9c715c66fc33/volumes"
Mar 20 15:43:23 crc kubenswrapper[4730]: I0320 15:43:23.940887    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-77874967bc-nkmqc"]
Mar 20 15:43:23 crc kubenswrapper[4730]: I0320 15:43:23.941999    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77874967bc-nkmqc"
Mar 20 15:43:23 crc kubenswrapper[4730]: I0320 15:43:23.944107    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 15:43:23 crc kubenswrapper[4730]: I0320 15:43:23.944146    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 15:43:23 crc kubenswrapper[4730]: I0320 15:43:23.946925    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 15:43:23 crc kubenswrapper[4730]: I0320 15:43:23.947457    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 15:43:23 crc kubenswrapper[4730]: I0320 15:43:23.947479    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 15:43:23 crc kubenswrapper[4730]: I0320 15:43:23.948530    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 20 15:43:23 crc kubenswrapper[4730]: I0320 15:43:23.958788    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77874967bc-nkmqc"]
Mar 20 15:43:23 crc kubenswrapper[4730]: I0320 15:43:23.965111    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 20 15:43:23 crc kubenswrapper[4730]: I0320 15:43:23.985236    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e894fac3-fa5e-4281-9765-30dea46c6b32-config\") pod \"controller-manager-77874967bc-nkmqc\" (UID: \"e894fac3-fa5e-4281-9765-30dea46c6b32\") " pod="openshift-controller-manager/controller-manager-77874967bc-nkmqc"
Mar 20 15:43:23 crc kubenswrapper[4730]: I0320 15:43:23.985325    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e894fac3-fa5e-4281-9765-30dea46c6b32-serving-cert\") pod \"controller-manager-77874967bc-nkmqc\" (UID: \"e894fac3-fa5e-4281-9765-30dea46c6b32\") " pod="openshift-controller-manager/controller-manager-77874967bc-nkmqc"
Mar 20 15:43:23 crc kubenswrapper[4730]: I0320 15:43:23.985371    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e894fac3-fa5e-4281-9765-30dea46c6b32-client-ca\") pod \"controller-manager-77874967bc-nkmqc\" (UID: \"e894fac3-fa5e-4281-9765-30dea46c6b32\") " pod="openshift-controller-manager/controller-manager-77874967bc-nkmqc"
Mar 20 15:43:23 crc kubenswrapper[4730]: I0320 15:43:23.985393    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdv6d\" (UniqueName: \"kubernetes.io/projected/e894fac3-fa5e-4281-9765-30dea46c6b32-kube-api-access-jdv6d\") pod \"controller-manager-77874967bc-nkmqc\" (UID: \"e894fac3-fa5e-4281-9765-30dea46c6b32\") " pod="openshift-controller-manager/controller-manager-77874967bc-nkmqc"
Mar 20 15:43:23 crc kubenswrapper[4730]: I0320 15:43:23.985417    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e894fac3-fa5e-4281-9765-30dea46c6b32-proxy-ca-bundles\") pod \"controller-manager-77874967bc-nkmqc\" (UID: \"e894fac3-fa5e-4281-9765-30dea46c6b32\") " pod="openshift-controller-manager/controller-manager-77874967bc-nkmqc"
Mar 20 15:43:24 crc kubenswrapper[4730]: I0320 15:43:24.087285    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e894fac3-fa5e-4281-9765-30dea46c6b32-config\") pod \"controller-manager-77874967bc-nkmqc\" (UID: \"e894fac3-fa5e-4281-9765-30dea46c6b32\") " pod="openshift-controller-manager/controller-manager-77874967bc-nkmqc"
Mar 20 15:43:24 crc kubenswrapper[4730]: I0320 15:43:24.087389    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e894fac3-fa5e-4281-9765-30dea46c6b32-serving-cert\") pod \"controller-manager-77874967bc-nkmqc\" (UID: \"e894fac3-fa5e-4281-9765-30dea46c6b32\") " pod="openshift-controller-manager/controller-manager-77874967bc-nkmqc"
Mar 20 15:43:24 crc kubenswrapper[4730]: I0320 15:43:24.087477    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e894fac3-fa5e-4281-9765-30dea46c6b32-client-ca\") pod \"controller-manager-77874967bc-nkmqc\" (UID: \"e894fac3-fa5e-4281-9765-30dea46c6b32\") " pod="openshift-controller-manager/controller-manager-77874967bc-nkmqc"
Mar 20 15:43:24 crc kubenswrapper[4730]: I0320 15:43:24.087501    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdv6d\" (UniqueName: \"kubernetes.io/projected/e894fac3-fa5e-4281-9765-30dea46c6b32-kube-api-access-jdv6d\") pod \"controller-manager-77874967bc-nkmqc\" (UID: \"e894fac3-fa5e-4281-9765-30dea46c6b32\") " pod="openshift-controller-manager/controller-manager-77874967bc-nkmqc"
Mar 20 15:43:24 crc kubenswrapper[4730]: I0320 15:43:24.087535    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e894fac3-fa5e-4281-9765-30dea46c6b32-proxy-ca-bundles\") pod \"controller-manager-77874967bc-nkmqc\" (UID: \"e894fac3-fa5e-4281-9765-30dea46c6b32\") " pod="openshift-controller-manager/controller-manager-77874967bc-nkmqc"
Mar 20 15:43:24 crc kubenswrapper[4730]: I0320 15:43:24.127923    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77874967bc-nkmqc"]
Mar 20 15:43:24 crc kubenswrapper[4730]: E0320 15:43:24.129020    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-jdv6d proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-77874967bc-nkmqc" podUID="e894fac3-fa5e-4281-9765-30dea46c6b32"
Mar 20 15:43:24 crc kubenswrapper[4730]: I0320 15:43:24.167170    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e894fac3-fa5e-4281-9765-30dea46c6b32-client-ca\") pod \"controller-manager-77874967bc-nkmqc\" (UID: \"e894fac3-fa5e-4281-9765-30dea46c6b32\") " pod="openshift-controller-manager/controller-manager-77874967bc-nkmqc"
Mar 20 15:43:24 crc kubenswrapper[4730]: I0320 15:43:24.167233    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e894fac3-fa5e-4281-9765-30dea46c6b32-config\") pod \"controller-manager-77874967bc-nkmqc\" (UID: \"e894fac3-fa5e-4281-9765-30dea46c6b32\") " pod="openshift-controller-manager/controller-manager-77874967bc-nkmqc"
Mar 20 15:43:24 crc kubenswrapper[4730]: I0320 15:43:24.168290    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e894fac3-fa5e-4281-9765-30dea46c6b32-proxy-ca-bundles\") pod \"controller-manager-77874967bc-nkmqc\" (UID: \"e894fac3-fa5e-4281-9765-30dea46c6b32\") " pod="openshift-controller-manager/controller-manager-77874967bc-nkmqc"
Mar 20 15:43:24 crc kubenswrapper[4730]: I0320 15:43:24.172909    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdv6d\" (UniqueName: \"kubernetes.io/projected/e894fac3-fa5e-4281-9765-30dea46c6b32-kube-api-access-jdv6d\") pod \"controller-manager-77874967bc-nkmqc\" (UID: \"e894fac3-fa5e-4281-9765-30dea46c6b32\") " pod="openshift-controller-manager/controller-manager-77874967bc-nkmqc"
Mar 20 15:43:24 crc kubenswrapper[4730]: I0320 15:43:24.184075    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e894fac3-fa5e-4281-9765-30dea46c6b32-serving-cert\") pod \"controller-manager-77874967bc-nkmqc\" (UID: \"e894fac3-fa5e-4281-9765-30dea46c6b32\") " pod="openshift-controller-manager/controller-manager-77874967bc-nkmqc"
Mar 20 15:43:24 crc kubenswrapper[4730]: E0320 15:43:24.221813    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-qmxvf" podUID="ab6c90a0-1bc1-476d-8526-d1fe438163e3"
Mar 20 15:43:24 crc kubenswrapper[4730]: E0320 15:43:24.222151    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-8rptq" podUID="558b00fd-2589-4842-8cba-db0cffe8c826"
Mar 20 15:43:24 crc kubenswrapper[4730]: I0320 15:43:24.236574    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-677b48c9fc-4n4h2"]
Mar 20 15:43:24 crc kubenswrapper[4730]: E0320 15:43:24.312467    4730 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Mar 20 15:43:24 crc kubenswrapper[4730]: E0320 15:43:24.312700    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v5sxl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-cx74p_openshift-marketplace(7a118148-49cc-4b61-bb43-44e3ef2c3048): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 20 15:43:24 crc kubenswrapper[4730]: E0320 15:43:24.314342    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-cx74p" podUID="7a118148-49cc-4b61-bb43-44e3ef2c3048"
Mar 20 15:43:25 crc kubenswrapper[4730]: I0320 15:43:25.139012    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77874967bc-nkmqc"
Mar 20 15:43:25 crc kubenswrapper[4730]: I0320 15:43:25.150858    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77874967bc-nkmqc"
Mar 20 15:43:25 crc kubenswrapper[4730]: I0320 15:43:25.202040    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e894fac3-fa5e-4281-9765-30dea46c6b32-proxy-ca-bundles\") pod \"e894fac3-fa5e-4281-9765-30dea46c6b32\" (UID: \"e894fac3-fa5e-4281-9765-30dea46c6b32\") "
Mar 20 15:43:25 crc kubenswrapper[4730]: I0320 15:43:25.202100    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdv6d\" (UniqueName: \"kubernetes.io/projected/e894fac3-fa5e-4281-9765-30dea46c6b32-kube-api-access-jdv6d\") pod \"e894fac3-fa5e-4281-9765-30dea46c6b32\" (UID: \"e894fac3-fa5e-4281-9765-30dea46c6b32\") "
Mar 20 15:43:25 crc kubenswrapper[4730]: I0320 15:43:25.202167    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e894fac3-fa5e-4281-9765-30dea46c6b32-client-ca\") pod \"e894fac3-fa5e-4281-9765-30dea46c6b32\" (UID: \"e894fac3-fa5e-4281-9765-30dea46c6b32\") "
Mar 20 15:43:25 crc kubenswrapper[4730]: I0320 15:43:25.202213    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e894fac3-fa5e-4281-9765-30dea46c6b32-config\") pod \"e894fac3-fa5e-4281-9765-30dea46c6b32\" (UID: \"e894fac3-fa5e-4281-9765-30dea46c6b32\") "
Mar 20 15:43:25 crc kubenswrapper[4730]: I0320 15:43:25.202264    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e894fac3-fa5e-4281-9765-30dea46c6b32-serving-cert\") pod \"e894fac3-fa5e-4281-9765-30dea46c6b32\" (UID: \"e894fac3-fa5e-4281-9765-30dea46c6b32\") "
Mar 20 15:43:25 crc kubenswrapper[4730]: I0320 15:43:25.202696    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e894fac3-fa5e-4281-9765-30dea46c6b32-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e894fac3-fa5e-4281-9765-30dea46c6b32" (UID: "e894fac3-fa5e-4281-9765-30dea46c6b32"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:43:25 crc kubenswrapper[4730]: I0320 15:43:25.202885    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e894fac3-fa5e-4281-9765-30dea46c6b32-config" (OuterVolumeSpecName: "config") pod "e894fac3-fa5e-4281-9765-30dea46c6b32" (UID: "e894fac3-fa5e-4281-9765-30dea46c6b32"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:43:25 crc kubenswrapper[4730]: I0320 15:43:25.203322    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e894fac3-fa5e-4281-9765-30dea46c6b32-client-ca" (OuterVolumeSpecName: "client-ca") pod "e894fac3-fa5e-4281-9765-30dea46c6b32" (UID: "e894fac3-fa5e-4281-9765-30dea46c6b32"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:43:25 crc kubenswrapper[4730]: I0320 15:43:25.214939    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e894fac3-fa5e-4281-9765-30dea46c6b32-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e894fac3-fa5e-4281-9765-30dea46c6b32" (UID: "e894fac3-fa5e-4281-9765-30dea46c6b32"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:43:25 crc kubenswrapper[4730]: I0320 15:43:25.230271    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e894fac3-fa5e-4281-9765-30dea46c6b32-kube-api-access-jdv6d" (OuterVolumeSpecName: "kube-api-access-jdv6d") pod "e894fac3-fa5e-4281-9765-30dea46c6b32" (UID: "e894fac3-fa5e-4281-9765-30dea46c6b32"). InnerVolumeSpecName "kube-api-access-jdv6d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:43:25 crc kubenswrapper[4730]: I0320 15:43:25.303705    4730 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e894fac3-fa5e-4281-9765-30dea46c6b32-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:43:25 crc kubenswrapper[4730]: I0320 15:43:25.303738    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e894fac3-fa5e-4281-9765-30dea46c6b32-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:43:25 crc kubenswrapper[4730]: I0320 15:43:25.303746    4730 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e894fac3-fa5e-4281-9765-30dea46c6b32-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:43:25 crc kubenswrapper[4730]: I0320 15:43:25.303754    4730 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e894fac3-fa5e-4281-9765-30dea46c6b32-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 15:43:25 crc kubenswrapper[4730]: I0320 15:43:25.303764    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdv6d\" (UniqueName: \"kubernetes.io/projected/e894fac3-fa5e-4281-9765-30dea46c6b32-kube-api-access-jdv6d\") on node \"crc\" DevicePath \"\""
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.145293    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77874967bc-nkmqc"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.206936    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6f56868448-2fbxh"]
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.208300    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.210582    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.210723    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.214135    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.215088    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77874967bc-nkmqc"]
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.215636    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.215722    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.217695    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.218994    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-77874967bc-nkmqc"]
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.222293    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.225925    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f56868448-2fbxh"]
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.318824    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d747680-5dde-4793-863a-252a5f67233a-client-ca\") pod \"controller-manager-6f56868448-2fbxh\" (UID: \"9d747680-5dde-4793-863a-252a5f67233a\") " pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.318883    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68xrx\" (UniqueName: \"kubernetes.io/projected/9d747680-5dde-4793-863a-252a5f67233a-kube-api-access-68xrx\") pod \"controller-manager-6f56868448-2fbxh\" (UID: \"9d747680-5dde-4793-863a-252a5f67233a\") " pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.318934    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9d747680-5dde-4793-863a-252a5f67233a-proxy-ca-bundles\") pod \"controller-manager-6f56868448-2fbxh\" (UID: \"9d747680-5dde-4793-863a-252a5f67233a\") " pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.318952    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d747680-5dde-4793-863a-252a5f67233a-serving-cert\") pod \"controller-manager-6f56868448-2fbxh\" (UID: \"9d747680-5dde-4793-863a-252a5f67233a\") " pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.318976    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d747680-5dde-4793-863a-252a5f67233a-config\") pod \"controller-manager-6f56868448-2fbxh\" (UID: \"9d747680-5dde-4793-863a-252a5f67233a\") " pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh"
Mar 20 15:43:26 crc kubenswrapper[4730]: E0320 15:43:26.378497    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-cx74p" podUID="7a118148-49cc-4b61-bb43-44e3ef2c3048"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.421394    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68xrx\" (UniqueName: \"kubernetes.io/projected/9d747680-5dde-4793-863a-252a5f67233a-kube-api-access-68xrx\") pod \"controller-manager-6f56868448-2fbxh\" (UID: \"9d747680-5dde-4793-863a-252a5f67233a\") " pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.421508    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9d747680-5dde-4793-863a-252a5f67233a-proxy-ca-bundles\") pod \"controller-manager-6f56868448-2fbxh\" (UID: \"9d747680-5dde-4793-863a-252a5f67233a\") " pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.421539    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d747680-5dde-4793-863a-252a5f67233a-serving-cert\") pod \"controller-manager-6f56868448-2fbxh\" (UID: \"9d747680-5dde-4793-863a-252a5f67233a\") " pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.421575    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d747680-5dde-4793-863a-252a5f67233a-config\") pod \"controller-manager-6f56868448-2fbxh\" (UID: \"9d747680-5dde-4793-863a-252a5f67233a\") " pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.421673    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d747680-5dde-4793-863a-252a5f67233a-client-ca\") pod \"controller-manager-6f56868448-2fbxh\" (UID: \"9d747680-5dde-4793-863a-252a5f67233a\") " pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.423383    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d747680-5dde-4793-863a-252a5f67233a-client-ca\") pod \"controller-manager-6f56868448-2fbxh\" (UID: \"9d747680-5dde-4793-863a-252a5f67233a\") " pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.425007    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d747680-5dde-4793-863a-252a5f67233a-config\") pod \"controller-manager-6f56868448-2fbxh\" (UID: \"9d747680-5dde-4793-863a-252a5f67233a\") " pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.426434    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9d747680-5dde-4793-863a-252a5f67233a-proxy-ca-bundles\") pod \"controller-manager-6f56868448-2fbxh\" (UID: \"9d747680-5dde-4793-863a-252a5f67233a\") " pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.439049    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d747680-5dde-4793-863a-252a5f67233a-serving-cert\") pod \"controller-manager-6f56868448-2fbxh\" (UID: \"9d747680-5dde-4793-863a-252a5f67233a\") " pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.443987    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68xrx\" (UniqueName: \"kubernetes.io/projected/9d747680-5dde-4793-863a-252a5f67233a-kube-api-access-68xrx\") pod \"controller-manager-6f56868448-2fbxh\" (UID: \"9d747680-5dde-4793-863a-252a5f67233a\") " pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh"
Mar 20 15:43:26 crc kubenswrapper[4730]: E0320 15:43:26.469262    4730 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Mar 20 15:43:26 crc kubenswrapper[4730]: E0320 15:43:26.469527    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vwqvn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-6mppz_openshift-marketplace(168c4cbd-3a44-48a5-be95-0eb4ea01d6c8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 20 15:43:26 crc kubenswrapper[4730]: E0320 15:43:26.470757    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-6mppz" podUID="168c4cbd-3a44-48a5-be95-0eb4ea01d6c8"
Mar 20 15:43:26 crc kubenswrapper[4730]: E0320 15:43:26.474488    4730 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Mar 20 15:43:26 crc kubenswrapper[4730]: E0320 15:43:26.474696    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rd6js,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-mbtfk_openshift-marketplace(d5addb8e-1dbc-41a2-8330-8a97251bd52f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 20 15:43:26 crc kubenswrapper[4730]: E0320 15:43:26.475918    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-mbtfk" podUID="d5addb8e-1dbc-41a2-8330-8a97251bd52f"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.542183    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.578956    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.579643    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.591670    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.624674    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e48519c7-0cdc-419b-bd72-2bab0e911af8-var-lock\") pod \"installer-9-crc\" (UID: \"e48519c7-0cdc-419b-bd72-2bab0e911af8\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.624768    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e48519c7-0cdc-419b-bd72-2bab0e911af8-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e48519c7-0cdc-419b-bd72-2bab0e911af8\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.624944    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e48519c7-0cdc-419b-bd72-2bab0e911af8-kube-api-access\") pod \"installer-9-crc\" (UID: \"e48519c7-0cdc-419b-bd72-2bab0e911af8\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.726309    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e48519c7-0cdc-419b-bd72-2bab0e911af8-var-lock\") pod \"installer-9-crc\" (UID: \"e48519c7-0cdc-419b-bd72-2bab0e911af8\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.726397    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e48519c7-0cdc-419b-bd72-2bab0e911af8-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e48519c7-0cdc-419b-bd72-2bab0e911af8\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.726424    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e48519c7-0cdc-419b-bd72-2bab0e911af8-kube-api-access\") pod \"installer-9-crc\" (UID: \"e48519c7-0cdc-419b-bd72-2bab0e911af8\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.726889    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e48519c7-0cdc-419b-bd72-2bab0e911af8-var-lock\") pod \"installer-9-crc\" (UID: \"e48519c7-0cdc-419b-bd72-2bab0e911af8\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.726890    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e48519c7-0cdc-419b-bd72-2bab0e911af8-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e48519c7-0cdc-419b-bd72-2bab0e911af8\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.743871    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e48519c7-0cdc-419b-bd72-2bab0e911af8-kube-api-access\") pod \"installer-9-crc\" (UID: \"e48519c7-0cdc-419b-bd72-2bab0e911af8\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 15:43:26 crc kubenswrapper[4730]: I0320 15:43:26.908625    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 15:43:27 crc kubenswrapper[4730]: I0320 15:43:27.540729    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e894fac3-fa5e-4281-9765-30dea46c6b32" path="/var/lib/kubelet/pods/e894fac3-fa5e-4281-9765-30dea46c6b32/volumes"
Mar 20 15:43:27 crc kubenswrapper[4730]: E0320 15:43:27.924625    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-6mppz" podUID="168c4cbd-3a44-48a5-be95-0eb4ea01d6c8"
Mar 20 15:43:27 crc kubenswrapper[4730]: E0320 15:43:27.924803    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-mbtfk" podUID="d5addb8e-1dbc-41a2-8330-8a97251bd52f"
Mar 20 15:43:27 crc kubenswrapper[4730]: I0320 15:43:27.957736    4730 scope.go:117] "RemoveContainer" containerID="97c7c0bbae68297bff265d2fc691b5c0a2a1e2f15cfbe251cb407d9a746b7e48"
Mar 20 15:43:28 crc kubenswrapper[4730]: E0320 15:43:28.006992    4730 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Mar 20 15:43:28 crc kubenswrapper[4730]: E0320 15:43:28.007545    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hs46z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-flpw2_openshift-marketplace(5a347883-e4f7-4fcd-8920-59519533cf43): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 20 15:43:28 crc kubenswrapper[4730]: E0320 15:43:28.008914    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-flpw2" podUID="5a347883-e4f7-4fcd-8920-59519533cf43"
Mar 20 15:43:28 crc kubenswrapper[4730]: E0320 15:43:28.050228    4730 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Mar 20 15:43:28 crc kubenswrapper[4730]: E0320 15:43:28.050528    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pmf5w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-2z2hv_openshift-marketplace(715cbff8-9674-4896-8deb-54a6e9a8899e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 20 15:43:28 crc kubenswrapper[4730]: E0320 15:43:28.051772    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-2z2hv" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e"
Mar 20 15:43:28 crc kubenswrapper[4730]: E0320 15:43:28.072993    4730 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Mar 20 15:43:28 crc kubenswrapper[4730]: E0320 15:43:28.073267    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rgmhp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-rlnqc_openshift-marketplace(e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 20 15:43:28 crc kubenswrapper[4730]: E0320 15:43:28.074685    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-rlnqc" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98"
Mar 20 15:43:28 crc kubenswrapper[4730]: E0320 15:43:28.171990    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-flpw2" podUID="5a347883-e4f7-4fcd-8920-59519533cf43"
Mar 20 15:43:28 crc kubenswrapper[4730]: E0320 15:43:28.172593    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-rlnqc" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98"
Mar 20 15:43:28 crc kubenswrapper[4730]: E0320 15:43:28.172640    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-2z2hv" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e"
Mar 20 15:43:28 crc kubenswrapper[4730]: I0320 15:43:28.290609    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 20 15:43:28 crc kubenswrapper[4730]: I0320 15:43:28.390658    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 20 15:43:28 crc kubenswrapper[4730]: I0320 15:43:28.440913    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f56868448-2fbxh"]
Mar 20 15:43:28 crc kubenswrapper[4730]: I0320 15:43:28.444117    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-677b48c9fc-4n4h2"]
Mar 20 15:43:28 crc kubenswrapper[4730]: W0320 15:43:28.456917    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod619056a7_dcfd_4038_a060_219937115302.slice/crio-5329a4a78efd925df61b8a787b8f9a45a1ac33c3586857f71eed6f7475be4589 WatchSource:0}: Error finding container 5329a4a78efd925df61b8a787b8f9a45a1ac33c3586857f71eed6f7475be4589: Status 404 returned error can't find the container with id 5329a4a78efd925df61b8a787b8f9a45a1ac33c3586857f71eed6f7475be4589
Mar 20 15:43:28 crc kubenswrapper[4730]: W0320 15:43:28.463758    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d747680_5dde_4793_863a_252a5f67233a.slice/crio-2d0d6a6d61d99c2e38791ba9cd9580b05d3d5ca5c004c4c372aff09d24128220 WatchSource:0}: Error finding container 2d0d6a6d61d99c2e38791ba9cd9580b05d3d5ca5c004c4c372aff09d24128220: Status 404 returned error can't find the container with id 2d0d6a6d61d99c2e38791ba9cd9580b05d3d5ca5c004c4c372aff09d24128220
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.176003    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e48519c7-0cdc-419b-bd72-2bab0e911af8","Type":"ContainerStarted","Data":"d1983b19ac38ba32d4fa20a02bf50a7e57dd7a9e5c61bb3d5cfddbb58ce8788c"}
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.176482    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e48519c7-0cdc-419b-bd72-2bab0e911af8","Type":"ContainerStarted","Data":"5af4b462745de074ee5968bfbd84bbf7129ced0fcb060ef525aacd425c95e3c1"}
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.177705    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh" event={"ID":"9d747680-5dde-4793-863a-252a5f67233a","Type":"ContainerStarted","Data":"6a969344117b0e223a95c25576df0115301c9833e8c4f08723a3caa581f16b8b"}
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.177739    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh" event={"ID":"9d747680-5dde-4793-863a-252a5f67233a","Type":"ContainerStarted","Data":"2d0d6a6d61d99c2e38791ba9cd9580b05d3d5ca5c004c4c372aff09d24128220"}
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.178011    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh"
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.179257    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"78bff99a-9296-41fe-ac5d-b41a183e2349","Type":"ContainerStarted","Data":"5d7f9de408c9d5d877667c8e376ecf1e8e460670d7d56cd105eae027fc2488bf"}
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.179307    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"78bff99a-9296-41fe-ac5d-b41a183e2349","Type":"ContainerStarted","Data":"41294ad67a411ffc3356abec6adfd499182ee81671d1eae6d8ef8df30b5e38f9"}
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.180838    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-677b48c9fc-4n4h2" event={"ID":"619056a7-dcfd-4038-a060-219937115302","Type":"ContainerStarted","Data":"f597a41156cb37b583c222dbf5b174ede3554dd0f42eebcfaae39f06d1dc9a19"}
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.180868    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-677b48c9fc-4n4h2" event={"ID":"619056a7-dcfd-4038-a060-219937115302","Type":"ContainerStarted","Data":"5329a4a78efd925df61b8a787b8f9a45a1ac33c3586857f71eed6f7475be4589"}
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.180947    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-677b48c9fc-4n4h2" podUID="619056a7-dcfd-4038-a060-219937115302" containerName="route-controller-manager" containerID="cri-o://f597a41156cb37b583c222dbf5b174ede3554dd0f42eebcfaae39f06d1dc9a19" gracePeriod=30
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.182184    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-677b48c9fc-4n4h2"
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.183416    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh"
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.185835    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-677b48c9fc-4n4h2"
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.190687    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.190672167 podStartE2EDuration="3.190672167s" podCreationTimestamp="2026-03-20 15:43:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:43:29.189042247 +0000 UTC m=+268.402413616" watchObservedRunningTime="2026-03-20 15:43:29.190672167 +0000 UTC m=+268.404043536"
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.230083    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-677b48c9fc-4n4h2" podStartSLOduration=25.230060956 podStartE2EDuration="25.230060956s" podCreationTimestamp="2026-03-20 15:43:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:43:29.212523548 +0000 UTC m=+268.425894917" watchObservedRunningTime="2026-03-20 15:43:29.230060956 +0000 UTC m=+268.443432335"
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.232517    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh" podStartSLOduration=5.232505241 podStartE2EDuration="5.232505241s" podCreationTimestamp="2026-03-20 15:43:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:43:29.231551682 +0000 UTC m=+268.444923071" watchObservedRunningTime="2026-03-20 15:43:29.232505241 +0000 UTC m=+268.445876610"
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.254059    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=7.254038542 podStartE2EDuration="7.254038542s" podCreationTimestamp="2026-03-20 15:43:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:43:29.250353849 +0000 UTC m=+268.463725228" watchObservedRunningTime="2026-03-20 15:43:29.254038542 +0000 UTC m=+268.467409911"
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.512712    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-677b48c9fc-4n4h2"
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.545764    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2"]
Mar 20 15:43:29 crc kubenswrapper[4730]: E0320 15:43:29.546078    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="619056a7-dcfd-4038-a060-219937115302" containerName="route-controller-manager"
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.546092    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="619056a7-dcfd-4038-a060-219937115302" containerName="route-controller-manager"
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.546349    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="619056a7-dcfd-4038-a060-219937115302" containerName="route-controller-manager"
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.547181    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2"
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.552467    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2"]
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.569935    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/619056a7-dcfd-4038-a060-219937115302-client-ca\") pod \"619056a7-dcfd-4038-a060-219937115302\" (UID: \"619056a7-dcfd-4038-a060-219937115302\") "
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.570064    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckj9c\" (UniqueName: \"kubernetes.io/projected/619056a7-dcfd-4038-a060-219937115302-kube-api-access-ckj9c\") pod \"619056a7-dcfd-4038-a060-219937115302\" (UID: \"619056a7-dcfd-4038-a060-219937115302\") "
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.570159    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/619056a7-dcfd-4038-a060-219937115302-config\") pod \"619056a7-dcfd-4038-a060-219937115302\" (UID: \"619056a7-dcfd-4038-a060-219937115302\") "
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.570194    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/619056a7-dcfd-4038-a060-219937115302-serving-cert\") pod \"619056a7-dcfd-4038-a060-219937115302\" (UID: \"619056a7-dcfd-4038-a060-219937115302\") "
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.573283    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/619056a7-dcfd-4038-a060-219937115302-client-ca" (OuterVolumeSpecName: "client-ca") pod "619056a7-dcfd-4038-a060-219937115302" (UID: "619056a7-dcfd-4038-a060-219937115302"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.573294    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/619056a7-dcfd-4038-a060-219937115302-config" (OuterVolumeSpecName: "config") pod "619056a7-dcfd-4038-a060-219937115302" (UID: "619056a7-dcfd-4038-a060-219937115302"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.576576    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/619056a7-dcfd-4038-a060-219937115302-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "619056a7-dcfd-4038-a060-219937115302" (UID: "619056a7-dcfd-4038-a060-219937115302"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.579557    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/619056a7-dcfd-4038-a060-219937115302-kube-api-access-ckj9c" (OuterVolumeSpecName: "kube-api-access-ckj9c") pod "619056a7-dcfd-4038-a060-219937115302" (UID: "619056a7-dcfd-4038-a060-219937115302"). InnerVolumeSpecName "kube-api-access-ckj9c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.671522    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4d20fab-86cc-44d8-a8b9-c60f6835c5e0-serving-cert\") pod \"route-controller-manager-b75b5f765-8wjw2\" (UID: \"d4d20fab-86cc-44d8-a8b9-c60f6835c5e0\") " pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2"
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.671609    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4d20fab-86cc-44d8-a8b9-c60f6835c5e0-client-ca\") pod \"route-controller-manager-b75b5f765-8wjw2\" (UID: \"d4d20fab-86cc-44d8-a8b9-c60f6835c5e0\") " pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2"
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.671649    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xdnv\" (UniqueName: \"kubernetes.io/projected/d4d20fab-86cc-44d8-a8b9-c60f6835c5e0-kube-api-access-4xdnv\") pod \"route-controller-manager-b75b5f765-8wjw2\" (UID: \"d4d20fab-86cc-44d8-a8b9-c60f6835c5e0\") " pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2"
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.671680    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4d20fab-86cc-44d8-a8b9-c60f6835c5e0-config\") pod \"route-controller-manager-b75b5f765-8wjw2\" (UID: \"d4d20fab-86cc-44d8-a8b9-c60f6835c5e0\") " pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2"
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.671739    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/619056a7-dcfd-4038-a060-219937115302-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.671749    4730 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/619056a7-dcfd-4038-a060-219937115302-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.671757    4730 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/619056a7-dcfd-4038-a060-219937115302-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.671765    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckj9c\" (UniqueName: \"kubernetes.io/projected/619056a7-dcfd-4038-a060-219937115302-kube-api-access-ckj9c\") on node \"crc\" DevicePath \"\""
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.772885    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4d20fab-86cc-44d8-a8b9-c60f6835c5e0-serving-cert\") pod \"route-controller-manager-b75b5f765-8wjw2\" (UID: \"d4d20fab-86cc-44d8-a8b9-c60f6835c5e0\") " pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2"
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.773468    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4d20fab-86cc-44d8-a8b9-c60f6835c5e0-client-ca\") pod \"route-controller-manager-b75b5f765-8wjw2\" (UID: \"d4d20fab-86cc-44d8-a8b9-c60f6835c5e0\") " pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2"
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.773492    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xdnv\" (UniqueName: \"kubernetes.io/projected/d4d20fab-86cc-44d8-a8b9-c60f6835c5e0-kube-api-access-4xdnv\") pod \"route-controller-manager-b75b5f765-8wjw2\" (UID: \"d4d20fab-86cc-44d8-a8b9-c60f6835c5e0\") " pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2"
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.773525    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4d20fab-86cc-44d8-a8b9-c60f6835c5e0-config\") pod \"route-controller-manager-b75b5f765-8wjw2\" (UID: \"d4d20fab-86cc-44d8-a8b9-c60f6835c5e0\") " pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2"
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.774681    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4d20fab-86cc-44d8-a8b9-c60f6835c5e0-config\") pod \"route-controller-manager-b75b5f765-8wjw2\" (UID: \"d4d20fab-86cc-44d8-a8b9-c60f6835c5e0\") " pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2"
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.775443    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4d20fab-86cc-44d8-a8b9-c60f6835c5e0-client-ca\") pod \"route-controller-manager-b75b5f765-8wjw2\" (UID: \"d4d20fab-86cc-44d8-a8b9-c60f6835c5e0\") " pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2"
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.789891    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4d20fab-86cc-44d8-a8b9-c60f6835c5e0-serving-cert\") pod \"route-controller-manager-b75b5f765-8wjw2\" (UID: \"d4d20fab-86cc-44d8-a8b9-c60f6835c5e0\") " pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2"
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.796100    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xdnv\" (UniqueName: \"kubernetes.io/projected/d4d20fab-86cc-44d8-a8b9-c60f6835c5e0-kube-api-access-4xdnv\") pod \"route-controller-manager-b75b5f765-8wjw2\" (UID: \"d4d20fab-86cc-44d8-a8b9-c60f6835c5e0\") " pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2"
Mar 20 15:43:29 crc kubenswrapper[4730]: I0320 15:43:29.866809    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2"
Mar 20 15:43:30 crc kubenswrapper[4730]: I0320 15:43:30.185601    4730 generic.go:334] "Generic (PLEG): container finished" podID="619056a7-dcfd-4038-a060-219937115302" containerID="f597a41156cb37b583c222dbf5b174ede3554dd0f42eebcfaae39f06d1dc9a19" exitCode=0
Mar 20 15:43:30 crc kubenswrapper[4730]: I0320 15:43:30.185664    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-677b48c9fc-4n4h2" event={"ID":"619056a7-dcfd-4038-a060-219937115302","Type":"ContainerDied","Data":"f597a41156cb37b583c222dbf5b174ede3554dd0f42eebcfaae39f06d1dc9a19"}
Mar 20 15:43:30 crc kubenswrapper[4730]: I0320 15:43:30.186012    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-677b48c9fc-4n4h2" event={"ID":"619056a7-dcfd-4038-a060-219937115302","Type":"ContainerDied","Data":"5329a4a78efd925df61b8a787b8f9a45a1ac33c3586857f71eed6f7475be4589"}
Mar 20 15:43:30 crc kubenswrapper[4730]: I0320 15:43:30.186034    4730 scope.go:117] "RemoveContainer" containerID="f597a41156cb37b583c222dbf5b174ede3554dd0f42eebcfaae39f06d1dc9a19"
Mar 20 15:43:30 crc kubenswrapper[4730]: I0320 15:43:30.185725    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-677b48c9fc-4n4h2"
Mar 20 15:43:30 crc kubenswrapper[4730]: I0320 15:43:30.188103    4730 generic.go:334] "Generic (PLEG): container finished" podID="78bff99a-9296-41fe-ac5d-b41a183e2349" containerID="5d7f9de408c9d5d877667c8e376ecf1e8e460670d7d56cd105eae027fc2488bf" exitCode=0
Mar 20 15:43:30 crc kubenswrapper[4730]: I0320 15:43:30.188185    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"78bff99a-9296-41fe-ac5d-b41a183e2349","Type":"ContainerDied","Data":"5d7f9de408c9d5d877667c8e376ecf1e8e460670d7d56cd105eae027fc2488bf"}
Mar 20 15:43:30 crc kubenswrapper[4730]: I0320 15:43:30.198956    4730 scope.go:117] "RemoveContainer" containerID="f597a41156cb37b583c222dbf5b174ede3554dd0f42eebcfaae39f06d1dc9a19"
Mar 20 15:43:30 crc kubenswrapper[4730]: E0320 15:43:30.199280    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f597a41156cb37b583c222dbf5b174ede3554dd0f42eebcfaae39f06d1dc9a19\": container with ID starting with f597a41156cb37b583c222dbf5b174ede3554dd0f42eebcfaae39f06d1dc9a19 not found: ID does not exist" containerID="f597a41156cb37b583c222dbf5b174ede3554dd0f42eebcfaae39f06d1dc9a19"
Mar 20 15:43:30 crc kubenswrapper[4730]: I0320 15:43:30.199332    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f597a41156cb37b583c222dbf5b174ede3554dd0f42eebcfaae39f06d1dc9a19"} err="failed to get container status \"f597a41156cb37b583c222dbf5b174ede3554dd0f42eebcfaae39f06d1dc9a19\": rpc error: code = NotFound desc = could not find container \"f597a41156cb37b583c222dbf5b174ede3554dd0f42eebcfaae39f06d1dc9a19\": container with ID starting with f597a41156cb37b583c222dbf5b174ede3554dd0f42eebcfaae39f06d1dc9a19 not found: ID does not exist"
Mar 20 15:43:30 crc kubenswrapper[4730]: I0320 15:43:30.381241    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-677b48c9fc-4n4h2"]
Mar 20 15:43:30 crc kubenswrapper[4730]: I0320 15:43:30.384617    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-677b48c9fc-4n4h2"]
Mar 20 15:43:30 crc kubenswrapper[4730]: I0320 15:43:30.464874    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2"]
Mar 20 15:43:30 crc kubenswrapper[4730]: W0320 15:43:30.472433    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4d20fab_86cc_44d8_a8b9_c60f6835c5e0.slice/crio-4e99a4704ba10afb385846d96e0db2ec14bcce1391fc5cd9c5fede455f436bf0 WatchSource:0}: Error finding container 4e99a4704ba10afb385846d96e0db2ec14bcce1391fc5cd9c5fede455f436bf0: Status 404 returned error can't find the container with id 4e99a4704ba10afb385846d96e0db2ec14bcce1391fc5cd9c5fede455f436bf0
Mar 20 15:43:31 crc kubenswrapper[4730]: I0320 15:43:31.196229    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2" event={"ID":"d4d20fab-86cc-44d8-a8b9-c60f6835c5e0","Type":"ContainerStarted","Data":"34175b3ad56804c8ddffbef2e3fafd18e67e87b94833ec1a23054a68de4fe0be"}
Mar 20 15:43:31 crc kubenswrapper[4730]: I0320 15:43:31.196769    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2" event={"ID":"d4d20fab-86cc-44d8-a8b9-c60f6835c5e0","Type":"ContainerStarted","Data":"4e99a4704ba10afb385846d96e0db2ec14bcce1391fc5cd9c5fede455f436bf0"}
Mar 20 15:43:31 crc kubenswrapper[4730]: I0320 15:43:31.196800    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2"
Mar 20 15:43:31 crc kubenswrapper[4730]: I0320 15:43:31.202189    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2"
Mar 20 15:43:31 crc kubenswrapper[4730]: I0320 15:43:31.218505    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2" podStartSLOduration=7.218484886 podStartE2EDuration="7.218484886s" podCreationTimestamp="2026-03-20 15:43:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:43:31.214486753 +0000 UTC m=+270.427858142" watchObservedRunningTime="2026-03-20 15:43:31.218484886 +0000 UTC m=+270.431856265"
Mar 20 15:43:31 crc kubenswrapper[4730]: I0320 15:43:31.447134    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 20 15:43:31 crc kubenswrapper[4730]: I0320 15:43:31.496344    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78bff99a-9296-41fe-ac5d-b41a183e2349-kube-api-access\") pod \"78bff99a-9296-41fe-ac5d-b41a183e2349\" (UID: \"78bff99a-9296-41fe-ac5d-b41a183e2349\") "
Mar 20 15:43:31 crc kubenswrapper[4730]: I0320 15:43:31.496488    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78bff99a-9296-41fe-ac5d-b41a183e2349-kubelet-dir\") pod \"78bff99a-9296-41fe-ac5d-b41a183e2349\" (UID: \"78bff99a-9296-41fe-ac5d-b41a183e2349\") "
Mar 20 15:43:31 crc kubenswrapper[4730]: I0320 15:43:31.496576    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78bff99a-9296-41fe-ac5d-b41a183e2349-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "78bff99a-9296-41fe-ac5d-b41a183e2349" (UID: "78bff99a-9296-41fe-ac5d-b41a183e2349"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 15:43:31 crc kubenswrapper[4730]: I0320 15:43:31.496849    4730 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78bff99a-9296-41fe-ac5d-b41a183e2349-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 20 15:43:31 crc kubenswrapper[4730]: I0320 15:43:31.505928    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78bff99a-9296-41fe-ac5d-b41a183e2349-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "78bff99a-9296-41fe-ac5d-b41a183e2349" (UID: "78bff99a-9296-41fe-ac5d-b41a183e2349"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:43:31 crc kubenswrapper[4730]: I0320 15:43:31.543801    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="619056a7-dcfd-4038-a060-219937115302" path="/var/lib/kubelet/pods/619056a7-dcfd-4038-a060-219937115302/volumes"
Mar 20 15:43:31 crc kubenswrapper[4730]: I0320 15:43:31.597891    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78bff99a-9296-41fe-ac5d-b41a183e2349-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 15:43:32 crc kubenswrapper[4730]: I0320 15:43:32.203932    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 20 15:43:32 crc kubenswrapper[4730]: I0320 15:43:32.203962    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"78bff99a-9296-41fe-ac5d-b41a183e2349","Type":"ContainerDied","Data":"41294ad67a411ffc3356abec6adfd499182ee81671d1eae6d8ef8df30b5e38f9"}
Mar 20 15:43:32 crc kubenswrapper[4730]: I0320 15:43:32.205325    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41294ad67a411ffc3356abec6adfd499182ee81671d1eae6d8ef8df30b5e38f9"
Mar 20 15:43:33 crc kubenswrapper[4730]: I0320 15:43:33.204356    4730 csr.go:261] certificate signing request csr-hm6sj is approved, waiting to be issued
Mar 20 15:43:33 crc kubenswrapper[4730]: I0320 15:43:33.213782    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567022-wf5nv" event={"ID":"7d87adfe-3206-4175-8d8f-5a00015cc61e","Type":"ContainerStarted","Data":"bb5b04ddf5d3880ba3c77fa4e7069bd85e272160b1890e28a7de00d43e3a9f9e"}
Mar 20 15:43:33 crc kubenswrapper[4730]: I0320 15:43:33.215102    4730 csr.go:257] certificate signing request csr-hm6sj is issued
Mar 20 15:43:33 crc kubenswrapper[4730]: I0320 15:43:33.230214    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567022-wf5nv" podStartSLOduration=41.712544319 podStartE2EDuration="1m33.230196289s" podCreationTimestamp="2026-03-20 15:42:00 +0000 UTC" firstStartedPulling="2026-03-20 15:42:41.018688017 +0000 UTC m=+220.232059386" lastFinishedPulling="2026-03-20 15:43:32.536339977 +0000 UTC m=+271.749711356" observedRunningTime="2026-03-20 15:43:33.228412224 +0000 UTC m=+272.441783593" watchObservedRunningTime="2026-03-20 15:43:33.230196289 +0000 UTC m=+272.443567658"
Mar 20 15:43:33 crc kubenswrapper[4730]: E0320 15:43:33.260740    4730 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d87adfe_3206_4175_8d8f_5a00015cc61e.slice/crio-bb5b04ddf5d3880ba3c77fa4e7069bd85e272160b1890e28a7de00d43e3a9f9e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d87adfe_3206_4175_8d8f_5a00015cc61e.slice/crio-conmon-bb5b04ddf5d3880ba3c77fa4e7069bd85e272160b1890e28a7de00d43e3a9f9e.scope\": RecentStats: unable to find data in memory cache]"
Mar 20 15:43:34 crc kubenswrapper[4730]: I0320 15:43:34.216369    4730 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-28 20:39:48.298982454 +0000 UTC
Mar 20 15:43:34 crc kubenswrapper[4730]: I0320 15:43:34.216724    4730 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6796h56m14.082261607s for next certificate rotation
Mar 20 15:43:34 crc kubenswrapper[4730]: I0320 15:43:34.435914    4730 generic.go:334] "Generic (PLEG): container finished" podID="7d87adfe-3206-4175-8d8f-5a00015cc61e" containerID="bb5b04ddf5d3880ba3c77fa4e7069bd85e272160b1890e28a7de00d43e3a9f9e" exitCode=0
Mar 20 15:43:34 crc kubenswrapper[4730]: I0320 15:43:34.435962    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567022-wf5nv" event={"ID":"7d87adfe-3206-4175-8d8f-5a00015cc61e","Type":"ContainerDied","Data":"bb5b04ddf5d3880ba3c77fa4e7069bd85e272160b1890e28a7de00d43e3a9f9e"}
Mar 20 15:43:35 crc kubenswrapper[4730]: I0320 15:43:35.292924    4730 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-29 07:33:08.724990939 +0000 UTC
Mar 20 15:43:35 crc kubenswrapper[4730]: I0320 15:43:35.293004    4730 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6807h49m33.431990817s for next certificate rotation
Mar 20 15:43:35 crc kubenswrapper[4730]: I0320 15:43:35.731722    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567022-wf5nv"
Mar 20 15:43:35 crc kubenswrapper[4730]: I0320 15:43:35.832912    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46gxc\" (UniqueName: \"kubernetes.io/projected/7d87adfe-3206-4175-8d8f-5a00015cc61e-kube-api-access-46gxc\") pod \"7d87adfe-3206-4175-8d8f-5a00015cc61e\" (UID: \"7d87adfe-3206-4175-8d8f-5a00015cc61e\") "
Mar 20 15:43:35 crc kubenswrapper[4730]: I0320 15:43:35.858854    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d87adfe-3206-4175-8d8f-5a00015cc61e-kube-api-access-46gxc" (OuterVolumeSpecName: "kube-api-access-46gxc") pod "7d87adfe-3206-4175-8d8f-5a00015cc61e" (UID: "7d87adfe-3206-4175-8d8f-5a00015cc61e"). InnerVolumeSpecName "kube-api-access-46gxc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:43:35 crc kubenswrapper[4730]: I0320 15:43:35.934276    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46gxc\" (UniqueName: \"kubernetes.io/projected/7d87adfe-3206-4175-8d8f-5a00015cc61e-kube-api-access-46gxc\") on node \"crc\" DevicePath \"\""
Mar 20 15:43:36 crc kubenswrapper[4730]: I0320 15:43:36.446431    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567022-wf5nv" event={"ID":"7d87adfe-3206-4175-8d8f-5a00015cc61e","Type":"ContainerDied","Data":"94b5cc48ba667f00e1531e3aeaa43c807fc5eafadb1a111b034b9b327635ca47"}
Mar 20 15:43:36 crc kubenswrapper[4730]: I0320 15:43:36.446483    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94b5cc48ba667f00e1531e3aeaa43c807fc5eafadb1a111b034b9b327635ca47"
Mar 20 15:43:36 crc kubenswrapper[4730]: I0320 15:43:36.446502    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567022-wf5nv"
Mar 20 15:43:42 crc kubenswrapper[4730]: I0320 15:43:42.481548    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6mppz" event={"ID":"168c4cbd-3a44-48a5-be95-0eb4ea01d6c8","Type":"ContainerStarted","Data":"4d4654a93d90cdb600960802fd1dbb00c64ea5360936651b230f4fce570720a5"}
Mar 20 15:43:42 crc kubenswrapper[4730]: I0320 15:43:42.485123    4730 generic.go:334] "Generic (PLEG): container finished" podID="d5addb8e-1dbc-41a2-8330-8a97251bd52f" containerID="027ff3ee79dd3768bc7352d26b5e9a7647079a2f17aa58047546ce0332c5b335" exitCode=0
Mar 20 15:43:42 crc kubenswrapper[4730]: I0320 15:43:42.485169    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbtfk" event={"ID":"d5addb8e-1dbc-41a2-8330-8a97251bd52f","Type":"ContainerDied","Data":"027ff3ee79dd3768bc7352d26b5e9a7647079a2f17aa58047546ce0332c5b335"}
Mar 20 15:43:42 crc kubenswrapper[4730]: I0320 15:43:42.490860    4730 generic.go:334] "Generic (PLEG): container finished" podID="7a118148-49cc-4b61-bb43-44e3ef2c3048" containerID="5b4ec1c83cd975ec260c6743263c1e94e91b48d95b84a27c4a117e322048189c" exitCode=0
Mar 20 15:43:42 crc kubenswrapper[4730]: I0320 15:43:42.490897    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cx74p" event={"ID":"7a118148-49cc-4b61-bb43-44e3ef2c3048","Type":"ContainerDied","Data":"5b4ec1c83cd975ec260c6743263c1e94e91b48d95b84a27c4a117e322048189c"}
Mar 20 15:43:42 crc kubenswrapper[4730]: I0320 15:43:42.880132    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 15:43:42 crc kubenswrapper[4730]: I0320 15:43:42.880194    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 15:43:43 crc kubenswrapper[4730]: I0320 15:43:43.499305    4730 generic.go:334] "Generic (PLEG): container finished" podID="5a347883-e4f7-4fcd-8920-59519533cf43" containerID="9a83b1f8dc654ef4f4276c65729d5eabf19cc5bf1944836a69eeb1d195139aba" exitCode=0
Mar 20 15:43:43 crc kubenswrapper[4730]: I0320 15:43:43.499348    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-flpw2" event={"ID":"5a347883-e4f7-4fcd-8920-59519533cf43","Type":"ContainerDied","Data":"9a83b1f8dc654ef4f4276c65729d5eabf19cc5bf1944836a69eeb1d195139aba"}
Mar 20 15:43:43 crc kubenswrapper[4730]: I0320 15:43:43.504945    4730 generic.go:334] "Generic (PLEG): container finished" podID="168c4cbd-3a44-48a5-be95-0eb4ea01d6c8" containerID="4d4654a93d90cdb600960802fd1dbb00c64ea5360936651b230f4fce570720a5" exitCode=0
Mar 20 15:43:43 crc kubenswrapper[4730]: I0320 15:43:43.504971    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6mppz" event={"ID":"168c4cbd-3a44-48a5-be95-0eb4ea01d6c8","Type":"ContainerDied","Data":"4d4654a93d90cdb600960802fd1dbb00c64ea5360936651b230f4fce570720a5"}
Mar 20 15:43:49 crc kubenswrapper[4730]: I0320 15:43:49.544802    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbtfk" event={"ID":"d5addb8e-1dbc-41a2-8330-8a97251bd52f","Type":"ContainerStarted","Data":"b0b7cf1aa8683df6582d7fa32a0ee12665587f8843d63db7cc45648643eb352c"}
Mar 20 15:43:49 crc kubenswrapper[4730]: I0320 15:43:49.548165    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cx74p" event={"ID":"7a118148-49cc-4b61-bb43-44e3ef2c3048","Type":"ContainerStarted","Data":"70a4d7cafba64be4931c32255724c04bc6838f8411a18d46af296704cb3005d7"}
Mar 20 15:43:49 crc kubenswrapper[4730]: I0320 15:43:49.550014    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-flpw2" event={"ID":"5a347883-e4f7-4fcd-8920-59519533cf43","Type":"ContainerStarted","Data":"8a137f491e123e26bfe8e53249675fb8ec9405c7b00ec70ee4673e9a88e5d6bf"}
Mar 20 15:43:49 crc kubenswrapper[4730]: I0320 15:43:49.552908    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6mppz" event={"ID":"168c4cbd-3a44-48a5-be95-0eb4ea01d6c8","Type":"ContainerStarted","Data":"ea01bd16d25708ee20b82a86c40239ffa37364d97477418fd97a8527e934e439"}
Mar 20 15:43:49 crc kubenswrapper[4730]: I0320 15:43:49.555913    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlnqc" event={"ID":"e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98","Type":"ContainerStarted","Data":"2c3e34ad9ea0b6c3222cf006f08a02a03e69e35e189c65669c0748e767b79f75"}
Mar 20 15:43:49 crc kubenswrapper[4730]: I0320 15:43:49.558078    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8rptq" event={"ID":"558b00fd-2589-4842-8cba-db0cffe8c826","Type":"ContainerStarted","Data":"0b9feaef40e353d64a848dba5e34276e42725c50bac6122fd4b5265fc07ad6a1"}
Mar 20 15:43:49 crc kubenswrapper[4730]: I0320 15:43:49.567553    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qmxvf" event={"ID":"ab6c90a0-1bc1-476d-8526-d1fe438163e3","Type":"ContainerStarted","Data":"1d25cea84c8b33aa09a01d2a67ef03e54e2640ab453060480220fbbf97ebde61"}
Mar 20 15:43:49 crc kubenswrapper[4730]: I0320 15:43:49.577343    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mbtfk" podStartSLOduration=3.583370045 podStartE2EDuration="1m3.577322051s" podCreationTimestamp="2026-03-20 15:42:46 +0000 UTC" firstStartedPulling="2026-03-20 15:42:48.608899116 +0000 UTC m=+227.822270495" lastFinishedPulling="2026-03-20 15:43:48.602851132 +0000 UTC m=+287.816222501" observedRunningTime="2026-03-20 15:43:49.577235248 +0000 UTC m=+288.790606617" watchObservedRunningTime="2026-03-20 15:43:49.577322051 +0000 UTC m=+288.790693420"
Mar 20 15:43:49 crc kubenswrapper[4730]: I0320 15:43:49.584441    4730 generic.go:334] "Generic (PLEG): container finished" podID="715cbff8-9674-4896-8deb-54a6e9a8899e" containerID="33d9320e7f40c8c36e8b7683ba1de97d97d1f3c11216a749d1fb865904354d4e" exitCode=0
Mar 20 15:43:49 crc kubenswrapper[4730]: I0320 15:43:49.584493    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2z2hv" event={"ID":"715cbff8-9674-4896-8deb-54a6e9a8899e","Type":"ContainerDied","Data":"33d9320e7f40c8c36e8b7683ba1de97d97d1f3c11216a749d1fb865904354d4e"}
Mar 20 15:43:49 crc kubenswrapper[4730]: I0320 15:43:49.603468    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cx74p" podStartSLOduration=2.439415182 podStartE2EDuration="1m2.603452863s" podCreationTimestamp="2026-03-20 15:42:47 +0000 UTC" firstStartedPulling="2026-03-20 15:42:48.57655372 +0000 UTC m=+227.789925089" lastFinishedPulling="2026-03-20 15:43:48.740591401 +0000 UTC m=+287.953962770" observedRunningTime="2026-03-20 15:43:49.597299184 +0000 UTC m=+288.810670553" watchObservedRunningTime="2026-03-20 15:43:49.603452863 +0000 UTC m=+288.816824232"
Mar 20 15:43:49 crc kubenswrapper[4730]: I0320 15:43:49.649883    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6mppz" podStartSLOduration=2.211895153 podStartE2EDuration="1m2.649863038s" podCreationTimestamp="2026-03-20 15:42:47 +0000 UTC" firstStartedPulling="2026-03-20 15:42:48.646603687 +0000 UTC m=+227.859975056" lastFinishedPulling="2026-03-20 15:43:49.084571572 +0000 UTC m=+288.297942941" observedRunningTime="2026-03-20 15:43:49.648991201 +0000 UTC m=+288.862362570" watchObservedRunningTime="2026-03-20 15:43:49.649863038 +0000 UTC m=+288.863234407"
Mar 20 15:43:49 crc kubenswrapper[4730]: I0320 15:43:49.753339    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-flpw2" podStartSLOduration=3.563624451 podStartE2EDuration="1m1.753317514s" podCreationTimestamp="2026-03-20 15:42:48 +0000 UTC" firstStartedPulling="2026-03-20 15:42:50.819201556 +0000 UTC m=+230.032572925" lastFinishedPulling="2026-03-20 15:43:49.008894619 +0000 UTC m=+288.222265988" observedRunningTime="2026-03-20 15:43:49.740335696 +0000 UTC m=+288.953707075" watchObservedRunningTime="2026-03-20 15:43:49.753317514 +0000 UTC m=+288.966688903"
Mar 20 15:43:50 crc kubenswrapper[4730]: I0320 15:43:50.592380    4730 generic.go:334] "Generic (PLEG): container finished" podID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" containerID="2c3e34ad9ea0b6c3222cf006f08a02a03e69e35e189c65669c0748e767b79f75" exitCode=0
Mar 20 15:43:50 crc kubenswrapper[4730]: I0320 15:43:50.592440    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlnqc" event={"ID":"e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98","Type":"ContainerDied","Data":"2c3e34ad9ea0b6c3222cf006f08a02a03e69e35e189c65669c0748e767b79f75"}
Mar 20 15:43:50 crc kubenswrapper[4730]: I0320 15:43:50.596840    4730 generic.go:334] "Generic (PLEG): container finished" podID="558b00fd-2589-4842-8cba-db0cffe8c826" containerID="0b9feaef40e353d64a848dba5e34276e42725c50bac6122fd4b5265fc07ad6a1" exitCode=0
Mar 20 15:43:50 crc kubenswrapper[4730]: I0320 15:43:50.596907    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8rptq" event={"ID":"558b00fd-2589-4842-8cba-db0cffe8c826","Type":"ContainerDied","Data":"0b9feaef40e353d64a848dba5e34276e42725c50bac6122fd4b5265fc07ad6a1"}
Mar 20 15:43:50 crc kubenswrapper[4730]: I0320 15:43:50.602971    4730 generic.go:334] "Generic (PLEG): container finished" podID="ab6c90a0-1bc1-476d-8526-d1fe438163e3" containerID="1d25cea84c8b33aa09a01d2a67ef03e54e2640ab453060480220fbbf97ebde61" exitCode=0
Mar 20 15:43:50 crc kubenswrapper[4730]: I0320 15:43:50.603007    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qmxvf" event={"ID":"ab6c90a0-1bc1-476d-8526-d1fe438163e3","Type":"ContainerDied","Data":"1d25cea84c8b33aa09a01d2a67ef03e54e2640ab453060480220fbbf97ebde61"}
Mar 20 15:43:53 crc kubenswrapper[4730]: I0320 15:43:53.632224    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2z2hv" event={"ID":"715cbff8-9674-4896-8deb-54a6e9a8899e","Type":"ContainerStarted","Data":"483eb6bb311253e2717943e6bf3c10d5b83b566c99009c1e72adb9334e3302ee"}
Mar 20 15:43:53 crc kubenswrapper[4730]: I0320 15:43:53.649708    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2z2hv" podStartSLOduration=2.890834289 podStartE2EDuration="1m4.649687931s" podCreationTimestamp="2026-03-20 15:42:49 +0000 UTC" firstStartedPulling="2026-03-20 15:42:50.825041296 +0000 UTC m=+230.038412665" lastFinishedPulling="2026-03-20 15:43:52.583894938 +0000 UTC m=+291.797266307" observedRunningTime="2026-03-20 15:43:53.647644508 +0000 UTC m=+292.861015897" watchObservedRunningTime="2026-03-20 15:43:53.649687931 +0000 UTC m=+292.863059300"
Mar 20 15:43:55 crc kubenswrapper[4730]: I0320 15:43:55.643517    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qmxvf" event={"ID":"ab6c90a0-1bc1-476d-8526-d1fe438163e3","Type":"ContainerStarted","Data":"9a52c050b4986758df8c76456386e12221d78e4ea6fa2b1c10d15807ad001b19"}
Mar 20 15:43:56 crc kubenswrapper[4730]: I0320 15:43:56.666607    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qmxvf" podStartSLOduration=3.930535787 podStartE2EDuration="1m6.666587716s" podCreationTimestamp="2026-03-20 15:42:50 +0000 UTC" firstStartedPulling="2026-03-20 15:42:51.926292572 +0000 UTC m=+231.139663931" lastFinishedPulling="2026-03-20 15:43:54.662344491 +0000 UTC m=+293.875715860" observedRunningTime="2026-03-20 15:43:56.664639506 +0000 UTC m=+295.878010885" watchObservedRunningTime="2026-03-20 15:43:56.666587716 +0000 UTC m=+295.879959095"
Mar 20 15:43:57 crc kubenswrapper[4730]: I0320 15:43:57.365581    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mbtfk"
Mar 20 15:43:57 crc kubenswrapper[4730]: I0320 15:43:57.365643    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mbtfk"
Mar 20 15:43:57 crc kubenswrapper[4730]: I0320 15:43:57.497734    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6mppz"
Mar 20 15:43:57 crc kubenswrapper[4730]: I0320 15:43:57.497828    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6mppz"
Mar 20 15:43:57 crc kubenswrapper[4730]: I0320 15:43:57.656984    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8rptq" event={"ID":"558b00fd-2589-4842-8cba-db0cffe8c826","Type":"ContainerStarted","Data":"be8ad8b5a0faeab783b7ccafb3517fc045687f3f5ccf91534d7b0f3ee31c621e"}
Mar 20 15:43:57 crc kubenswrapper[4730]: I0320 15:43:57.687592    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8rptq" podStartSLOduration=3.689957659 podStartE2EDuration="1m7.687572191s" podCreationTimestamp="2026-03-20 15:42:50 +0000 UTC" firstStartedPulling="2026-03-20 15:42:51.916345706 +0000 UTC m=+231.129717065" lastFinishedPulling="2026-03-20 15:43:55.913960228 +0000 UTC m=+295.127331597" observedRunningTime="2026-03-20 15:43:57.684357843 +0000 UTC m=+296.897729292" watchObservedRunningTime="2026-03-20 15:43:57.687572191 +0000 UTC m=+296.900943560"
Mar 20 15:43:57 crc kubenswrapper[4730]: I0320 15:43:57.730063    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cx74p"
Mar 20 15:43:57 crc kubenswrapper[4730]: I0320 15:43:57.730117    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cx74p"
Mar 20 15:43:58 crc kubenswrapper[4730]: I0320 15:43:58.066977    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mbtfk"
Mar 20 15:43:58 crc kubenswrapper[4730]: I0320 15:43:58.067083    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6mppz"
Mar 20 15:43:58 crc kubenswrapper[4730]: I0320 15:43:58.080963    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cx74p"
Mar 20 15:43:58 crc kubenswrapper[4730]: I0320 15:43:58.110555    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-st79s"]
Mar 20 15:43:58 crc kubenswrapper[4730]: I0320 15:43:58.146967    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6mppz"
Mar 20 15:43:58 crc kubenswrapper[4730]: I0320 15:43:58.147525    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mbtfk"
Mar 20 15:43:58 crc kubenswrapper[4730]: I0320 15:43:58.738291    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cx74p"
Mar 20 15:43:59 crc kubenswrapper[4730]: I0320 15:43:59.106592    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-flpw2"
Mar 20 15:43:59 crc kubenswrapper[4730]: I0320 15:43:59.106659    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-flpw2"
Mar 20 15:43:59 crc kubenswrapper[4730]: I0320 15:43:59.150770    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-flpw2"
Mar 20 15:43:59 crc kubenswrapper[4730]: I0320 15:43:59.603566    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2z2hv"
Mar 20 15:43:59 crc kubenswrapper[4730]: I0320 15:43:59.603633    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2z2hv"
Mar 20 15:43:59 crc kubenswrapper[4730]: I0320 15:43:59.641897    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2z2hv"
Mar 20 15:43:59 crc kubenswrapper[4730]: I0320 15:43:59.696862    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlnqc" event={"ID":"e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98","Type":"ContainerStarted","Data":"3303b366b010494b00cff91f0adf58b15d0be7946981888a990192d9cd69b3fa"}
Mar 20 15:43:59 crc kubenswrapper[4730]: I0320 15:43:59.717512    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rlnqc" podStartSLOduration=4.247728806 podStartE2EDuration="1m13.717490345s" podCreationTimestamp="2026-03-20 15:42:46 +0000 UTC" firstStartedPulling="2026-03-20 15:42:48.538620912 +0000 UTC m=+227.751992281" lastFinishedPulling="2026-03-20 15:43:58.008382451 +0000 UTC m=+297.221753820" observedRunningTime="2026-03-20 15:43:59.716291448 +0000 UTC m=+298.929662817" watchObservedRunningTime="2026-03-20 15:43:59.717490345 +0000 UTC m=+298.930861714"
Mar 20 15:43:59 crc kubenswrapper[4730]: I0320 15:43:59.745681    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2z2hv"
Mar 20 15:43:59 crc kubenswrapper[4730]: I0320 15:43:59.750575    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-flpw2"
Mar 20 15:44:00 crc kubenswrapper[4730]: I0320 15:44:00.134234    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567024-s2r9c"]
Mar 20 15:44:00 crc kubenswrapper[4730]: E0320 15:44:00.134476    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78bff99a-9296-41fe-ac5d-b41a183e2349" containerName="pruner"
Mar 20 15:44:00 crc kubenswrapper[4730]: I0320 15:44:00.134488    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="78bff99a-9296-41fe-ac5d-b41a183e2349" containerName="pruner"
Mar 20 15:44:00 crc kubenswrapper[4730]: E0320 15:44:00.134512    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d87adfe-3206-4175-8d8f-5a00015cc61e" containerName="oc"
Mar 20 15:44:00 crc kubenswrapper[4730]: I0320 15:44:00.134518    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d87adfe-3206-4175-8d8f-5a00015cc61e" containerName="oc"
Mar 20 15:44:00 crc kubenswrapper[4730]: I0320 15:44:00.134610    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="78bff99a-9296-41fe-ac5d-b41a183e2349" containerName="pruner"
Mar 20 15:44:00 crc kubenswrapper[4730]: I0320 15:44:00.134626    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d87adfe-3206-4175-8d8f-5a00015cc61e" containerName="oc"
Mar 20 15:44:00 crc kubenswrapper[4730]: I0320 15:44:00.134982    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567024-s2r9c"
Mar 20 15:44:00 crc kubenswrapper[4730]: I0320 15:44:00.142624    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 15:44:00 crc kubenswrapper[4730]: I0320 15:44:00.142704    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 15:44:00 crc kubenswrapper[4730]: I0320 15:44:00.143704    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 15:44:00 crc kubenswrapper[4730]: I0320 15:44:00.144355    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567024-s2r9c"]
Mar 20 15:44:00 crc kubenswrapper[4730]: I0320 15:44:00.292239    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdc6x\" (UniqueName: \"kubernetes.io/projected/3f093381-3bf4-49ff-beb4-f44aa012c521-kube-api-access-kdc6x\") pod \"auto-csr-approver-29567024-s2r9c\" (UID: \"3f093381-3bf4-49ff-beb4-f44aa012c521\") " pod="openshift-infra/auto-csr-approver-29567024-s2r9c"
Mar 20 15:44:00 crc kubenswrapper[4730]: I0320 15:44:00.361116    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6mppz"]
Mar 20 15:44:00 crc kubenswrapper[4730]: I0320 15:44:00.361353    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6mppz" podUID="168c4cbd-3a44-48a5-be95-0eb4ea01d6c8" containerName="registry-server" containerID="cri-o://ea01bd16d25708ee20b82a86c40239ffa37364d97477418fd97a8527e934e439" gracePeriod=2
Mar 20 15:44:00 crc kubenswrapper[4730]: I0320 15:44:00.393461    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdc6x\" (UniqueName: \"kubernetes.io/projected/3f093381-3bf4-49ff-beb4-f44aa012c521-kube-api-access-kdc6x\") pod \"auto-csr-approver-29567024-s2r9c\" (UID: \"3f093381-3bf4-49ff-beb4-f44aa012c521\") " pod="openshift-infra/auto-csr-approver-29567024-s2r9c"
Mar 20 15:44:00 crc kubenswrapper[4730]: I0320 15:44:00.414141    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdc6x\" (UniqueName: \"kubernetes.io/projected/3f093381-3bf4-49ff-beb4-f44aa012c521-kube-api-access-kdc6x\") pod \"auto-csr-approver-29567024-s2r9c\" (UID: \"3f093381-3bf4-49ff-beb4-f44aa012c521\") " pod="openshift-infra/auto-csr-approver-29567024-s2r9c"
Mar 20 15:44:00 crc kubenswrapper[4730]: I0320 15:44:00.450415    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567024-s2r9c"
Mar 20 15:44:00 crc kubenswrapper[4730]: I0320 15:44:00.495232    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8rptq"
Mar 20 15:44:00 crc kubenswrapper[4730]: I0320 15:44:00.495302    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8rptq"
Mar 20 15:44:00 crc kubenswrapper[4730]: I0320 15:44:00.904205    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567024-s2r9c"]
Mar 20 15:44:00 crc kubenswrapper[4730]: I0320 15:44:00.973048    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qmxvf"
Mar 20 15:44:00 crc kubenswrapper[4730]: I0320 15:44:00.973655    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qmxvf"
Mar 20 15:44:01 crc kubenswrapper[4730]: I0320 15:44:01.366601    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cx74p"]
Mar 20 15:44:01 crc kubenswrapper[4730]: I0320 15:44:01.367056    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cx74p" podUID="7a118148-49cc-4b61-bb43-44e3ef2c3048" containerName="registry-server" containerID="cri-o://70a4d7cafba64be4931c32255724c04bc6838f8411a18d46af296704cb3005d7" gracePeriod=2
Mar 20 15:44:01 crc kubenswrapper[4730]: I0320 15:44:01.550384    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8rptq" podUID="558b00fd-2589-4842-8cba-db0cffe8c826" containerName="registry-server" probeResult="failure" output=<
Mar 20 15:44:01 crc kubenswrapper[4730]:         timeout: failed to connect service ":50051" within 1s
Mar 20 15:44:01 crc kubenswrapper[4730]:  >
Mar 20 15:44:01 crc kubenswrapper[4730]: I0320 15:44:01.707057    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567024-s2r9c" event={"ID":"3f093381-3bf4-49ff-beb4-f44aa012c521","Type":"ContainerStarted","Data":"ae9eb78df15bf57e2c0e8bf27ad609713813b7d995be50840b72aa366815e8fc"}
Mar 20 15:44:01 crc kubenswrapper[4730]: I0320 15:44:01.710005    4730 generic.go:334] "Generic (PLEG): container finished" podID="168c4cbd-3a44-48a5-be95-0eb4ea01d6c8" containerID="ea01bd16d25708ee20b82a86c40239ffa37364d97477418fd97a8527e934e439" exitCode=0
Mar 20 15:44:01 crc kubenswrapper[4730]: I0320 15:44:01.710213    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6mppz" event={"ID":"168c4cbd-3a44-48a5-be95-0eb4ea01d6c8","Type":"ContainerDied","Data":"ea01bd16d25708ee20b82a86c40239ffa37364d97477418fd97a8527e934e439"}
Mar 20 15:44:02 crc kubenswrapper[4730]: I0320 15:44:02.013463    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qmxvf" podUID="ab6c90a0-1bc1-476d-8526-d1fe438163e3" containerName="registry-server" probeResult="failure" output=<
Mar 20 15:44:02 crc kubenswrapper[4730]:         timeout: failed to connect service ":50051" within 1s
Mar 20 15:44:02 crc kubenswrapper[4730]:  >
Mar 20 15:44:02 crc kubenswrapper[4730]: I0320 15:44:02.659750    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6mppz"
Mar 20 15:44:02 crc kubenswrapper[4730]: I0320 15:44:02.718229    4730 generic.go:334] "Generic (PLEG): container finished" podID="7a118148-49cc-4b61-bb43-44e3ef2c3048" containerID="70a4d7cafba64be4931c32255724c04bc6838f8411a18d46af296704cb3005d7" exitCode=0
Mar 20 15:44:02 crc kubenswrapper[4730]: I0320 15:44:02.718285    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cx74p" event={"ID":"7a118148-49cc-4b61-bb43-44e3ef2c3048","Type":"ContainerDied","Data":"70a4d7cafba64be4931c32255724c04bc6838f8411a18d46af296704cb3005d7"}
Mar 20 15:44:02 crc kubenswrapper[4730]: I0320 15:44:02.722510    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6mppz" event={"ID":"168c4cbd-3a44-48a5-be95-0eb4ea01d6c8","Type":"ContainerDied","Data":"b24679ab1f6c7ce28e2ed00c17a4988d013e4500b53404671b46ef5509b85dc8"}
Mar 20 15:44:02 crc kubenswrapper[4730]: I0320 15:44:02.722565    4730 scope.go:117] "RemoveContainer" containerID="ea01bd16d25708ee20b82a86c40239ffa37364d97477418fd97a8527e934e439"
Mar 20 15:44:02 crc kubenswrapper[4730]: I0320 15:44:02.722618    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6mppz"
Mar 20 15:44:02 crc kubenswrapper[4730]: I0320 15:44:02.723969    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwqvn\" (UniqueName: \"kubernetes.io/projected/168c4cbd-3a44-48a5-be95-0eb4ea01d6c8-kube-api-access-vwqvn\") pod \"168c4cbd-3a44-48a5-be95-0eb4ea01d6c8\" (UID: \"168c4cbd-3a44-48a5-be95-0eb4ea01d6c8\") "
Mar 20 15:44:02 crc kubenswrapper[4730]: I0320 15:44:02.724039    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/168c4cbd-3a44-48a5-be95-0eb4ea01d6c8-utilities\") pod \"168c4cbd-3a44-48a5-be95-0eb4ea01d6c8\" (UID: \"168c4cbd-3a44-48a5-be95-0eb4ea01d6c8\") "
Mar 20 15:44:02 crc kubenswrapper[4730]: I0320 15:44:02.724060    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/168c4cbd-3a44-48a5-be95-0eb4ea01d6c8-catalog-content\") pod \"168c4cbd-3a44-48a5-be95-0eb4ea01d6c8\" (UID: \"168c4cbd-3a44-48a5-be95-0eb4ea01d6c8\") "
Mar 20 15:44:02 crc kubenswrapper[4730]: I0320 15:44:02.727670    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/168c4cbd-3a44-48a5-be95-0eb4ea01d6c8-utilities" (OuterVolumeSpecName: "utilities") pod "168c4cbd-3a44-48a5-be95-0eb4ea01d6c8" (UID: "168c4cbd-3a44-48a5-be95-0eb4ea01d6c8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:44:02 crc kubenswrapper[4730]: I0320 15:44:02.731421    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/168c4cbd-3a44-48a5-be95-0eb4ea01d6c8-kube-api-access-vwqvn" (OuterVolumeSpecName: "kube-api-access-vwqvn") pod "168c4cbd-3a44-48a5-be95-0eb4ea01d6c8" (UID: "168c4cbd-3a44-48a5-be95-0eb4ea01d6c8"). InnerVolumeSpecName "kube-api-access-vwqvn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:44:02 crc kubenswrapper[4730]: I0320 15:44:02.737667    4730 scope.go:117] "RemoveContainer" containerID="4d4654a93d90cdb600960802fd1dbb00c64ea5360936651b230f4fce570720a5"
Mar 20 15:44:02 crc kubenswrapper[4730]: I0320 15:44:02.755913    4730 scope.go:117] "RemoveContainer" containerID="f14d75be5024f0d9fd4c3cf59a10c4fbb452ecc1d6a3188f6fd40ab5bbb8ffe9"
Mar 20 15:44:02 crc kubenswrapper[4730]: I0320 15:44:02.766122    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2z2hv"]
Mar 20 15:44:02 crc kubenswrapper[4730]: I0320 15:44:02.766369    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2z2hv" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" containerName="registry-server" containerID="cri-o://483eb6bb311253e2717943e6bf3c10d5b83b566c99009c1e72adb9334e3302ee" gracePeriod=2
Mar 20 15:44:02 crc kubenswrapper[4730]: I0320 15:44:02.787479    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/168c4cbd-3a44-48a5-be95-0eb4ea01d6c8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "168c4cbd-3a44-48a5-be95-0eb4ea01d6c8" (UID: "168c4cbd-3a44-48a5-be95-0eb4ea01d6c8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:44:02 crc kubenswrapper[4730]: I0320 15:44:02.824941    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwqvn\" (UniqueName: \"kubernetes.io/projected/168c4cbd-3a44-48a5-be95-0eb4ea01d6c8-kube-api-access-vwqvn\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:02 crc kubenswrapper[4730]: I0320 15:44:02.824972    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/168c4cbd-3a44-48a5-be95-0eb4ea01d6c8-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:02 crc kubenswrapper[4730]: I0320 15:44:02.824981    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/168c4cbd-3a44-48a5-be95-0eb4ea01d6c8-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:03 crc kubenswrapper[4730]: I0320 15:44:03.729233    4730 generic.go:334] "Generic (PLEG): container finished" podID="715cbff8-9674-4896-8deb-54a6e9a8899e" containerID="483eb6bb311253e2717943e6bf3c10d5b83b566c99009c1e72adb9334e3302ee" exitCode=0
Mar 20 15:44:03 crc kubenswrapper[4730]: I0320 15:44:03.730341    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2z2hv" event={"ID":"715cbff8-9674-4896-8deb-54a6e9a8899e","Type":"ContainerDied","Data":"483eb6bb311253e2717943e6bf3c10d5b83b566c99009c1e72adb9334e3302ee"}
Mar 20 15:44:03 crc kubenswrapper[4730]: I0320 15:44:03.758632    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6mppz"]
Mar 20 15:44:03 crc kubenswrapper[4730]: I0320 15:44:03.763628    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6mppz"]
Mar 20 15:44:03 crc kubenswrapper[4730]: E0320 15:44:03.774262    4730 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod168c4cbd_3a44_48a5_be95_0eb4ea01d6c8.slice/crio-b24679ab1f6c7ce28e2ed00c17a4988d013e4500b53404671b46ef5509b85dc8\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod168c4cbd_3a44_48a5_be95_0eb4ea01d6c8.slice\": RecentStats: unable to find data in memory cache]"
Mar 20 15:44:03 crc kubenswrapper[4730]: I0320 15:44:03.909100    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cx74p"
Mar 20 15:44:04 crc kubenswrapper[4730]: I0320 15:44:04.043182    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5sxl\" (UniqueName: \"kubernetes.io/projected/7a118148-49cc-4b61-bb43-44e3ef2c3048-kube-api-access-v5sxl\") pod \"7a118148-49cc-4b61-bb43-44e3ef2c3048\" (UID: \"7a118148-49cc-4b61-bb43-44e3ef2c3048\") "
Mar 20 15:44:04 crc kubenswrapper[4730]: I0320 15:44:04.043306    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a118148-49cc-4b61-bb43-44e3ef2c3048-catalog-content\") pod \"7a118148-49cc-4b61-bb43-44e3ef2c3048\" (UID: \"7a118148-49cc-4b61-bb43-44e3ef2c3048\") "
Mar 20 15:44:04 crc kubenswrapper[4730]: I0320 15:44:04.043350    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a118148-49cc-4b61-bb43-44e3ef2c3048-utilities\") pod \"7a118148-49cc-4b61-bb43-44e3ef2c3048\" (UID: \"7a118148-49cc-4b61-bb43-44e3ef2c3048\") "
Mar 20 15:44:04 crc kubenswrapper[4730]: I0320 15:44:04.044372    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a118148-49cc-4b61-bb43-44e3ef2c3048-utilities" (OuterVolumeSpecName: "utilities") pod "7a118148-49cc-4b61-bb43-44e3ef2c3048" (UID: "7a118148-49cc-4b61-bb43-44e3ef2c3048"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:44:04 crc kubenswrapper[4730]: I0320 15:44:04.069239    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a118148-49cc-4b61-bb43-44e3ef2c3048-kube-api-access-v5sxl" (OuterVolumeSpecName: "kube-api-access-v5sxl") pod "7a118148-49cc-4b61-bb43-44e3ef2c3048" (UID: "7a118148-49cc-4b61-bb43-44e3ef2c3048"). InnerVolumeSpecName "kube-api-access-v5sxl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:44:04 crc kubenswrapper[4730]: I0320 15:44:04.096264    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a118148-49cc-4b61-bb43-44e3ef2c3048-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a118148-49cc-4b61-bb43-44e3ef2c3048" (UID: "7a118148-49cc-4b61-bb43-44e3ef2c3048"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:44:04 crc kubenswrapper[4730]: I0320 15:44:04.145101    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5sxl\" (UniqueName: \"kubernetes.io/projected/7a118148-49cc-4b61-bb43-44e3ef2c3048-kube-api-access-v5sxl\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:04 crc kubenswrapper[4730]: I0320 15:44:04.145164    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a118148-49cc-4b61-bb43-44e3ef2c3048-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:04 crc kubenswrapper[4730]: I0320 15:44:04.145177    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a118148-49cc-4b61-bb43-44e3ef2c3048-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:04 crc kubenswrapper[4730]: I0320 15:44:04.339092    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f56868448-2fbxh"]
Mar 20 15:44:04 crc kubenswrapper[4730]: I0320 15:44:04.339335    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh" podUID="9d747680-5dde-4793-863a-252a5f67233a" containerName="controller-manager" containerID="cri-o://6a969344117b0e223a95c25576df0115301c9833e8c4f08723a3caa581f16b8b" gracePeriod=30
Mar 20 15:44:04 crc kubenswrapper[4730]: I0320 15:44:04.437066    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2"]
Mar 20 15:44:04 crc kubenswrapper[4730]: I0320 15:44:04.437661    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2" podUID="d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" containerName="route-controller-manager" containerID="cri-o://34175b3ad56804c8ddffbef2e3fafd18e67e87b94833ec1a23054a68de4fe0be" gracePeriod=30
Mar 20 15:44:04 crc kubenswrapper[4730]: I0320 15:44:04.737954    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cx74p" event={"ID":"7a118148-49cc-4b61-bb43-44e3ef2c3048","Type":"ContainerDied","Data":"189336796f3983b3d071ba1d85e4dc4b864a1692cfe884a4465fbc035616d986"}
Mar 20 15:44:04 crc kubenswrapper[4730]: I0320 15:44:04.738015    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cx74p"
Mar 20 15:44:04 crc kubenswrapper[4730]: I0320 15:44:04.738036    4730 scope.go:117] "RemoveContainer" containerID="70a4d7cafba64be4931c32255724c04bc6838f8411a18d46af296704cb3005d7"
Mar 20 15:44:04 crc kubenswrapper[4730]: I0320 15:44:04.752218    4730 scope.go:117] "RemoveContainer" containerID="5b4ec1c83cd975ec260c6743263c1e94e91b48d95b84a27c4a117e322048189c"
Mar 20 15:44:04 crc kubenswrapper[4730]: I0320 15:44:04.763203    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cx74p"]
Mar 20 15:44:04 crc kubenswrapper[4730]: I0320 15:44:04.769791    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cx74p"]
Mar 20 15:44:04 crc kubenswrapper[4730]: I0320 15:44:04.788545    4730 scope.go:117] "RemoveContainer" containerID="18e38e92f40a21b89290233a7ffe301a018e759b353ad6883c83ed52a1e47762"
Mar 20 15:44:05 crc kubenswrapper[4730]: I0320 15:44:05.540529    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="168c4cbd-3a44-48a5-be95-0eb4ea01d6c8" path="/var/lib/kubelet/pods/168c4cbd-3a44-48a5-be95-0eb4ea01d6c8/volumes"
Mar 20 15:44:05 crc kubenswrapper[4730]: I0320 15:44:05.541227    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a118148-49cc-4b61-bb43-44e3ef2c3048" path="/var/lib/kubelet/pods/7a118148-49cc-4b61-bb43-44e3ef2c3048/volumes"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.317811    4730 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 20 15:44:06 crc kubenswrapper[4730]: E0320 15:44:06.318078    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a118148-49cc-4b61-bb43-44e3ef2c3048" containerName="extract-utilities"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.318090    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a118148-49cc-4b61-bb43-44e3ef2c3048" containerName="extract-utilities"
Mar 20 15:44:06 crc kubenswrapper[4730]: E0320 15:44:06.318099    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="168c4cbd-3a44-48a5-be95-0eb4ea01d6c8" containerName="registry-server"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.318105    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="168c4cbd-3a44-48a5-be95-0eb4ea01d6c8" containerName="registry-server"
Mar 20 15:44:06 crc kubenswrapper[4730]: E0320 15:44:06.318117    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a118148-49cc-4b61-bb43-44e3ef2c3048" containerName="extract-content"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.318122    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a118148-49cc-4b61-bb43-44e3ef2c3048" containerName="extract-content"
Mar 20 15:44:06 crc kubenswrapper[4730]: E0320 15:44:06.318133    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="168c4cbd-3a44-48a5-be95-0eb4ea01d6c8" containerName="extract-content"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.318138    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="168c4cbd-3a44-48a5-be95-0eb4ea01d6c8" containerName="extract-content"
Mar 20 15:44:06 crc kubenswrapper[4730]: E0320 15:44:06.318148    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="168c4cbd-3a44-48a5-be95-0eb4ea01d6c8" containerName="extract-utilities"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.318154    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="168c4cbd-3a44-48a5-be95-0eb4ea01d6c8" containerName="extract-utilities"
Mar 20 15:44:06 crc kubenswrapper[4730]: E0320 15:44:06.318161    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a118148-49cc-4b61-bb43-44e3ef2c3048" containerName="registry-server"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.318170    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a118148-49cc-4b61-bb43-44e3ef2c3048" containerName="registry-server"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.318283    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a118148-49cc-4b61-bb43-44e3ef2c3048" containerName="registry-server"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.318302    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="168c4cbd-3a44-48a5-be95-0eb4ea01d6c8" containerName="registry-server"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.318699    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.319934    4730 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.320103    4730 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 20 15:44:06 crc kubenswrapper[4730]: E0320 15:44:06.320555    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.320608    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad" gracePeriod=15
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.320688    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b" gracePeriod=15
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.320777    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf" gracePeriod=15
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.320594    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4" gracePeriod=15
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.321072    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 15:44:06 crc kubenswrapper[4730]: E0320 15:44:06.321097    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.321108    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 20 15:44:06 crc kubenswrapper[4730]: E0320 15:44:06.321124    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.321134    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 20 15:44:06 crc kubenswrapper[4730]: E0320 15:44:06.321146    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.321155    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 15:44:06 crc kubenswrapper[4730]: E0320 15:44:06.321170    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.321179    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 20 15:44:06 crc kubenswrapper[4730]: E0320 15:44:06.321195    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.321204    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 15:44:06 crc kubenswrapper[4730]: E0320 15:44:06.321215    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.321223    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 20 15:44:06 crc kubenswrapper[4730]: E0320 15:44:06.321237    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.321268    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 15:44:06 crc kubenswrapper[4730]: E0320 15:44:06.321281    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.321290    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.321460    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.321476    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.321486    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.321500    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.321513    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.321523    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.321537    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 20 15:44:06 crc kubenswrapper[4730]: E0320 15:44:06.321673    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.321685    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.321824    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.321838    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.320720    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007" gracePeriod=15
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.375307    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.477921    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.478015    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.478042    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.478065    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.478093    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.478120    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.478167    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.478192    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.543083    4730 patch_prober.go:28] interesting pod/controller-manager-6f56868448-2fbxh container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" start-of-body=
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.543227    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh" podUID="9d747680-5dde-4793-863a-252a5f67233a" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused"
Mar 20 15:44:06 crc kubenswrapper[4730]: E0320 15:44:06.544206    4730 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/events\": dial tcp 38.102.83.162:6443: connect: connection refused" event=<
Mar 20 15:44:06 crc kubenswrapper[4730]:         &Event{ObjectMeta:{controller-manager-6f56868448-2fbxh.189e971b4120d4fc  openshift-controller-manager    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-controller-manager,Name:controller-manager-6f56868448-2fbxh,UID:9d747680-5dde-4793-863a-252a5f67233a,APIVersion:v1,ResourceVersion:29368,FieldPath:spec.containers{controller-manager},},Reason:ProbeError,Message:Readiness probe error: Get "https://10.217.0.61:8443/healthz": dial tcp 10.217.0.61:8443: connect: connection refused
Mar 20 15:44:06 crc kubenswrapper[4730]:         body:
Mar 20 15:44:06 crc kubenswrapper[4730]:         ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:44:06.543135996 +0000 UTC m=+305.756507365,LastTimestamp:2026-03-20 15:44:06.543135996 +0000 UTC m=+305.756507365,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 20 15:44:06 crc kubenswrapper[4730]:  >
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.579582    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.579686    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.579680    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.579729    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.579782    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.579798    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.579826    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.579866    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.579881    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.579902    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.579916    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.580030    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.580037    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.580061    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.580086    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.579980    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.663834    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.677860    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2z2hv"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.678432    4730 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.678698    4730 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.679162    4730 status_manager.go:851] "Failed to get status for pod" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" pod="openshift-marketplace/redhat-marketplace-2z2hv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2z2hv\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:06 crc kubenswrapper[4730]: W0320 15:44:06.683824    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-c34ac7228db74b5c2b59324dec36183935690e57ab3c48f33b950fb7fee99773 WatchSource:0}: Error finding container c34ac7228db74b5c2b59324dec36183935690e57ab3c48f33b950fb7fee99773: Status 404 returned error can't find the container with id c34ac7228db74b5c2b59324dec36183935690e57ab3c48f33b950fb7fee99773
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.751424    4730 generic.go:334] "Generic (PLEG): container finished" podID="9d747680-5dde-4793-863a-252a5f67233a" containerID="6a969344117b0e223a95c25576df0115301c9833e8c4f08723a3caa581f16b8b" exitCode=0
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.751501    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh" event={"ID":"9d747680-5dde-4793-863a-252a5f67233a","Type":"ContainerDied","Data":"6a969344117b0e223a95c25576df0115301c9833e8c4f08723a3caa581f16b8b"}
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.752882    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"c34ac7228db74b5c2b59324dec36183935690e57ab3c48f33b950fb7fee99773"}
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.755285    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.756371    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.757356    4730 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007" exitCode=2
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.759283    4730 generic.go:334] "Generic (PLEG): container finished" podID="d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" containerID="34175b3ad56804c8ddffbef2e3fafd18e67e87b94833ec1a23054a68de4fe0be" exitCode=0
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.759312    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2" event={"ID":"d4d20fab-86cc-44d8-a8b9-c60f6835c5e0","Type":"ContainerDied","Data":"34175b3ad56804c8ddffbef2e3fafd18e67e87b94833ec1a23054a68de4fe0be"}
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.761735    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2z2hv" event={"ID":"715cbff8-9674-4896-8deb-54a6e9a8899e","Type":"ContainerDied","Data":"8cd9a9489d9a0dbe8438d77747b3f3bcf8afc79d1dcd6dcfc2035db6222041b2"}
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.761775    4730 scope.go:117] "RemoveContainer" containerID="483eb6bb311253e2717943e6bf3c10d5b83b566c99009c1e72adb9334e3302ee"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.761797    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2z2hv"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.762468    4730 status_manager.go:851] "Failed to get status for pod" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" pod="openshift-marketplace/redhat-marketplace-2z2hv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2z2hv\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.762948    4730 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.763497    4730 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.780743    4730 scope.go:117] "RemoveContainer" containerID="33d9320e7f40c8c36e8b7683ba1de97d97d1f3c11216a749d1fb865904354d4e"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.781420    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715cbff8-9674-4896-8deb-54a6e9a8899e-catalog-content\") pod \"715cbff8-9674-4896-8deb-54a6e9a8899e\" (UID: \"715cbff8-9674-4896-8deb-54a6e9a8899e\") "
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.781496    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715cbff8-9674-4896-8deb-54a6e9a8899e-utilities\") pod \"715cbff8-9674-4896-8deb-54a6e9a8899e\" (UID: \"715cbff8-9674-4896-8deb-54a6e9a8899e\") "
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.781541    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmf5w\" (UniqueName: \"kubernetes.io/projected/715cbff8-9674-4896-8deb-54a6e9a8899e-kube-api-access-pmf5w\") pod \"715cbff8-9674-4896-8deb-54a6e9a8899e\" (UID: \"715cbff8-9674-4896-8deb-54a6e9a8899e\") "
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.782496    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/715cbff8-9674-4896-8deb-54a6e9a8899e-utilities" (OuterVolumeSpecName: "utilities") pod "715cbff8-9674-4896-8deb-54a6e9a8899e" (UID: "715cbff8-9674-4896-8deb-54a6e9a8899e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.785491    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/715cbff8-9674-4896-8deb-54a6e9a8899e-kube-api-access-pmf5w" (OuterVolumeSpecName: "kube-api-access-pmf5w") pod "715cbff8-9674-4896-8deb-54a6e9a8899e" (UID: "715cbff8-9674-4896-8deb-54a6e9a8899e"). InnerVolumeSpecName "kube-api-access-pmf5w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.796361    4730 scope.go:117] "RemoveContainer" containerID="3d0d0e86eafcfd3f1d67f4c8dfdc39b982cb1033b59bc5dcda037270619199e3"
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.809749    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/715cbff8-9674-4896-8deb-54a6e9a8899e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "715cbff8-9674-4896-8deb-54a6e9a8899e" (UID: "715cbff8-9674-4896-8deb-54a6e9a8899e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.883382    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715cbff8-9674-4896-8deb-54a6e9a8899e-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.883427    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715cbff8-9674-4896-8deb-54a6e9a8899e-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:06 crc kubenswrapper[4730]: I0320 15:44:06.883437    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmf5w\" (UniqueName: \"kubernetes.io/projected/715cbff8-9674-4896-8deb-54a6e9a8899e-kube-api-access-pmf5w\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.086675    4730 status_manager.go:851] "Failed to get status for pod" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" pod="openshift-marketplace/redhat-marketplace-2z2hv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2z2hv\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.087383    4730 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.087854    4730 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.153974    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rlnqc"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.154026    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rlnqc"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.198050    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rlnqc"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.198688    4730 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.199161    4730 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.199446    4730 status_manager.go:851] "Failed to get status for pod" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" pod="openshift-marketplace/certified-operators-rlnqc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rlnqc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.199669    4730 status_manager.go:851] "Failed to get status for pod" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" pod="openshift-marketplace/redhat-marketplace-2z2hv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2z2hv\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.488008    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.489509    4730 status_manager.go:851] "Failed to get status for pod" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" pod="openshift-marketplace/redhat-marketplace-2z2hv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2z2hv\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.489762    4730 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.489943    4730 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.490172    4730 status_manager.go:851] "Failed to get status for pod" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" pod="openshift-marketplace/certified-operators-rlnqc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rlnqc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.490456    4730 status_manager.go:851] "Failed to get status for pod" podUID="9d747680-5dde-4793-863a-252a5f67233a" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f56868448-2fbxh\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.492433    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.492790    4730 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.493218    4730 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.493562    4730 status_manager.go:851] "Failed to get status for pod" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" pod="openshift-marketplace/certified-operators-rlnqc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rlnqc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.493847    4730 status_manager.go:851] "Failed to get status for pod" podUID="9d747680-5dde-4793-863a-252a5f67233a" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f56868448-2fbxh\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.494080    4730 status_manager.go:851] "Failed to get status for pod" podUID="d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-b75b5f765-8wjw2\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.494354    4730 status_manager.go:851] "Failed to get status for pod" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" pod="openshift-marketplace/redhat-marketplace-2z2hv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2z2hv\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.593661    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9d747680-5dde-4793-863a-252a5f67233a-proxy-ca-bundles\") pod \"9d747680-5dde-4793-863a-252a5f67233a\" (UID: \"9d747680-5dde-4793-863a-252a5f67233a\") "
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.593751    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d747680-5dde-4793-863a-252a5f67233a-config\") pod \"9d747680-5dde-4793-863a-252a5f67233a\" (UID: \"9d747680-5dde-4793-863a-252a5f67233a\") "
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.593774    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d747680-5dde-4793-863a-252a5f67233a-client-ca\") pod \"9d747680-5dde-4793-863a-252a5f67233a\" (UID: \"9d747680-5dde-4793-863a-252a5f67233a\") "
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.593807    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d747680-5dde-4793-863a-252a5f67233a-serving-cert\") pod \"9d747680-5dde-4793-863a-252a5f67233a\" (UID: \"9d747680-5dde-4793-863a-252a5f67233a\") "
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.593857    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68xrx\" (UniqueName: \"kubernetes.io/projected/9d747680-5dde-4793-863a-252a5f67233a-kube-api-access-68xrx\") pod \"9d747680-5dde-4793-863a-252a5f67233a\" (UID: \"9d747680-5dde-4793-863a-252a5f67233a\") "
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.594807    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d747680-5dde-4793-863a-252a5f67233a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9d747680-5dde-4793-863a-252a5f67233a" (UID: "9d747680-5dde-4793-863a-252a5f67233a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.594854    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d747680-5dde-4793-863a-252a5f67233a-client-ca" (OuterVolumeSpecName: "client-ca") pod "9d747680-5dde-4793-863a-252a5f67233a" (UID: "9d747680-5dde-4793-863a-252a5f67233a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.595117    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d747680-5dde-4793-863a-252a5f67233a-config" (OuterVolumeSpecName: "config") pod "9d747680-5dde-4793-863a-252a5f67233a" (UID: "9d747680-5dde-4793-863a-252a5f67233a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.599047    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d747680-5dde-4793-863a-252a5f67233a-kube-api-access-68xrx" (OuterVolumeSpecName: "kube-api-access-68xrx") pod "9d747680-5dde-4793-863a-252a5f67233a" (UID: "9d747680-5dde-4793-863a-252a5f67233a"). InnerVolumeSpecName "kube-api-access-68xrx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.599469    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d747680-5dde-4793-863a-252a5f67233a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d747680-5dde-4793-863a-252a5f67233a" (UID: "9d747680-5dde-4793-863a-252a5f67233a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.694969    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4d20fab-86cc-44d8-a8b9-c60f6835c5e0-config\") pod \"d4d20fab-86cc-44d8-a8b9-c60f6835c5e0\" (UID: \"d4d20fab-86cc-44d8-a8b9-c60f6835c5e0\") "
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.695129    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4d20fab-86cc-44d8-a8b9-c60f6835c5e0-serving-cert\") pod \"d4d20fab-86cc-44d8-a8b9-c60f6835c5e0\" (UID: \"d4d20fab-86cc-44d8-a8b9-c60f6835c5e0\") "
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.695170    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xdnv\" (UniqueName: \"kubernetes.io/projected/d4d20fab-86cc-44d8-a8b9-c60f6835c5e0-kube-api-access-4xdnv\") pod \"d4d20fab-86cc-44d8-a8b9-c60f6835c5e0\" (UID: \"d4d20fab-86cc-44d8-a8b9-c60f6835c5e0\") "
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.695320    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4d20fab-86cc-44d8-a8b9-c60f6835c5e0-client-ca\") pod \"d4d20fab-86cc-44d8-a8b9-c60f6835c5e0\" (UID: \"d4d20fab-86cc-44d8-a8b9-c60f6835c5e0\") "
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.695776    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68xrx\" (UniqueName: \"kubernetes.io/projected/9d747680-5dde-4793-863a-252a5f67233a-kube-api-access-68xrx\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.695817    4730 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9d747680-5dde-4793-863a-252a5f67233a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.695838    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d747680-5dde-4793-863a-252a5f67233a-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.695856    4730 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d747680-5dde-4793-863a-252a5f67233a-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.695876    4730 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d747680-5dde-4793-863a-252a5f67233a-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.696139    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4d20fab-86cc-44d8-a8b9-c60f6835c5e0-client-ca" (OuterVolumeSpecName: "client-ca") pod "d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" (UID: "d4d20fab-86cc-44d8-a8b9-c60f6835c5e0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.696178    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4d20fab-86cc-44d8-a8b9-c60f6835c5e0-config" (OuterVolumeSpecName: "config") pod "d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" (UID: "d4d20fab-86cc-44d8-a8b9-c60f6835c5e0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.698023    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4d20fab-86cc-44d8-a8b9-c60f6835c5e0-kube-api-access-4xdnv" (OuterVolumeSpecName: "kube-api-access-4xdnv") pod "d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" (UID: "d4d20fab-86cc-44d8-a8b9-c60f6835c5e0"). InnerVolumeSpecName "kube-api-access-4xdnv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:44:07 crc kubenswrapper[4730]: E0320 15:44:07.698468    4730 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/events\": dial tcp 38.102.83.162:6443: connect: connection refused" event=<
Mar 20 15:44:07 crc kubenswrapper[4730]:         &Event{ObjectMeta:{controller-manager-6f56868448-2fbxh.189e971b4120d4fc  openshift-controller-manager    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-controller-manager,Name:controller-manager-6f56868448-2fbxh,UID:9d747680-5dde-4793-863a-252a5f67233a,APIVersion:v1,ResourceVersion:29368,FieldPath:spec.containers{controller-manager},},Reason:ProbeError,Message:Readiness probe error: Get "https://10.217.0.61:8443/healthz": dial tcp 10.217.0.61:8443: connect: connection refused
Mar 20 15:44:07 crc kubenswrapper[4730]:         body:
Mar 20 15:44:07 crc kubenswrapper[4730]:         ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:44:06.543135996 +0000 UTC m=+305.756507365,LastTimestamp:2026-03-20 15:44:06.543135996 +0000 UTC m=+305.756507365,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 20 15:44:07 crc kubenswrapper[4730]:  >
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.699792    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4d20fab-86cc-44d8-a8b9-c60f6835c5e0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" (UID: "d4d20fab-86cc-44d8-a8b9-c60f6835c5e0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.769806    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"2bd02e0e36575d63682737a4f6e6a51da85e3c01b703e67cc6238582df76514f"}
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.770591    4730 status_manager.go:851] "Failed to get status for pod" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" pod="openshift-marketplace/certified-operators-rlnqc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rlnqc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.771061    4730 status_manager.go:851] "Failed to get status for pod" podUID="9d747680-5dde-4793-863a-252a5f67233a" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f56868448-2fbxh\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.771439    4730 status_manager.go:851] "Failed to get status for pod" podUID="d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-b75b5f765-8wjw2\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.771756    4730 status_manager.go:851] "Failed to get status for pod" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" pod="openshift-marketplace/redhat-marketplace-2z2hv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2z2hv\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.771975    4730 generic.go:334] "Generic (PLEG): container finished" podID="e48519c7-0cdc-419b-bd72-2bab0e911af8" containerID="d1983b19ac38ba32d4fa20a02bf50a7e57dd7a9e5c61bb3d5cfddbb58ce8788c" exitCode=0
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.772020    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e48519c7-0cdc-419b-bd72-2bab0e911af8","Type":"ContainerDied","Data":"d1983b19ac38ba32d4fa20a02bf50a7e57dd7a9e5c61bb3d5cfddbb58ce8788c"}
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.772125    4730 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.772576    4730 status_manager.go:851] "Failed to get status for pod" podUID="e48519c7-0cdc-419b-bd72-2bab0e911af8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.772887    4730 status_manager.go:851] "Failed to get status for pod" podUID="d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-b75b5f765-8wjw2\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.773075    4730 status_manager.go:851] "Failed to get status for pod" podUID="9d747680-5dde-4793-863a-252a5f67233a" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f56868448-2fbxh\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.773312    4730 status_manager.go:851] "Failed to get status for pod" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" pod="openshift-marketplace/redhat-marketplace-2z2hv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2z2hv\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.773607    4730 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.773989    4730 status_manager.go:851] "Failed to get status for pod" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" pod="openshift-marketplace/certified-operators-rlnqc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rlnqc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.774870    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.776072    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.777286    4730 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad" exitCode=0
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.777319    4730 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf" exitCode=0
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.777331    4730 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b" exitCode=0
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.777409    4730 scope.go:117] "RemoveContainer" containerID="688754d61ab497303640174e7b0a3aab82b31f77e5cc50b6993f85222e3053c5"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.779182    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2" event={"ID":"d4d20fab-86cc-44d8-a8b9-c60f6835c5e0","Type":"ContainerDied","Data":"4e99a4704ba10afb385846d96e0db2ec14bcce1391fc5cd9c5fede455f436bf0"}
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.779201    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.779894    4730 status_manager.go:851] "Failed to get status for pod" podUID="e48519c7-0cdc-419b-bd72-2bab0e911af8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.780108    4730 status_manager.go:851] "Failed to get status for pod" podUID="d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-b75b5f765-8wjw2\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.780406    4730 status_manager.go:851] "Failed to get status for pod" podUID="9d747680-5dde-4793-863a-252a5f67233a" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f56868448-2fbxh\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.780799    4730 status_manager.go:851] "Failed to get status for pod" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" pod="openshift-marketplace/redhat-marketplace-2z2hv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2z2hv\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.781149    4730 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.781443    4730 status_manager.go:851] "Failed to get status for pod" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" pod="openshift-marketplace/certified-operators-rlnqc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rlnqc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.782235    4730 generic.go:334] "Generic (PLEG): container finished" podID="3f093381-3bf4-49ff-beb4-f44aa012c521" containerID="6ba1acd4b6440038c4d2f11f36de1734bab2b24cdd1e2d4018cd0e97b421d598" exitCode=0
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.782349    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567024-s2r9c" event={"ID":"3f093381-3bf4-49ff-beb4-f44aa012c521","Type":"ContainerDied","Data":"6ba1acd4b6440038c4d2f11f36de1734bab2b24cdd1e2d4018cd0e97b421d598"}
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.782807    4730 status_manager.go:851] "Failed to get status for pod" podUID="3f093381-3bf4-49ff-beb4-f44aa012c521" pod="openshift-infra/auto-csr-approver-29567024-s2r9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29567024-s2r9c\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.783350    4730 status_manager.go:851] "Failed to get status for pod" podUID="e48519c7-0cdc-419b-bd72-2bab0e911af8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.783640    4730 status_manager.go:851] "Failed to get status for pod" podUID="9d747680-5dde-4793-863a-252a5f67233a" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f56868448-2fbxh\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.783896    4730 status_manager.go:851] "Failed to get status for pod" podUID="d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-b75b5f765-8wjw2\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.784188    4730 status_manager.go:851] "Failed to get status for pod" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" pod="openshift-marketplace/redhat-marketplace-2z2hv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2z2hv\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.784368    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh" event={"ID":"9d747680-5dde-4793-863a-252a5f67233a","Type":"ContainerDied","Data":"2d0d6a6d61d99c2e38791ba9cd9580b05d3d5ca5c004c4c372aff09d24128220"}
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.784400    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.784500    4730 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.784759    4730 status_manager.go:851] "Failed to get status for pod" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" pod="openshift-marketplace/certified-operators-rlnqc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rlnqc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.785275    4730 status_manager.go:851] "Failed to get status for pod" podUID="3f093381-3bf4-49ff-beb4-f44aa012c521" pod="openshift-infra/auto-csr-approver-29567024-s2r9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29567024-s2r9c\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.785542    4730 status_manager.go:851] "Failed to get status for pod" podUID="e48519c7-0cdc-419b-bd72-2bab0e911af8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.785793    4730 status_manager.go:851] "Failed to get status for pod" podUID="9d747680-5dde-4793-863a-252a5f67233a" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f56868448-2fbxh\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.786447    4730 status_manager.go:851] "Failed to get status for pod" podUID="d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-b75b5f765-8wjw2\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.786750    4730 status_manager.go:851] "Failed to get status for pod" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" pod="openshift-marketplace/redhat-marketplace-2z2hv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2z2hv\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.787041    4730 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.787327    4730 status_manager.go:851] "Failed to get status for pod" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" pod="openshift-marketplace/certified-operators-rlnqc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rlnqc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.796375    4730 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4d20fab-86cc-44d8-a8b9-c60f6835c5e0-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.796412    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4d20fab-86cc-44d8-a8b9-c60f6835c5e0-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.796424    4730 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4d20fab-86cc-44d8-a8b9-c60f6835c5e0-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.796433    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xdnv\" (UniqueName: \"kubernetes.io/projected/d4d20fab-86cc-44d8-a8b9-c60f6835c5e0-kube-api-access-4xdnv\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.806775    4730 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.807308    4730 status_manager.go:851] "Failed to get status for pod" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" pod="openshift-marketplace/certified-operators-rlnqc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rlnqc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.807691    4730 status_manager.go:851] "Failed to get status for pod" podUID="3f093381-3bf4-49ff-beb4-f44aa012c521" pod="openshift-infra/auto-csr-approver-29567024-s2r9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29567024-s2r9c\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.808022    4730 status_manager.go:851] "Failed to get status for pod" podUID="e48519c7-0cdc-419b-bd72-2bab0e911af8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.808324    4730 status_manager.go:851] "Failed to get status for pod" podUID="9d747680-5dde-4793-863a-252a5f67233a" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f56868448-2fbxh\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.808584    4730 status_manager.go:851] "Failed to get status for pod" podUID="d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-b75b5f765-8wjw2\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.808863    4730 status_manager.go:851] "Failed to get status for pod" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" pod="openshift-marketplace/redhat-marketplace-2z2hv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2z2hv\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.809272    4730 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.809527    4730 status_manager.go:851] "Failed to get status for pod" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" pod="openshift-marketplace/certified-operators-rlnqc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rlnqc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.809820    4730 status_manager.go:851] "Failed to get status for pod" podUID="3f093381-3bf4-49ff-beb4-f44aa012c521" pod="openshift-infra/auto-csr-approver-29567024-s2r9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29567024-s2r9c\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.810036    4730 status_manager.go:851] "Failed to get status for pod" podUID="e48519c7-0cdc-419b-bd72-2bab0e911af8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.810404    4730 status_manager.go:851] "Failed to get status for pod" podUID="9d747680-5dde-4793-863a-252a5f67233a" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f56868448-2fbxh\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.810753    4730 status_manager.go:851] "Failed to get status for pod" podUID="d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-b75b5f765-8wjw2\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.811057    4730 status_manager.go:851] "Failed to get status for pod" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" pod="openshift-marketplace/redhat-marketplace-2z2hv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2z2hv\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.812959    4730 scope.go:117] "RemoveContainer" containerID="34175b3ad56804c8ddffbef2e3fafd18e67e87b94833ec1a23054a68de4fe0be"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.828394    4730 scope.go:117] "RemoveContainer" containerID="6a969344117b0e223a95c25576df0115301c9833e8c4f08723a3caa581f16b8b"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.834318    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rlnqc"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.834829    4730 status_manager.go:851] "Failed to get status for pod" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" pod="openshift-marketplace/certified-operators-rlnqc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rlnqc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.835130    4730 status_manager.go:851] "Failed to get status for pod" podUID="3f093381-3bf4-49ff-beb4-f44aa012c521" pod="openshift-infra/auto-csr-approver-29567024-s2r9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29567024-s2r9c\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.835314    4730 status_manager.go:851] "Failed to get status for pod" podUID="e48519c7-0cdc-419b-bd72-2bab0e911af8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.835462    4730 status_manager.go:851] "Failed to get status for pod" podUID="9d747680-5dde-4793-863a-252a5f67233a" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f56868448-2fbxh\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.835606    4730 status_manager.go:851] "Failed to get status for pod" podUID="d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-b75b5f765-8wjw2\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.835753    4730 status_manager.go:851] "Failed to get status for pod" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" pod="openshift-marketplace/redhat-marketplace-2z2hv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2z2hv\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:07 crc kubenswrapper[4730]: I0320 15:44:07.836096    4730 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:08 crc kubenswrapper[4730]: E0320 15:44:08.364518    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:44:08Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:44:08Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:44:08Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:44:08Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4869c69128f74d9c3b178ea6c8c8d38df169b6bce05eb821a65f0aaf514c563a\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:f3c2ad90e251062165f8d6623ca4994c0b3e28324e4b5b17fd588b162ec97766\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1746912226},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:ce2b0bfbec08802afec185b6eebec59a5a016291cad2c3515b0d06af6c34fde3\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:d5656ec3f5691d96c2da35350810e5bd700559213851b28fc3523a059efce76f\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBy
tes\\\":1252734685},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:422eef49f9b56aaa481c870199db8b853dba5d36f00adcc19d22a6960345f1cc\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:667c2f632cea73b8b5354e1fbd365169f285c9b8460c5e81f63967a72b8f90e8\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1223676630},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"
names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840
280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:08 crc kubenswrapper[4730]: E0320 15:44:08.365479    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:08 crc kubenswrapper[4730]: E0320 15:44:08.366329    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:08 crc kubenswrapper[4730]: E0320 15:44:08.366836    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:08 crc kubenswrapper[4730]: E0320 15:44:08.367113    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:08 crc kubenswrapper[4730]: E0320 15:44:08.367140    4730 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.681080    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.682053    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.682702    4730 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.682972    4730 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.683218    4730 status_manager.go:851] "Failed to get status for pod" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" pod="openshift-marketplace/certified-operators-rlnqc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rlnqc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.683452    4730 status_manager.go:851] "Failed to get status for pod" podUID="3f093381-3bf4-49ff-beb4-f44aa012c521" pod="openshift-infra/auto-csr-approver-29567024-s2r9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29567024-s2r9c\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.683690    4730 status_manager.go:851] "Failed to get status for pod" podUID="e48519c7-0cdc-419b-bd72-2bab0e911af8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.683906    4730 status_manager.go:851] "Failed to get status for pod" podUID="9d747680-5dde-4793-863a-252a5f67233a" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f56868448-2fbxh\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.684100    4730 status_manager.go:851] "Failed to get status for pod" podUID="d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-b75b5f765-8wjw2\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.684327    4730 status_manager.go:851] "Failed to get status for pod" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" pod="openshift-marketplace/redhat-marketplace-2z2hv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2z2hv\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.793973    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.795293    4730 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4" exitCode=0
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.795455    4730 scope.go:117] "RemoveContainer" containerID="b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad"
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.795560    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.809208    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.809264    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.809296    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.809386    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.809382    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.809459    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.809775    4730 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.809801    4730 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.809829    4730 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.814173    4730 scope.go:117] "RemoveContainer" containerID="1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf"
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.825806    4730 scope.go:117] "RemoveContainer" containerID="5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b"
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.838299    4730 scope.go:117] "RemoveContainer" containerID="180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007"
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.853924    4730 scope.go:117] "RemoveContainer" containerID="2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4"
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.877617    4730 scope.go:117] "RemoveContainer" containerID="e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e"
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.898589    4730 scope.go:117] "RemoveContainer" containerID="b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad"
Mar 20 15:44:08 crc kubenswrapper[4730]: E0320 15:44:08.903755    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad\": container with ID starting with b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad not found: ID does not exist" containerID="b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad"
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.903788    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad"} err="failed to get container status \"b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad\": rpc error: code = NotFound desc = could not find container \"b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad\": container with ID starting with b137ba07c1866334b1897d83f9590d1627b5724b950efaef0bbda497186fe1ad not found: ID does not exist"
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.903826    4730 scope.go:117] "RemoveContainer" containerID="1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf"
Mar 20 15:44:08 crc kubenswrapper[4730]: E0320 15:44:08.904848    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\": container with ID starting with 1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf not found: ID does not exist" containerID="1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf"
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.904889    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf"} err="failed to get container status \"1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\": rpc error: code = NotFound desc = could not find container \"1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf\": container with ID starting with 1f8e7f1bfc56e54242e12f30487e8120eeb3d85dc86cfc17b587255889bd60bf not found: ID does not exist"
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.904904    4730 scope.go:117] "RemoveContainer" containerID="5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b"
Mar 20 15:44:08 crc kubenswrapper[4730]: E0320 15:44:08.909445    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\": container with ID starting with 5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b not found: ID does not exist" containerID="5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b"
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.909541    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b"} err="failed to get container status \"5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\": rpc error: code = NotFound desc = could not find container \"5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b\": container with ID starting with 5d96c704317c9bfbcb499beae611ca6fcb6337084728df20ad791267de34359b not found: ID does not exist"
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.909566    4730 scope.go:117] "RemoveContainer" containerID="180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007"
Mar 20 15:44:08 crc kubenswrapper[4730]: E0320 15:44:08.910226    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\": container with ID starting with 180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007 not found: ID does not exist" containerID="180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007"
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.910292    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007"} err="failed to get container status \"180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\": rpc error: code = NotFound desc = could not find container \"180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007\": container with ID starting with 180d79b8131e00449715e74d883031078dcceb74ca44ea372b6c3f651d617007 not found: ID does not exist"
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.910331    4730 scope.go:117] "RemoveContainer" containerID="2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4"
Mar 20 15:44:08 crc kubenswrapper[4730]: E0320 15:44:08.910697    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\": container with ID starting with 2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4 not found: ID does not exist" containerID="2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4"
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.910741    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4"} err="failed to get container status \"2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\": rpc error: code = NotFound desc = could not find container \"2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4\": container with ID starting with 2660f772d76479685f586aeb9b242975695ac5e28668840bac6b8a75c09443a4 not found: ID does not exist"
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.910756    4730 scope.go:117] "RemoveContainer" containerID="e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e"
Mar 20 15:44:08 crc kubenswrapper[4730]: E0320 15:44:08.911080    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\": container with ID starting with e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e not found: ID does not exist" containerID="e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e"
Mar 20 15:44:08 crc kubenswrapper[4730]: I0320 15:44:08.911141    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e"} err="failed to get container status \"e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\": rpc error: code = NotFound desc = could not find container \"e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e\": container with ID starting with e683393c25a03a11f7a3e2e4bc5231217db3fa7cd807971f61734acea801ca9e not found: ID does not exist"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.023763    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.024273    4730 status_manager.go:851] "Failed to get status for pod" podUID="e48519c7-0cdc-419b-bd72-2bab0e911af8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.024690    4730 status_manager.go:851] "Failed to get status for pod" podUID="9d747680-5dde-4793-863a-252a5f67233a" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f56868448-2fbxh\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.025087    4730 status_manager.go:851] "Failed to get status for pod" podUID="d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-b75b5f765-8wjw2\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.025442    4730 status_manager.go:851] "Failed to get status for pod" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" pod="openshift-marketplace/redhat-marketplace-2z2hv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2z2hv\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.025690    4730 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.025990    4730 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.026184    4730 status_manager.go:851] "Failed to get status for pod" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" pod="openshift-marketplace/certified-operators-rlnqc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rlnqc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.026467    4730 status_manager.go:851] "Failed to get status for pod" podUID="3f093381-3bf4-49ff-beb4-f44aa012c521" pod="openshift-infra/auto-csr-approver-29567024-s2r9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29567024-s2r9c\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.027786    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567024-s2r9c"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.028140    4730 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.028346    4730 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.028593    4730 status_manager.go:851] "Failed to get status for pod" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" pod="openshift-marketplace/certified-operators-rlnqc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rlnqc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.028845    4730 status_manager.go:851] "Failed to get status for pod" podUID="3f093381-3bf4-49ff-beb4-f44aa012c521" pod="openshift-infra/auto-csr-approver-29567024-s2r9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29567024-s2r9c\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.029090    4730 status_manager.go:851] "Failed to get status for pod" podUID="e48519c7-0cdc-419b-bd72-2bab0e911af8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.029360    4730 status_manager.go:851] "Failed to get status for pod" podUID="d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-b75b5f765-8wjw2\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.029596    4730 status_manager.go:851] "Failed to get status for pod" podUID="9d747680-5dde-4793-863a-252a5f67233a" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f56868448-2fbxh\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.029879    4730 status_manager.go:851] "Failed to get status for pod" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" pod="openshift-marketplace/redhat-marketplace-2z2hv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2z2hv\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.109701    4730 status_manager.go:851] "Failed to get status for pod" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" pod="openshift-marketplace/redhat-marketplace-2z2hv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2z2hv\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.110059    4730 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.110207    4730 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.110467    4730 status_manager.go:851] "Failed to get status for pod" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" pod="openshift-marketplace/certified-operators-rlnqc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rlnqc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.110671    4730 status_manager.go:851] "Failed to get status for pod" podUID="3f093381-3bf4-49ff-beb4-f44aa012c521" pod="openshift-infra/auto-csr-approver-29567024-s2r9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29567024-s2r9c\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.110845    4730 status_manager.go:851] "Failed to get status for pod" podUID="e48519c7-0cdc-419b-bd72-2bab0e911af8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.111001    4730 status_manager.go:851] "Failed to get status for pod" podUID="d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-b75b5f765-8wjw2\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.111150    4730 status_manager.go:851] "Failed to get status for pod" podUID="9d747680-5dde-4793-863a-252a5f67233a" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f56868448-2fbxh\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.215768    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e48519c7-0cdc-419b-bd72-2bab0e911af8-kubelet-dir\") pod \"e48519c7-0cdc-419b-bd72-2bab0e911af8\" (UID: \"e48519c7-0cdc-419b-bd72-2bab0e911af8\") "
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.215951    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e48519c7-0cdc-419b-bd72-2bab0e911af8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e48519c7-0cdc-419b-bd72-2bab0e911af8" (UID: "e48519c7-0cdc-419b-bd72-2bab0e911af8"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.216027    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e48519c7-0cdc-419b-bd72-2bab0e911af8-var-lock\") pod \"e48519c7-0cdc-419b-bd72-2bab0e911af8\" (UID: \"e48519c7-0cdc-419b-bd72-2bab0e911af8\") "
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.216110    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e48519c7-0cdc-419b-bd72-2bab0e911af8-var-lock" (OuterVolumeSpecName: "var-lock") pod "e48519c7-0cdc-419b-bd72-2bab0e911af8" (UID: "e48519c7-0cdc-419b-bd72-2bab0e911af8"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.216113    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdc6x\" (UniqueName: \"kubernetes.io/projected/3f093381-3bf4-49ff-beb4-f44aa012c521-kube-api-access-kdc6x\") pod \"3f093381-3bf4-49ff-beb4-f44aa012c521\" (UID: \"3f093381-3bf4-49ff-beb4-f44aa012c521\") "
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.216333    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e48519c7-0cdc-419b-bd72-2bab0e911af8-kube-api-access\") pod \"e48519c7-0cdc-419b-bd72-2bab0e911af8\" (UID: \"e48519c7-0cdc-419b-bd72-2bab0e911af8\") "
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.216902    4730 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e48519c7-0cdc-419b-bd72-2bab0e911af8-var-lock\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.216933    4730 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e48519c7-0cdc-419b-bd72-2bab0e911af8-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.222052    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e48519c7-0cdc-419b-bd72-2bab0e911af8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e48519c7-0cdc-419b-bd72-2bab0e911af8" (UID: "e48519c7-0cdc-419b-bd72-2bab0e911af8"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.223129    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f093381-3bf4-49ff-beb4-f44aa012c521-kube-api-access-kdc6x" (OuterVolumeSpecName: "kube-api-access-kdc6x") pod "3f093381-3bf4-49ff-beb4-f44aa012c521" (UID: "3f093381-3bf4-49ff-beb4-f44aa012c521"). InnerVolumeSpecName "kube-api-access-kdc6x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.318088    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e48519c7-0cdc-419b-bd72-2bab0e911af8-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.318128    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdc6x\" (UniqueName: \"kubernetes.io/projected/3f093381-3bf4-49ff-beb4-f44aa012c521-kube-api-access-kdc6x\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.540537    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.801955    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.802012    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e48519c7-0cdc-419b-bd72-2bab0e911af8","Type":"ContainerDied","Data":"5af4b462745de074ee5968bfbd84bbf7129ced0fcb060ef525aacd425c95e3c1"}
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.802676    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5af4b462745de074ee5968bfbd84bbf7129ced0fcb060ef525aacd425c95e3c1"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.804727    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567024-s2r9c" event={"ID":"3f093381-3bf4-49ff-beb4-f44aa012c521","Type":"ContainerDied","Data":"ae9eb78df15bf57e2c0e8bf27ad609713813b7d995be50840b72aa366815e8fc"}
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.804760    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae9eb78df15bf57e2c0e8bf27ad609713813b7d995be50840b72aa366815e8fc"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.804818    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567024-s2r9c"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.807795    4730 status_manager.go:851] "Failed to get status for pod" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" pod="openshift-marketplace/certified-operators-rlnqc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rlnqc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.808086    4730 status_manager.go:851] "Failed to get status for pod" podUID="3f093381-3bf4-49ff-beb4-f44aa012c521" pod="openshift-infra/auto-csr-approver-29567024-s2r9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29567024-s2r9c\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.808411    4730 status_manager.go:851] "Failed to get status for pod" podUID="e48519c7-0cdc-419b-bd72-2bab0e911af8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.808620    4730 status_manager.go:851] "Failed to get status for pod" podUID="9d747680-5dde-4793-863a-252a5f67233a" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f56868448-2fbxh\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.808907    4730 status_manager.go:851] "Failed to get status for pod" podUID="d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-b75b5f765-8wjw2\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.809158    4730 status_manager.go:851] "Failed to get status for pod" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" pod="openshift-marketplace/redhat-marketplace-2z2hv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2z2hv\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.809371    4730 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.809677    4730 status_manager.go:851] "Failed to get status for pod" podUID="e48519c7-0cdc-419b-bd72-2bab0e911af8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.809906    4730 status_manager.go:851] "Failed to get status for pod" podUID="9d747680-5dde-4793-863a-252a5f67233a" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f56868448-2fbxh\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.810126    4730 status_manager.go:851] "Failed to get status for pod" podUID="d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-b75b5f765-8wjw2\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.810339    4730 status_manager.go:851] "Failed to get status for pod" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" pod="openshift-marketplace/redhat-marketplace-2z2hv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2z2hv\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.810506    4730 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.810760    4730 status_manager.go:851] "Failed to get status for pod" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" pod="openshift-marketplace/certified-operators-rlnqc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rlnqc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:09 crc kubenswrapper[4730]: I0320 15:44:09.811001    4730 status_manager.go:851] "Failed to get status for pod" podUID="3f093381-3bf4-49ff-beb4-f44aa012c521" pod="openshift-infra/auto-csr-approver-29567024-s2r9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29567024-s2r9c\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:10 crc kubenswrapper[4730]: I0320 15:44:10.542009    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8rptq"
Mar 20 15:44:10 crc kubenswrapper[4730]: I0320 15:44:10.542617    4730 status_manager.go:851] "Failed to get status for pod" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" pod="openshift-marketplace/redhat-marketplace-2z2hv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2z2hv\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:10 crc kubenswrapper[4730]: I0320 15:44:10.543221    4730 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:10 crc kubenswrapper[4730]: I0320 15:44:10.543616    4730 status_manager.go:851] "Failed to get status for pod" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" pod="openshift-marketplace/certified-operators-rlnqc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rlnqc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:10 crc kubenswrapper[4730]: I0320 15:44:10.544022    4730 status_manager.go:851] "Failed to get status for pod" podUID="3f093381-3bf4-49ff-beb4-f44aa012c521" pod="openshift-infra/auto-csr-approver-29567024-s2r9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29567024-s2r9c\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:10 crc kubenswrapper[4730]: I0320 15:44:10.544209    4730 status_manager.go:851] "Failed to get status for pod" podUID="e48519c7-0cdc-419b-bd72-2bab0e911af8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:10 crc kubenswrapper[4730]: I0320 15:44:10.544396    4730 status_manager.go:851] "Failed to get status for pod" podUID="558b00fd-2589-4842-8cba-db0cffe8c826" pod="openshift-marketplace/redhat-operators-8rptq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-8rptq\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:10 crc kubenswrapper[4730]: I0320 15:44:10.544557    4730 status_manager.go:851] "Failed to get status for pod" podUID="9d747680-5dde-4793-863a-252a5f67233a" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f56868448-2fbxh\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:10 crc kubenswrapper[4730]: I0320 15:44:10.544716    4730 status_manager.go:851] "Failed to get status for pod" podUID="d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-b75b5f765-8wjw2\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:10 crc kubenswrapper[4730]: I0320 15:44:10.579183    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8rptq"
Mar 20 15:44:10 crc kubenswrapper[4730]: I0320 15:44:10.579889    4730 status_manager.go:851] "Failed to get status for pod" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" pod="openshift-marketplace/certified-operators-rlnqc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rlnqc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:10 crc kubenswrapper[4730]: I0320 15:44:10.580342    4730 status_manager.go:851] "Failed to get status for pod" podUID="3f093381-3bf4-49ff-beb4-f44aa012c521" pod="openshift-infra/auto-csr-approver-29567024-s2r9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29567024-s2r9c\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:10 crc kubenswrapper[4730]: I0320 15:44:10.580593    4730 status_manager.go:851] "Failed to get status for pod" podUID="e48519c7-0cdc-419b-bd72-2bab0e911af8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:10 crc kubenswrapper[4730]: I0320 15:44:10.580832    4730 status_manager.go:851] "Failed to get status for pod" podUID="558b00fd-2589-4842-8cba-db0cffe8c826" pod="openshift-marketplace/redhat-operators-8rptq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-8rptq\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:10 crc kubenswrapper[4730]: I0320 15:44:10.581067    4730 status_manager.go:851] "Failed to get status for pod" podUID="9d747680-5dde-4793-863a-252a5f67233a" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f56868448-2fbxh\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:10 crc kubenswrapper[4730]: I0320 15:44:10.581329    4730 status_manager.go:851] "Failed to get status for pod" podUID="d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-b75b5f765-8wjw2\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:10 crc kubenswrapper[4730]: I0320 15:44:10.581539    4730 status_manager.go:851] "Failed to get status for pod" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" pod="openshift-marketplace/redhat-marketplace-2z2hv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2z2hv\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:10 crc kubenswrapper[4730]: I0320 15:44:10.581741    4730 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:11 crc kubenswrapper[4730]: I0320 15:44:11.009162    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qmxvf"
Mar 20 15:44:11 crc kubenswrapper[4730]: I0320 15:44:11.009778    4730 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:11 crc kubenswrapper[4730]: I0320 15:44:11.010165    4730 status_manager.go:851] "Failed to get status for pod" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" pod="openshift-marketplace/certified-operators-rlnqc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rlnqc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:11 crc kubenswrapper[4730]: I0320 15:44:11.010468    4730 status_manager.go:851] "Failed to get status for pod" podUID="3f093381-3bf4-49ff-beb4-f44aa012c521" pod="openshift-infra/auto-csr-approver-29567024-s2r9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29567024-s2r9c\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:11 crc kubenswrapper[4730]: I0320 15:44:11.010714    4730 status_manager.go:851] "Failed to get status for pod" podUID="ab6c90a0-1bc1-476d-8526-d1fe438163e3" pod="openshift-marketplace/redhat-operators-qmxvf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qmxvf\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:11 crc kubenswrapper[4730]: I0320 15:44:11.010977    4730 status_manager.go:851] "Failed to get status for pod" podUID="e48519c7-0cdc-419b-bd72-2bab0e911af8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:11 crc kubenswrapper[4730]: I0320 15:44:11.011208    4730 status_manager.go:851] "Failed to get status for pod" podUID="558b00fd-2589-4842-8cba-db0cffe8c826" pod="openshift-marketplace/redhat-operators-8rptq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-8rptq\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:11 crc kubenswrapper[4730]: I0320 15:44:11.011459    4730 status_manager.go:851] "Failed to get status for pod" podUID="9d747680-5dde-4793-863a-252a5f67233a" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f56868448-2fbxh\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:11 crc kubenswrapper[4730]: I0320 15:44:11.011781    4730 status_manager.go:851] "Failed to get status for pod" podUID="d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-b75b5f765-8wjw2\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:11 crc kubenswrapper[4730]: I0320 15:44:11.012172    4730 status_manager.go:851] "Failed to get status for pod" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" pod="openshift-marketplace/redhat-marketplace-2z2hv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2z2hv\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:11 crc kubenswrapper[4730]: I0320 15:44:11.046852    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qmxvf"
Mar 20 15:44:11 crc kubenswrapper[4730]: I0320 15:44:11.047656    4730 status_manager.go:851] "Failed to get status for pod" podUID="3f093381-3bf4-49ff-beb4-f44aa012c521" pod="openshift-infra/auto-csr-approver-29567024-s2r9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29567024-s2r9c\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:11 crc kubenswrapper[4730]: I0320 15:44:11.048027    4730 status_manager.go:851] "Failed to get status for pod" podUID="e48519c7-0cdc-419b-bd72-2bab0e911af8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:11 crc kubenswrapper[4730]: I0320 15:44:11.048553    4730 status_manager.go:851] "Failed to get status for pod" podUID="ab6c90a0-1bc1-476d-8526-d1fe438163e3" pod="openshift-marketplace/redhat-operators-qmxvf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qmxvf\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:11 crc kubenswrapper[4730]: I0320 15:44:11.048877    4730 status_manager.go:851] "Failed to get status for pod" podUID="558b00fd-2589-4842-8cba-db0cffe8c826" pod="openshift-marketplace/redhat-operators-8rptq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-8rptq\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:11 crc kubenswrapper[4730]: I0320 15:44:11.049219    4730 status_manager.go:851] "Failed to get status for pod" podUID="9d747680-5dde-4793-863a-252a5f67233a" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f56868448-2fbxh\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:11 crc kubenswrapper[4730]: I0320 15:44:11.049502    4730 status_manager.go:851] "Failed to get status for pod" podUID="d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-b75b5f765-8wjw2\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:11 crc kubenswrapper[4730]: I0320 15:44:11.049754    4730 status_manager.go:851] "Failed to get status for pod" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" pod="openshift-marketplace/redhat-marketplace-2z2hv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2z2hv\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:11 crc kubenswrapper[4730]: I0320 15:44:11.049937    4730 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:11 crc kubenswrapper[4730]: I0320 15:44:11.050192    4730 status_manager.go:851] "Failed to get status for pod" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" pod="openshift-marketplace/certified-operators-rlnqc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rlnqc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:11 crc kubenswrapper[4730]: I0320 15:44:11.989229    4730 status_manager.go:851] "Failed to get status for pod" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" pod="openshift-marketplace/redhat-marketplace-2z2hv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2z2hv\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:11 crc kubenswrapper[4730]: I0320 15:44:11.993871    4730 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:11 crc kubenswrapper[4730]: I0320 15:44:11.994167    4730 status_manager.go:851] "Failed to get status for pod" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" pod="openshift-marketplace/certified-operators-rlnqc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rlnqc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:11 crc kubenswrapper[4730]: I0320 15:44:11.994401    4730 status_manager.go:851] "Failed to get status for pod" podUID="3f093381-3bf4-49ff-beb4-f44aa012c521" pod="openshift-infra/auto-csr-approver-29567024-s2r9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29567024-s2r9c\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:11 crc kubenswrapper[4730]: I0320 15:44:11.994614    4730 status_manager.go:851] "Failed to get status for pod" podUID="e48519c7-0cdc-419b-bd72-2bab0e911af8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:11 crc kubenswrapper[4730]: I0320 15:44:11.994930    4730 status_manager.go:851] "Failed to get status for pod" podUID="ab6c90a0-1bc1-476d-8526-d1fe438163e3" pod="openshift-marketplace/redhat-operators-qmxvf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qmxvf\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:11 crc kubenswrapper[4730]: I0320 15:44:11.995473    4730 status_manager.go:851] "Failed to get status for pod" podUID="558b00fd-2589-4842-8cba-db0cffe8c826" pod="openshift-marketplace/redhat-operators-8rptq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-8rptq\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:12 crc kubenswrapper[4730]: I0320 15:44:12.002297    4730 status_manager.go:851] "Failed to get status for pod" podUID="9d747680-5dde-4793-863a-252a5f67233a" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f56868448-2fbxh\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:12 crc kubenswrapper[4730]: I0320 15:44:12.002669    4730 status_manager.go:851] "Failed to get status for pod" podUID="d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-b75b5f765-8wjw2\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:12 crc kubenswrapper[4730]: I0320 15:44:12.879651    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 15:44:12 crc kubenswrapper[4730]: I0320 15:44:12.879725    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 15:44:12 crc kubenswrapper[4730]: I0320 15:44:12.879789    4730 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf"
Mar 20 15:44:12 crc kubenswrapper[4730]: I0320 15:44:12.880510    4730 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cd661cda796a2bd61d1446bee672c4471d60370245caf7cb54faf54dfa9c58a0"} pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 15:44:12 crc kubenswrapper[4730]: I0320 15:44:12.880568    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" containerID="cri-o://cd661cda796a2bd61d1446bee672c4471d60370245caf7cb54faf54dfa9c58a0" gracePeriod=600
Mar 20 15:44:13 crc kubenswrapper[4730]: I0320 15:44:13.008571    4730 generic.go:334] "Generic (PLEG): container finished" podID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerID="cd661cda796a2bd61d1446bee672c4471d60370245caf7cb54faf54dfa9c58a0" exitCode=0
Mar 20 15:44:13 crc kubenswrapper[4730]: I0320 15:44:13.008628    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerDied","Data":"cd661cda796a2bd61d1446bee672c4471d60370245caf7cb54faf54dfa9c58a0"}
Mar 20 15:44:14 crc kubenswrapper[4730]: I0320 15:44:14.018339    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerStarted","Data":"418b64bd31efa72e03b6036c281348bfc6e1d5be086f3887fe653df9e0316583"}
Mar 20 15:44:14 crc kubenswrapper[4730]: I0320 15:44:14.019075    4730 status_manager.go:851] "Failed to get status for pod" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" pod="openshift-marketplace/certified-operators-rlnqc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rlnqc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:14 crc kubenswrapper[4730]: I0320 15:44:14.019505    4730 status_manager.go:851] "Failed to get status for pod" podUID="3f093381-3bf4-49ff-beb4-f44aa012c521" pod="openshift-infra/auto-csr-approver-29567024-s2r9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29567024-s2r9c\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:14 crc kubenswrapper[4730]: I0320 15:44:14.019886    4730 status_manager.go:851] "Failed to get status for pod" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-p5qvf\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:14 crc kubenswrapper[4730]: I0320 15:44:14.020176    4730 status_manager.go:851] "Failed to get status for pod" podUID="ab6c90a0-1bc1-476d-8526-d1fe438163e3" pod="openshift-marketplace/redhat-operators-qmxvf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qmxvf\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:14 crc kubenswrapper[4730]: I0320 15:44:14.020449    4730 status_manager.go:851] "Failed to get status for pod" podUID="e48519c7-0cdc-419b-bd72-2bab0e911af8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:14 crc kubenswrapper[4730]: I0320 15:44:14.020757    4730 status_manager.go:851] "Failed to get status for pod" podUID="558b00fd-2589-4842-8cba-db0cffe8c826" pod="openshift-marketplace/redhat-operators-8rptq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-8rptq\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:14 crc kubenswrapper[4730]: I0320 15:44:14.021003    4730 status_manager.go:851] "Failed to get status for pod" podUID="9d747680-5dde-4793-863a-252a5f67233a" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f56868448-2fbxh\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:14 crc kubenswrapper[4730]: I0320 15:44:14.021261    4730 status_manager.go:851] "Failed to get status for pod" podUID="d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-b75b5f765-8wjw2\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:14 crc kubenswrapper[4730]: I0320 15:44:14.021616    4730 status_manager.go:851] "Failed to get status for pod" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" pod="openshift-marketplace/redhat-marketplace-2z2hv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2z2hv\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:14 crc kubenswrapper[4730]: I0320 15:44:14.021857    4730 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:15 crc kubenswrapper[4730]: E0320 15:44:15.939498    4730 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:15 crc kubenswrapper[4730]: E0320 15:44:15.939953    4730 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:15 crc kubenswrapper[4730]: E0320 15:44:15.940603    4730 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:15 crc kubenswrapper[4730]: E0320 15:44:15.940974    4730 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:15 crc kubenswrapper[4730]: E0320 15:44:15.941460    4730 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:15 crc kubenswrapper[4730]: I0320 15:44:15.941506    4730 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Mar 20 15:44:15 crc kubenswrapper[4730]: E0320 15:44:15.941974    4730 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="200ms"
Mar 20 15:44:16 crc kubenswrapper[4730]: E0320 15:44:16.142675    4730 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="400ms"
Mar 20 15:44:16 crc kubenswrapper[4730]: E0320 15:44:16.544300    4730 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="800ms"
Mar 20 15:44:17 crc kubenswrapper[4730]: E0320 15:44:17.345081    4730 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="1.6s"
Mar 20 15:44:17 crc kubenswrapper[4730]: E0320 15:44:17.555942    4730 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.162:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6" volumeName="registry-storage"
Mar 20 15:44:17 crc kubenswrapper[4730]: E0320 15:44:17.699625    4730 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/events\": dial tcp 38.102.83.162:6443: connect: connection refused" event=<
Mar 20 15:44:17 crc kubenswrapper[4730]:         &Event{ObjectMeta:{controller-manager-6f56868448-2fbxh.189e971b4120d4fc  openshift-controller-manager    0 0001-01-01 00:00:00 +0000 UTC   map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-controller-manager,Name:controller-manager-6f56868448-2fbxh,UID:9d747680-5dde-4793-863a-252a5f67233a,APIVersion:v1,ResourceVersion:29368,FieldPath:spec.containers{controller-manager},},Reason:ProbeError,Message:Readiness probe error: Get "https://10.217.0.61:8443/healthz": dial tcp 10.217.0.61:8443: connect: connection refused
Mar 20 15:44:17 crc kubenswrapper[4730]:         body:
Mar 20 15:44:17 crc kubenswrapper[4730]:         ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 15:44:06.543135996 +0000 UTC m=+305.756507365,LastTimestamp:2026-03-20 15:44:06.543135996 +0000 UTC m=+305.756507365,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 20 15:44:17 crc kubenswrapper[4730]:  >
Mar 20 15:44:18 crc kubenswrapper[4730]: E0320 15:44:18.640724    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:44:18Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:44:18Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:44:18Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T15:44:18Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4869c69128f74d9c3b178ea6c8c8d38df169b6bce05eb821a65f0aaf514c563a\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:f3c2ad90e251062165f8d6623ca4994c0b3e28324e4b5b17fd588b162ec97766\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1746912226},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:ce2b0bfbec08802afec185b6eebec59a5a016291cad2c3515b0d06af6c34fde3\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:d5656ec3f5691d96c2da35350810e5bd700559213851b28fc3523a059efce76f\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1252734685},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:422eef49f9b56aaa481c870199db8b853dba5d36f00adcc19d22a6960345f1cc\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:667c2f632cea73b8b5354e1fbd365169f285c9b8460c5e81f63967a72b8f90e8\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1223676630},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:18 crc kubenswrapper[4730]: E0320 15:44:18.641138    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:18 crc kubenswrapper[4730]: E0320 15:44:18.641383    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:18 crc kubenswrapper[4730]: E0320 15:44:18.641553    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:18 crc kubenswrapper[4730]: E0320 15:44:18.641896    4730 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:18 crc kubenswrapper[4730]: E0320 15:44:18.641926    4730 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 20 15:44:18 crc kubenswrapper[4730]: E0320 15:44:18.945828    4730 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.162:6443: connect: connection refused" interval="3.2s"
Mar 20 15:44:19 crc kubenswrapper[4730]: I0320 15:44:19.533094    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:44:19 crc kubenswrapper[4730]: I0320 15:44:19.534801    4730 status_manager.go:851] "Failed to get status for pod" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" pod="openshift-marketplace/certified-operators-rlnqc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rlnqc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:19 crc kubenswrapper[4730]: I0320 15:44:19.535527    4730 status_manager.go:851] "Failed to get status for pod" podUID="3f093381-3bf4-49ff-beb4-f44aa012c521" pod="openshift-infra/auto-csr-approver-29567024-s2r9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29567024-s2r9c\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:19 crc kubenswrapper[4730]: I0320 15:44:19.535949    4730 status_manager.go:851] "Failed to get status for pod" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-p5qvf\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:19 crc kubenswrapper[4730]: I0320 15:44:19.536422    4730 status_manager.go:851] "Failed to get status for pod" podUID="ab6c90a0-1bc1-476d-8526-d1fe438163e3" pod="openshift-marketplace/redhat-operators-qmxvf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qmxvf\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:19 crc kubenswrapper[4730]: I0320 15:44:19.536821    4730 status_manager.go:851] "Failed to get status for pod" podUID="e48519c7-0cdc-419b-bd72-2bab0e911af8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:19 crc kubenswrapper[4730]: I0320 15:44:19.537302    4730 status_manager.go:851] "Failed to get status for pod" podUID="558b00fd-2589-4842-8cba-db0cffe8c826" pod="openshift-marketplace/redhat-operators-8rptq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-8rptq\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:19 crc kubenswrapper[4730]: I0320 15:44:19.537715    4730 status_manager.go:851] "Failed to get status for pod" podUID="9d747680-5dde-4793-863a-252a5f67233a" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f56868448-2fbxh\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:19 crc kubenswrapper[4730]: I0320 15:44:19.538133    4730 status_manager.go:851] "Failed to get status for pod" podUID="d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-b75b5f765-8wjw2\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:19 crc kubenswrapper[4730]: I0320 15:44:19.538617    4730 status_manager.go:851] "Failed to get status for pod" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" pod="openshift-marketplace/redhat-marketplace-2z2hv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2z2hv\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:19 crc kubenswrapper[4730]: I0320 15:44:19.539672    4730 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:19 crc kubenswrapper[4730]: I0320 15:44:19.554340    4730 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d5c3fe2f-3c67-4dee-becb-3ecfe2758384"
Mar 20 15:44:19 crc kubenswrapper[4730]: I0320 15:44:19.554402    4730 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d5c3fe2f-3c67-4dee-becb-3ecfe2758384"
Mar 20 15:44:19 crc kubenswrapper[4730]: E0320 15:44:19.555070    4730 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:44:19 crc kubenswrapper[4730]: I0320 15:44:19.555874    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:44:19 crc kubenswrapper[4730]: W0320 15:44:19.587110    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-423757f9018398c0322c14eda69c0767c93dd2116d3b88327e7c8f1d21a1bb58 WatchSource:0}: Error finding container 423757f9018398c0322c14eda69c0767c93dd2116d3b88327e7c8f1d21a1bb58: Status 404 returned error can't find the container with id 423757f9018398c0322c14eda69c0767c93dd2116d3b88327e7c8f1d21a1bb58
Mar 20 15:44:20 crc kubenswrapper[4730]: I0320 15:44:20.049498    4730 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="773693b0b72251a6e1ea08a21557ab2598e497198dbc99a6932258ce18180350" exitCode=0
Mar 20 15:44:20 crc kubenswrapper[4730]: I0320 15:44:20.049580    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"773693b0b72251a6e1ea08a21557ab2598e497198dbc99a6932258ce18180350"}
Mar 20 15:44:20 crc kubenswrapper[4730]: I0320 15:44:20.049847    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"423757f9018398c0322c14eda69c0767c93dd2116d3b88327e7c8f1d21a1bb58"}
Mar 20 15:44:20 crc kubenswrapper[4730]: I0320 15:44:20.050188    4730 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d5c3fe2f-3c67-4dee-becb-3ecfe2758384"
Mar 20 15:44:20 crc kubenswrapper[4730]: I0320 15:44:20.050205    4730 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d5c3fe2f-3c67-4dee-becb-3ecfe2758384"
Mar 20 15:44:20 crc kubenswrapper[4730]: I0320 15:44:20.050648    4730 status_manager.go:851] "Failed to get status for pod" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" pod="openshift-marketplace/redhat-marketplace-2z2hv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-2z2hv\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:20 crc kubenswrapper[4730]: E0320 15:44:20.050824    4730 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.162:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:44:20 crc kubenswrapper[4730]: I0320 15:44:20.050970    4730 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:20 crc kubenswrapper[4730]: I0320 15:44:20.051416    4730 status_manager.go:851] "Failed to get status for pod" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" pod="openshift-marketplace/certified-operators-rlnqc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-rlnqc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:20 crc kubenswrapper[4730]: I0320 15:44:20.052063    4730 status_manager.go:851] "Failed to get status for pod" podUID="3f093381-3bf4-49ff-beb4-f44aa012c521" pod="openshift-infra/auto-csr-approver-29567024-s2r9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29567024-s2r9c\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:20 crc kubenswrapper[4730]: I0320 15:44:20.052369    4730 status_manager.go:851] "Failed to get status for pod" podUID="ab6c90a0-1bc1-476d-8526-d1fe438163e3" pod="openshift-marketplace/redhat-operators-qmxvf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-qmxvf\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:20 crc kubenswrapper[4730]: I0320 15:44:20.052766    4730 status_manager.go:851] "Failed to get status for pod" podUID="e48519c7-0cdc-419b-bd72-2bab0e911af8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:20 crc kubenswrapper[4730]: I0320 15:44:20.053367    4730 status_manager.go:851] "Failed to get status for pod" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-p5qvf\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:20 crc kubenswrapper[4730]: I0320 15:44:20.053724    4730 status_manager.go:851] "Failed to get status for pod" podUID="558b00fd-2589-4842-8cba-db0cffe8c826" pod="openshift-marketplace/redhat-operators-8rptq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-8rptq\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:20 crc kubenswrapper[4730]: I0320 15:44:20.054155    4730 status_manager.go:851] "Failed to get status for pod" podUID="9d747680-5dde-4793-863a-252a5f67233a" pod="openshift-controller-manager/controller-manager-6f56868448-2fbxh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f56868448-2fbxh\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:20 crc kubenswrapper[4730]: I0320 15:44:20.054452    4730 status_manager.go:851] "Failed to get status for pod" podUID="d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" pod="openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-b75b5f765-8wjw2\": dial tcp 38.102.83.162:6443: connect: connection refused"
Mar 20 15:44:20 crc kubenswrapper[4730]: I0320 15:44:20.890024    4730 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Mar 20 15:44:20 crc kubenswrapper[4730]: I0320 15:44:20.890084    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Mar 20 15:44:21 crc kubenswrapper[4730]: I0320 15:44:21.058468    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 20 15:44:21 crc kubenswrapper[4730]: I0320 15:44:21.059022    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 20 15:44:21 crc kubenswrapper[4730]: I0320 15:44:21.059067    4730 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="1aee2dcf43ecf6df4a1615aa6e468921053ccb529d3c6dbc2c2ad641e264e606" exitCode=1
Mar 20 15:44:21 crc kubenswrapper[4730]: I0320 15:44:21.059100    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"1aee2dcf43ecf6df4a1615aa6e468921053ccb529d3c6dbc2c2ad641e264e606"}
Mar 20 15:44:21 crc kubenswrapper[4730]: I0320 15:44:21.059852    4730 scope.go:117] "RemoveContainer" containerID="1aee2dcf43ecf6df4a1615aa6e468921053ccb529d3c6dbc2c2ad641e264e606"
Mar 20 15:44:21 crc kubenswrapper[4730]: I0320 15:44:21.062620    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b7515709c29b59886b5f3c0e0fd7f770aabc4ca181d727a6ffd015554050aa13"}
Mar 20 15:44:21 crc kubenswrapper[4730]: I0320 15:44:21.062648    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"babadf9682b7a455a0d01b3d1233fc95ac577d6dae2dae96af4cdb0a6f9b912f"}
Mar 20 15:44:21 crc kubenswrapper[4730]: I0320 15:44:21.062657    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2129bfdd6bb9819f82e5914936de1b5eb2b0d8d4592165d70f6b0017ad7598f4"}
Mar 20 15:44:21 crc kubenswrapper[4730]: I0320 15:44:21.062667    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"60fd03149c83fb8a865b1433e87a2b82d3b27fda19034cc39ecc33663eb285bf"}
Mar 20 15:44:21 crc kubenswrapper[4730]: I0320 15:44:21.062677    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f854d225e50ffbb4730722c7c45e64b1732db4cfc74820d3efd2e24b32d69b42"}
Mar 20 15:44:21 crc kubenswrapper[4730]: I0320 15:44:21.062879    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:44:21 crc kubenswrapper[4730]: I0320 15:44:21.062884    4730 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d5c3fe2f-3c67-4dee-becb-3ecfe2758384"
Mar 20 15:44:21 crc kubenswrapper[4730]: I0320 15:44:21.062904    4730 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d5c3fe2f-3c67-4dee-becb-3ecfe2758384"
Mar 20 15:44:22 crc kubenswrapper[4730]: I0320 15:44:22.101904    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 20 15:44:22 crc kubenswrapper[4730]: I0320 15:44:22.103070    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 20 15:44:22 crc kubenswrapper[4730]: I0320 15:44:22.103107    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e1f2bc557068d23d31fb94ba0b8755d440a6066e6a0d9e74613e4436d64e826e"}
Mar 20 15:44:22 crc kubenswrapper[4730]: I0320 15:44:22.158512    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 15:44:22 crc kubenswrapper[4730]: I0320 15:44:22.158604    4730 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Mar 20 15:44:22 crc kubenswrapper[4730]: I0320 15:44:22.158638    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Mar 20 15:44:22 crc kubenswrapper[4730]: I0320 15:44:22.660867    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:44:22 crc kubenswrapper[4730]: I0320 15:44:22.661049    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:44:22 crc kubenswrapper[4730]: I0320 15:44:22.661086    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:44:22 crc kubenswrapper[4730]: I0320 15:44:22.661123    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:44:22 crc kubenswrapper[4730]: I0320 15:44:22.868888    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs\") pod \"network-metrics-daemon-2prfn\" (UID: \"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\") " pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.153276    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-st79s" podUID="2499559b-b31f-4dab-89a0-964964dc596e" containerName="oauth-openshift" containerID="cri-o://e1c21b159517c024d2850d533108f097e6b934c21c31f9514baab038d75a1db0" gracePeriod=15
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.506881    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:44:23 crc kubenswrapper[4730]: E0320 15:44:23.661954    4730 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: failed to sync secret cache: timed out waiting for the condition
Mar 20 15:44:23 crc kubenswrapper[4730]: E0320 15:44:23.661998    4730 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: failed to sync configmap cache: timed out waiting for the condition
Mar 20 15:44:23 crc kubenswrapper[4730]: E0320 15:44:23.662047    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:46:25.66202742 +0000 UTC m=+444.875398789 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync secret cache: timed out waiting for the condition
Mar 20 15:44:23 crc kubenswrapper[4730]: E0320 15:44:23.662075    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 20 15:44:23 crc kubenswrapper[4730]: E0320 15:44:23.662086    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 15:46:25.662063651 +0000 UTC m=+444.875435070 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync configmap cache: timed out waiting for the condition
Mar 20 15:44:23 crc kubenswrapper[4730]: E0320 15:44:23.662138    4730 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.663581    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 20 15:44:23 crc kubenswrapper[4730]: E0320 15:44:23.672767    4730 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: failed to sync configmap cache: timed out waiting for the condition
Mar 20 15:44:23 crc kubenswrapper[4730]: E0320 15:44:23.672820    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 15:46:25.672806911 +0000 UTC m=+444.886178280 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : failed to sync configmap cache: timed out waiting for the condition
Mar 20 15:44:23 crc kubenswrapper[4730]: E0320 15:44:23.672845    4730 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: failed to sync configmap cache: timed out waiting for the condition
Mar 20 15:44:23 crc kubenswrapper[4730]: E0320 15:44:23.672870    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 15:46:25.672864222 +0000 UTC m=+444.886235591 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : failed to sync configmap cache: timed out waiting for the condition
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.680368    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-ocp-branding-template\") pod \"2499559b-b31f-4dab-89a0-964964dc596e\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") "
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.680409    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2499559b-b31f-4dab-89a0-964964dc596e-audit-policies\") pod \"2499559b-b31f-4dab-89a0-964964dc596e\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") "
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.680441    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-serving-cert\") pod \"2499559b-b31f-4dab-89a0-964964dc596e\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") "
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.680479    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2499559b-b31f-4dab-89a0-964964dc596e-audit-dir\") pod \"2499559b-b31f-4dab-89a0-964964dc596e\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") "
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.680525    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-user-template-login\") pod \"2499559b-b31f-4dab-89a0-964964dc596e\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") "
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.680559    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-session\") pod \"2499559b-b31f-4dab-89a0-964964dc596e\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") "
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.680591    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-user-idp-0-file-data\") pod \"2499559b-b31f-4dab-89a0-964964dc596e\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") "
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.680602    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2499559b-b31f-4dab-89a0-964964dc596e-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "2499559b-b31f-4dab-89a0-964964dc596e" (UID: "2499559b-b31f-4dab-89a0-964964dc596e"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.681362    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2499559b-b31f-4dab-89a0-964964dc596e-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "2499559b-b31f-4dab-89a0-964964dc596e" (UID: "2499559b-b31f-4dab-89a0-964964dc596e"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.681629    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-cliconfig\") pod \"2499559b-b31f-4dab-89a0-964964dc596e\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") "
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.681656    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-trusted-ca-bundle\") pod \"2499559b-b31f-4dab-89a0-964964dc596e\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") "
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.681700    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-user-template-error\") pod \"2499559b-b31f-4dab-89a0-964964dc596e\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") "
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.681732    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-user-template-provider-selection\") pod \"2499559b-b31f-4dab-89a0-964964dc596e\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") "
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.681761    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-service-ca\") pod \"2499559b-b31f-4dab-89a0-964964dc596e\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") "
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.681786    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-router-certs\") pod \"2499559b-b31f-4dab-89a0-964964dc596e\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") "
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.681813    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5z67\" (UniqueName: \"kubernetes.io/projected/2499559b-b31f-4dab-89a0-964964dc596e-kube-api-access-l5z67\") pod \"2499559b-b31f-4dab-89a0-964964dc596e\" (UID: \"2499559b-b31f-4dab-89a0-964964dc596e\") "
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.682078    4730 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2499559b-b31f-4dab-89a0-964964dc596e-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.682101    4730 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2499559b-b31f-4dab-89a0-964964dc596e-audit-dir\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.684837    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "2499559b-b31f-4dab-89a0-964964dc596e" (UID: "2499559b-b31f-4dab-89a0-964964dc596e"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.685434    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "2499559b-b31f-4dab-89a0-964964dc596e" (UID: "2499559b-b31f-4dab-89a0-964964dc596e"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.686392    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "2499559b-b31f-4dab-89a0-964964dc596e" (UID: "2499559b-b31f-4dab-89a0-964964dc596e"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.686700    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2499559b-b31f-4dab-89a0-964964dc596e-kube-api-access-l5z67" (OuterVolumeSpecName: "kube-api-access-l5z67") pod "2499559b-b31f-4dab-89a0-964964dc596e" (UID: "2499559b-b31f-4dab-89a0-964964dc596e"). InnerVolumeSpecName "kube-api-access-l5z67". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.687232    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "2499559b-b31f-4dab-89a0-964964dc596e" (UID: "2499559b-b31f-4dab-89a0-964964dc596e"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.687747    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "2499559b-b31f-4dab-89a0-964964dc596e" (UID: "2499559b-b31f-4dab-89a0-964964dc596e"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.687886    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "2499559b-b31f-4dab-89a0-964964dc596e" (UID: "2499559b-b31f-4dab-89a0-964964dc596e"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.688184    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "2499559b-b31f-4dab-89a0-964964dc596e" (UID: "2499559b-b31f-4dab-89a0-964964dc596e"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.688285    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "2499559b-b31f-4dab-89a0-964964dc596e" (UID: "2499559b-b31f-4dab-89a0-964964dc596e"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.688423    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "2499559b-b31f-4dab-89a0-964964dc596e" (UID: "2499559b-b31f-4dab-89a0-964964dc596e"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.688449    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "2499559b-b31f-4dab-89a0-964964dc596e" (UID: "2499559b-b31f-4dab-89a0-964964dc596e"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.688614    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "2499559b-b31f-4dab-89a0-964964dc596e" (UID: "2499559b-b31f-4dab-89a0-964964dc596e"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.782717    4730 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.782753    4730 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.782764    4730 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.782773    4730 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.782784    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5z67\" (UniqueName: \"kubernetes.io/projected/2499559b-b31f-4dab-89a0-964964dc596e-kube-api-access-l5z67\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.782796    4730 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.782808    4730 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.782820    4730 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.782832    4730 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.782842    4730 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.782850    4730 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:23 crc kubenswrapper[4730]: I0320 15:44:23.782860    4730 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2499559b-b31f-4dab-89a0-964964dc596e-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:23 crc kubenswrapper[4730]: E0320 15:44:23.869973    4730 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: failed to sync secret cache: timed out waiting for the condition
Mar 20 15:44:23 crc kubenswrapper[4730]: E0320 15:44:23.870055    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs podName:db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a nodeName:}" failed. No retries permitted until 2026-03-20 15:46:25.870035976 +0000 UTC m=+445.083407335 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs") pod "network-metrics-daemon-2prfn" (UID: "db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a") : failed to sync secret cache: timed out waiting for the condition
Mar 20 15:44:24 crc kubenswrapper[4730]: I0320 15:44:24.117186    4730 generic.go:334] "Generic (PLEG): container finished" podID="2499559b-b31f-4dab-89a0-964964dc596e" containerID="e1c21b159517c024d2850d533108f097e6b934c21c31f9514baab038d75a1db0" exitCode=0
Mar 20 15:44:24 crc kubenswrapper[4730]: I0320 15:44:24.117233    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-st79s" event={"ID":"2499559b-b31f-4dab-89a0-964964dc596e","Type":"ContainerDied","Data":"e1c21b159517c024d2850d533108f097e6b934c21c31f9514baab038d75a1db0"}
Mar 20 15:44:24 crc kubenswrapper[4730]: I0320 15:44:24.117259    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-st79s"
Mar 20 15:44:24 crc kubenswrapper[4730]: I0320 15:44:24.117285    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-st79s" event={"ID":"2499559b-b31f-4dab-89a0-964964dc596e","Type":"ContainerDied","Data":"40fd46e178b8ce8c75de09a0315b49f9c04961cf890e7392083f9a7a77124dd2"}
Mar 20 15:44:24 crc kubenswrapper[4730]: I0320 15:44:24.117307    4730 scope.go:117] "RemoveContainer" containerID="e1c21b159517c024d2850d533108f097e6b934c21c31f9514baab038d75a1db0"
Mar 20 15:44:24 crc kubenswrapper[4730]: I0320 15:44:24.135787    4730 scope.go:117] "RemoveContainer" containerID="e1c21b159517c024d2850d533108f097e6b934c21c31f9514baab038d75a1db0"
Mar 20 15:44:24 crc kubenswrapper[4730]: E0320 15:44:24.136264    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1c21b159517c024d2850d533108f097e6b934c21c31f9514baab038d75a1db0\": container with ID starting with e1c21b159517c024d2850d533108f097e6b934c21c31f9514baab038d75a1db0 not found: ID does not exist" containerID="e1c21b159517c024d2850d533108f097e6b934c21c31f9514baab038d75a1db0"
Mar 20 15:44:24 crc kubenswrapper[4730]: I0320 15:44:24.136296    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1c21b159517c024d2850d533108f097e6b934c21c31f9514baab038d75a1db0"} err="failed to get container status \"e1c21b159517c024d2850d533108f097e6b934c21c31f9514baab038d75a1db0\": rpc error: code = NotFound desc = could not find container \"e1c21b159517c024d2850d533108f097e6b934c21c31f9514baab038d75a1db0\": container with ID starting with e1c21b159517c024d2850d533108f097e6b934c21c31f9514baab038d75a1db0 not found: ID does not exist"
Mar 20 15:44:24 crc kubenswrapper[4730]: I0320 15:44:24.556568    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:44:24 crc kubenswrapper[4730]: I0320 15:44:24.556614    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:44:24 crc kubenswrapper[4730]: I0320 15:44:24.561365    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:44:26 crc kubenswrapper[4730]: I0320 15:44:26.880537    4730 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:44:27 crc kubenswrapper[4730]: I0320 15:44:27.135415    4730 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d5c3fe2f-3c67-4dee-becb-3ecfe2758384"
Mar 20 15:44:27 crc kubenswrapper[4730]: I0320 15:44:27.135454    4730 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d5c3fe2f-3c67-4dee-becb-3ecfe2758384"
Mar 20 15:44:27 crc kubenswrapper[4730]: I0320 15:44:27.139367    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:44:27 crc kubenswrapper[4730]: I0320 15:44:27.195338    4730 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="6aab3c27-7254-49fc-a52a-2f6ba7e3fced"
Mar 20 15:44:27 crc kubenswrapper[4730]: I0320 15:44:27.291146    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 15:44:27 crc kubenswrapper[4730]: I0320 15:44:27.664388    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 20 15:44:27 crc kubenswrapper[4730]: I0320 15:44:27.664707    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 20 15:44:27 crc kubenswrapper[4730]: I0320 15:44:27.665691    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 20 15:44:27 crc kubenswrapper[4730]: I0320 15:44:27.873481    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 20 15:44:28 crc kubenswrapper[4730]: I0320 15:44:28.138872    4730 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d5c3fe2f-3c67-4dee-becb-3ecfe2758384"
Mar 20 15:44:28 crc kubenswrapper[4730]: I0320 15:44:28.138910    4730 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d5c3fe2f-3c67-4dee-becb-3ecfe2758384"
Mar 20 15:44:28 crc kubenswrapper[4730]: I0320 15:44:28.142709    4730 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="6aab3c27-7254-49fc-a52a-2f6ba7e3fced"
Mar 20 15:44:32 crc kubenswrapper[4730]: I0320 15:44:32.119422    4730 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Mar 20 15:44:32 crc kubenswrapper[4730]: I0320 15:44:32.119964    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Mar 20 15:44:35 crc kubenswrapper[4730]: E0320 15:44:35.545820    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-cqllr], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 15:44:35 crc kubenswrapper[4730]: E0320 15:44:35.552366    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert nginx-conf], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 15:44:35 crc kubenswrapper[4730]: E0320 15:44:35.556779    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-s2dwl], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 15:44:36 crc kubenswrapper[4730]: E0320 15:44:36.544700    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-2prfn" podUID="db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a"
Mar 20 15:44:36 crc kubenswrapper[4730]: I0320 15:44:36.637678    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 20 15:44:37 crc kubenswrapper[4730]: I0320 15:44:37.360912    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 20 15:44:38 crc kubenswrapper[4730]: I0320 15:44:38.101480    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 20 15:44:38 crc kubenswrapper[4730]: I0320 15:44:38.117630    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 20 15:44:38 crc kubenswrapper[4730]: I0320 15:44:38.166861    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 20 15:44:38 crc kubenswrapper[4730]: I0320 15:44:38.215133    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 20 15:44:38 crc kubenswrapper[4730]: I0320 15:44:38.400597    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 20 15:44:38 crc kubenswrapper[4730]: I0320 15:44:38.829541    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 20 15:44:38 crc kubenswrapper[4730]: I0320 15:44:38.990990    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 20 15:44:39 crc kubenswrapper[4730]: I0320 15:44:39.397612    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 20 15:44:39 crc kubenswrapper[4730]: I0320 15:44:39.414988    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 20 15:44:39 crc kubenswrapper[4730]: I0320 15:44:39.455041    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 20 15:44:39 crc kubenswrapper[4730]: I0320 15:44:39.628732    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 20 15:44:39 crc kubenswrapper[4730]: I0320 15:44:39.705231    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 20 15:44:39 crc kubenswrapper[4730]: I0320 15:44:39.986078    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 20 15:44:40 crc kubenswrapper[4730]: I0320 15:44:40.057593    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 20 15:44:40 crc kubenswrapper[4730]: I0320 15:44:40.096368    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 20 15:44:40 crc kubenswrapper[4730]: I0320 15:44:40.152401    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 20 15:44:40 crc kubenswrapper[4730]: I0320 15:44:40.180605    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 20 15:44:40 crc kubenswrapper[4730]: I0320 15:44:40.225745    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 20 15:44:40 crc kubenswrapper[4730]: I0320 15:44:40.315893    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 20 15:44:40 crc kubenswrapper[4730]: I0320 15:44:40.397795    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 20 15:44:40 crc kubenswrapper[4730]: I0320 15:44:40.440590    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 20 15:44:40 crc kubenswrapper[4730]: I0320 15:44:40.561562    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 20 15:44:40 crc kubenswrapper[4730]: I0320 15:44:40.565798    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 20 15:44:40 crc kubenswrapper[4730]: I0320 15:44:40.694649    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 20 15:44:40 crc kubenswrapper[4730]: I0320 15:44:40.705602    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 20 15:44:40 crc kubenswrapper[4730]: I0320 15:44:40.740554    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 20 15:44:41 crc kubenswrapper[4730]: I0320 15:44:41.042636    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 20 15:44:41 crc kubenswrapper[4730]: I0320 15:44:41.105289    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 20 15:44:41 crc kubenswrapper[4730]: I0320 15:44:41.205399    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 20 15:44:41 crc kubenswrapper[4730]: I0320 15:44:41.213597    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 20 15:44:41 crc kubenswrapper[4730]: I0320 15:44:41.439279    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 20 15:44:41 crc kubenswrapper[4730]: I0320 15:44:41.486730    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 20 15:44:41 crc kubenswrapper[4730]: I0320 15:44:41.524984    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 20 15:44:41 crc kubenswrapper[4730]: I0320 15:44:41.550475    4730 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 20 15:44:41 crc kubenswrapper[4730]: I0320 15:44:41.561834    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 20 15:44:41 crc kubenswrapper[4730]: I0320 15:44:41.588175    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 20 15:44:41 crc kubenswrapper[4730]: I0320 15:44:41.756848    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 20 15:44:41 crc kubenswrapper[4730]: I0320 15:44:41.776078    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 20 15:44:41 crc kubenswrapper[4730]: I0320 15:44:41.796389    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 20 15:44:41 crc kubenswrapper[4730]: I0320 15:44:41.835620    4730 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 20 15:44:41 crc kubenswrapper[4730]: I0320 15:44:41.867582    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 20 15:44:41 crc kubenswrapper[4730]: I0320 15:44:41.885174    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 20 15:44:42 crc kubenswrapper[4730]: I0320 15:44:42.120127    4730 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Mar 20 15:44:42 crc kubenswrapper[4730]: I0320 15:44:42.120183    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Mar 20 15:44:42 crc kubenswrapper[4730]: I0320 15:44:42.120299    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 15:44:42 crc kubenswrapper[4730]: I0320 15:44:42.120962    4730 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"e1f2bc557068d23d31fb94ba0b8755d440a6066e6a0d9e74613e4436d64e826e"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted"
Mar 20 15:44:42 crc kubenswrapper[4730]: I0320 15:44:42.121102    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://e1f2bc557068d23d31fb94ba0b8755d440a6066e6a0d9e74613e4436d64e826e" gracePeriod=30
Mar 20 15:44:42 crc kubenswrapper[4730]: I0320 15:44:42.135369    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 20 15:44:42 crc kubenswrapper[4730]: I0320 15:44:42.387169    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 20 15:44:42 crc kubenswrapper[4730]: I0320 15:44:42.424935    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 20 15:44:42 crc kubenswrapper[4730]: I0320 15:44:42.469226    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 20 15:44:42 crc kubenswrapper[4730]: I0320 15:44:42.561940    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 20 15:44:42 crc kubenswrapper[4730]: I0320 15:44:42.615932    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 20 15:44:42 crc kubenswrapper[4730]: I0320 15:44:42.823811    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 20 15:44:42 crc kubenswrapper[4730]: I0320 15:44:42.852747    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 20 15:44:42 crc kubenswrapper[4730]: I0320 15:44:42.853982    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 20 15:44:42 crc kubenswrapper[4730]: I0320 15:44:42.882629    4730 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 20 15:44:42 crc kubenswrapper[4730]: I0320 15:44:42.885202    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=36.88518357 podStartE2EDuration="36.88518357s" podCreationTimestamp="2026-03-20 15:44:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:44:26.919754059 +0000 UTC m=+326.133125448" watchObservedRunningTime="2026-03-20 15:44:42.88518357 +0000 UTC m=+342.098554939"
Mar 20 15:44:42 crc kubenswrapper[4730]: I0320 15:44:42.886773    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 20 15:44:42 crc kubenswrapper[4730]: I0320 15:44:42.887756    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6f56868448-2fbxh","openshift-authentication/oauth-openshift-558db77b4-st79s","openshift-route-controller-manager/route-controller-manager-b75b5f765-8wjw2","openshift-marketplace/redhat-marketplace-2z2hv","openshift-kube-apiserver/kube-apiserver-crc"]
Mar 20 15:44:42 crc kubenswrapper[4730]: I0320 15:44:42.887820    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 20 15:44:42 crc kubenswrapper[4730]: I0320 15:44:42.891605    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 15:44:42 crc kubenswrapper[4730]: I0320 15:44:42.906583    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=16.906566161 podStartE2EDuration="16.906566161s" podCreationTimestamp="2026-03-20 15:44:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:44:42.905589071 +0000 UTC m=+342.118960460" watchObservedRunningTime="2026-03-20 15:44:42.906566161 +0000 UTC m=+342.119937530"
Mar 20 15:44:42 crc kubenswrapper[4730]: I0320 15:44:42.919813    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 20 15:44:42 crc kubenswrapper[4730]: I0320 15:44:42.921006    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 20 15:44:42 crc kubenswrapper[4730]: I0320 15:44:42.982551    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 20 15:44:43 crc kubenswrapper[4730]: I0320 15:44:43.192656    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 20 15:44:43 crc kubenswrapper[4730]: I0320 15:44:43.297345    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 20 15:44:43 crc kubenswrapper[4730]: I0320 15:44:43.384649    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 20 15:44:43 crc kubenswrapper[4730]: I0320 15:44:43.389223    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 20 15:44:43 crc kubenswrapper[4730]: I0320 15:44:43.399573    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 20 15:44:43 crc kubenswrapper[4730]: I0320 15:44:43.510425    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 20 15:44:43 crc kubenswrapper[4730]: I0320 15:44:43.540644    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2499559b-b31f-4dab-89a0-964964dc596e" path="/var/lib/kubelet/pods/2499559b-b31f-4dab-89a0-964964dc596e/volumes"
Mar 20 15:44:43 crc kubenswrapper[4730]: I0320 15:44:43.541559    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" path="/var/lib/kubelet/pods/715cbff8-9674-4896-8deb-54a6e9a8899e/volumes"
Mar 20 15:44:43 crc kubenswrapper[4730]: I0320 15:44:43.542380    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d747680-5dde-4793-863a-252a5f67233a" path="/var/lib/kubelet/pods/9d747680-5dde-4793-863a-252a5f67233a/volumes"
Mar 20 15:44:43 crc kubenswrapper[4730]: I0320 15:44:43.543472    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" path="/var/lib/kubelet/pods/d4d20fab-86cc-44d8-a8b9-c60f6835c5e0/volumes"
Mar 20 15:44:43 crc kubenswrapper[4730]: I0320 15:44:43.582029    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 20 15:44:43 crc kubenswrapper[4730]: I0320 15:44:43.732802    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 20 15:44:43 crc kubenswrapper[4730]: I0320 15:44:43.866780    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 20 15:44:43 crc kubenswrapper[4730]: I0320 15:44:43.870318    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 20 15:44:43 crc kubenswrapper[4730]: I0320 15:44:43.915373    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 20 15:44:43 crc kubenswrapper[4730]: I0320 15:44:43.919759    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 20 15:44:43 crc kubenswrapper[4730]: I0320 15:44:43.977866    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 20 15:44:44 crc kubenswrapper[4730]: I0320 15:44:44.011594    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 20 15:44:44 crc kubenswrapper[4730]: I0320 15:44:44.033195    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 20 15:44:44 crc kubenswrapper[4730]: I0320 15:44:44.087545    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 20 15:44:44 crc kubenswrapper[4730]: I0320 15:44:44.094533    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 20 15:44:44 crc kubenswrapper[4730]: I0320 15:44:44.132578    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 20 15:44:44 crc kubenswrapper[4730]: I0320 15:44:44.158095    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 20 15:44:44 crc kubenswrapper[4730]: I0320 15:44:44.238938    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 20 15:44:44 crc kubenswrapper[4730]: I0320 15:44:44.605583    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 20 15:44:44 crc kubenswrapper[4730]: I0320 15:44:44.702014    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 20 15:44:44 crc kubenswrapper[4730]: I0320 15:44:44.709659    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 20 15:44:44 crc kubenswrapper[4730]: I0320 15:44:44.749455    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 20 15:44:44 crc kubenswrapper[4730]: I0320 15:44:44.755065    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 20 15:44:44 crc kubenswrapper[4730]: I0320 15:44:44.871600    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 20 15:44:44 crc kubenswrapper[4730]: I0320 15:44:44.906137    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 20 15:44:44 crc kubenswrapper[4730]: I0320 15:44:44.930061    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 20 15:44:44 crc kubenswrapper[4730]: I0320 15:44:44.982355    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 20 15:44:44 crc kubenswrapper[4730]: I0320 15:44:44.988277    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 20 15:44:45 crc kubenswrapper[4730]: I0320 15:44:45.118701    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 20 15:44:45 crc kubenswrapper[4730]: I0320 15:44:45.152927    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 20 15:44:45 crc kubenswrapper[4730]: I0320 15:44:45.157144    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 20 15:44:45 crc kubenswrapper[4730]: I0320 15:44:45.176493    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 20 15:44:45 crc kubenswrapper[4730]: I0320 15:44:45.278322    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 20 15:44:45 crc kubenswrapper[4730]: I0320 15:44:45.409298    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 20 15:44:45 crc kubenswrapper[4730]: I0320 15:44:45.462347    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 20 15:44:45 crc kubenswrapper[4730]: I0320 15:44:45.570022    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 20 15:44:45 crc kubenswrapper[4730]: I0320 15:44:45.613608    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 20 15:44:45 crc kubenswrapper[4730]: I0320 15:44:45.681138    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 20 15:44:45 crc kubenswrapper[4730]: I0320 15:44:45.862464    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 20 15:44:45 crc kubenswrapper[4730]: I0320 15:44:45.909808    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 20 15:44:45 crc kubenswrapper[4730]: I0320 15:44:45.952695    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 20 15:44:45 crc kubenswrapper[4730]: I0320 15:44:45.954844    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 20 15:44:45 crc kubenswrapper[4730]: I0320 15:44:45.988716    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 20 15:44:46 crc kubenswrapper[4730]: I0320 15:44:46.034933    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 20 15:44:46 crc kubenswrapper[4730]: I0320 15:44:46.153428    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 20 15:44:46 crc kubenswrapper[4730]: I0320 15:44:46.168107    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 20 15:44:46 crc kubenswrapper[4730]: I0320 15:44:46.243841    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 20 15:44:46 crc kubenswrapper[4730]: I0320 15:44:46.532047    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:44:46 crc kubenswrapper[4730]: I0320 15:44:46.532138    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:44:46 crc kubenswrapper[4730]: I0320 15:44:46.532047    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:44:46 crc kubenswrapper[4730]: I0320 15:44:46.567364    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 20 15:44:46 crc kubenswrapper[4730]: I0320 15:44:46.642955    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 20 15:44:46 crc kubenswrapper[4730]: I0320 15:44:46.653383    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 20 15:44:46 crc kubenswrapper[4730]: I0320 15:44:46.671155    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 20 15:44:46 crc kubenswrapper[4730]: I0320 15:44:46.737796    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 20 15:44:46 crc kubenswrapper[4730]: I0320 15:44:46.932995    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 20 15:44:46 crc kubenswrapper[4730]: I0320 15:44:46.940596    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.129848    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.164907    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.228180    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.231153    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.295426    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.376641    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.392157    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.408712    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.438713    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt"]
Mar 20 15:44:47 crc kubenswrapper[4730]: E0320 15:44:47.439175    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e48519c7-0cdc-419b-bd72-2bab0e911af8" containerName="installer"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.439190    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="e48519c7-0cdc-419b-bd72-2bab0e911af8" containerName="installer"
Mar 20 15:44:47 crc kubenswrapper[4730]: E0320 15:44:47.439198    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" containerName="registry-server"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.439205    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" containerName="registry-server"
Mar 20 15:44:47 crc kubenswrapper[4730]: E0320 15:44:47.439214    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f093381-3bf4-49ff-beb4-f44aa012c521" containerName="oc"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.439222    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f093381-3bf4-49ff-beb4-f44aa012c521" containerName="oc"
Mar 20 15:44:47 crc kubenswrapper[4730]: E0320 15:44:47.439230    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d747680-5dde-4793-863a-252a5f67233a" containerName="controller-manager"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.439237    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d747680-5dde-4793-863a-252a5f67233a" containerName="controller-manager"
Mar 20 15:44:47 crc kubenswrapper[4730]: E0320 15:44:47.439263    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" containerName="extract-utilities"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.439272    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" containerName="extract-utilities"
Mar 20 15:44:47 crc kubenswrapper[4730]: E0320 15:44:47.439281    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" containerName="route-controller-manager"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.439288    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" containerName="route-controller-manager"
Mar 20 15:44:47 crc kubenswrapper[4730]: E0320 15:44:47.439296    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" containerName="extract-content"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.439301    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" containerName="extract-content"
Mar 20 15:44:47 crc kubenswrapper[4730]: E0320 15:44:47.439314    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2499559b-b31f-4dab-89a0-964964dc596e" containerName="oauth-openshift"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.439320    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="2499559b-b31f-4dab-89a0-964964dc596e" containerName="oauth-openshift"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.439415    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d747680-5dde-4793-863a-252a5f67233a" containerName="controller-manager"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.439426    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="715cbff8-9674-4896-8deb-54a6e9a8899e" containerName="registry-server"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.439434    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="e48519c7-0cdc-419b-bd72-2bab0e911af8" containerName="installer"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.439441    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f093381-3bf4-49ff-beb4-f44aa012c521" containerName="oc"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.439453    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="2499559b-b31f-4dab-89a0-964964dc596e" containerName="oauth-openshift"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.439461    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4d20fab-86cc-44d8-a8b9-c60f6835c5e0" containerName="route-controller-manager"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.439896    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.443727    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"]
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.444569    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.446177    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.446203    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-754f74fd94-6mnkj"]
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.446831    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-754f74fd94-6mnkj"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.446213    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.446349    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.446396    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.446412    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.446502    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.451475    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.451647    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.451820    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.452074    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.452116    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.452278    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.452309    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.452712    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.453061    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.453185    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.453724    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.454382    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.454557    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.455585    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.455752    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.455927    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt"]
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.459836    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.459875    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.459946    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.462418    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.464471    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.465309    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-754f74fd94-6mnkj"]
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.465782    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.470179    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"]
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.476949    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.481047    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.486144    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.510397    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpzmq\" (UniqueName: \"kubernetes.io/projected/ef9a1400-2cee-4019-b907-440c025638b6-kube-api-access-hpzmq\") pod \"route-controller-manager-754f74fd94-6mnkj\" (UID: \"ef9a1400-2cee-4019-b907-440c025638b6\") " pod="openshift-route-controller-manager/route-controller-manager-754f74fd94-6mnkj"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.510476    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef9a1400-2cee-4019-b907-440c025638b6-config\") pod \"route-controller-manager-754f74fd94-6mnkj\" (UID: \"ef9a1400-2cee-4019-b907-440c025638b6\") " pod="openshift-route-controller-manager/route-controller-manager-754f74fd94-6mnkj"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.510549    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef9a1400-2cee-4019-b907-440c025638b6-serving-cert\") pod \"route-controller-manager-754f74fd94-6mnkj\" (UID: \"ef9a1400-2cee-4019-b907-440c025638b6\") " pod="openshift-route-controller-manager/route-controller-manager-754f74fd94-6mnkj"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.510616    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef9a1400-2cee-4019-b907-440c025638b6-client-ca\") pod \"route-controller-manager-754f74fd94-6mnkj\" (UID: \"ef9a1400-2cee-4019-b907-440c025638b6\") " pod="openshift-route-controller-manager/route-controller-manager-754f74fd94-6mnkj"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.566948    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.611801    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-system-session\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.611870    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d4c66130-d966-4244-abec-e2aefba87726-audit-dir\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.611907    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.611934    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.611960    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.612007    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpzmq\" (UniqueName: \"kubernetes.io/projected/ef9a1400-2cee-4019-b907-440c025638b6-kube-api-access-hpzmq\") pod \"route-controller-manager-754f74fd94-6mnkj\" (UID: \"ef9a1400-2cee-4019-b907-440c025638b6\") " pod="openshift-route-controller-manager/route-controller-manager-754f74fd94-6mnkj"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.612037    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.612064    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d180ebe8-2390-4a66-a6e0-1d02e256279a-config\") pod \"controller-manager-86dfd77dcd-rv2mt\" (UID: \"d180ebe8-2390-4a66-a6e0-1d02e256279a\") " pod="openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.612086    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjzm9\" (UniqueName: \"kubernetes.io/projected/d180ebe8-2390-4a66-a6e0-1d02e256279a-kube-api-access-sjzm9\") pod \"controller-manager-86dfd77dcd-rv2mt\" (UID: \"d180ebe8-2390-4a66-a6e0-1d02e256279a\") " pod="openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.612124    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef9a1400-2cee-4019-b907-440c025638b6-config\") pod \"route-controller-manager-754f74fd94-6mnkj\" (UID: \"ef9a1400-2cee-4019-b907-440c025638b6\") " pod="openshift-route-controller-manager/route-controller-manager-754f74fd94-6mnkj"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.612150    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.612190    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d180ebe8-2390-4a66-a6e0-1d02e256279a-serving-cert\") pod \"controller-manager-86dfd77dcd-rv2mt\" (UID: \"d180ebe8-2390-4a66-a6e0-1d02e256279a\") " pod="openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.612297    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-system-router-certs\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.612455    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d4c66130-d966-4244-abec-e2aefba87726-audit-policies\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.612514    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef9a1400-2cee-4019-b907-440c025638b6-serving-cert\") pod \"route-controller-manager-754f74fd94-6mnkj\" (UID: \"ef9a1400-2cee-4019-b907-440c025638b6\") " pod="openshift-route-controller-manager/route-controller-manager-754f74fd94-6mnkj"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.612589    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-user-template-error\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.612647    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d180ebe8-2390-4a66-a6e0-1d02e256279a-proxy-ca-bundles\") pod \"controller-manager-86dfd77dcd-rv2mt\" (UID: \"d180ebe8-2390-4a66-a6e0-1d02e256279a\") " pod="openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.612681    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.612725    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d180ebe8-2390-4a66-a6e0-1d02e256279a-client-ca\") pod \"controller-manager-86dfd77dcd-rv2mt\" (UID: \"d180ebe8-2390-4a66-a6e0-1d02e256279a\") " pod="openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.612779    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-user-template-login\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.612830    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-system-service-ca\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.612863    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vg6x\" (UniqueName: \"kubernetes.io/projected/d4c66130-d966-4244-abec-e2aefba87726-kube-api-access-4vg6x\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.612910    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef9a1400-2cee-4019-b907-440c025638b6-client-ca\") pod \"route-controller-manager-754f74fd94-6mnkj\" (UID: \"ef9a1400-2cee-4019-b907-440c025638b6\") " pod="openshift-route-controller-manager/route-controller-manager-754f74fd94-6mnkj"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.613457    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef9a1400-2cee-4019-b907-440c025638b6-config\") pod \"route-controller-manager-754f74fd94-6mnkj\" (UID: \"ef9a1400-2cee-4019-b907-440c025638b6\") " pod="openshift-route-controller-manager/route-controller-manager-754f74fd94-6mnkj"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.614341    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef9a1400-2cee-4019-b907-440c025638b6-client-ca\") pod \"route-controller-manager-754f74fd94-6mnkj\" (UID: \"ef9a1400-2cee-4019-b907-440c025638b6\") " pod="openshift-route-controller-manager/route-controller-manager-754f74fd94-6mnkj"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.622056    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.631469    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpzmq\" (UniqueName: \"kubernetes.io/projected/ef9a1400-2cee-4019-b907-440c025638b6-kube-api-access-hpzmq\") pod \"route-controller-manager-754f74fd94-6mnkj\" (UID: \"ef9a1400-2cee-4019-b907-440c025638b6\") " pod="openshift-route-controller-manager/route-controller-manager-754f74fd94-6mnkj"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.634320    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef9a1400-2cee-4019-b907-440c025638b6-serving-cert\") pod \"route-controller-manager-754f74fd94-6mnkj\" (UID: \"ef9a1400-2cee-4019-b907-440c025638b6\") " pod="openshift-route-controller-manager/route-controller-manager-754f74fd94-6mnkj"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.671042    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.713998    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-user-template-login\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.714047    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vg6x\" (UniqueName: \"kubernetes.io/projected/d4c66130-d966-4244-abec-e2aefba87726-kube-api-access-4vg6x\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.714067    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-system-service-ca\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.714087    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-system-session\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.714111    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d4c66130-d966-4244-abec-e2aefba87726-audit-dir\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.714138    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.714158    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.714259    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d4c66130-d966-4244-abec-e2aefba87726-audit-dir\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.714309    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.714334    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.714552    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d180ebe8-2390-4a66-a6e0-1d02e256279a-config\") pod \"controller-manager-86dfd77dcd-rv2mt\" (UID: \"d180ebe8-2390-4a66-a6e0-1d02e256279a\") " pod="openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.714571    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjzm9\" (UniqueName: \"kubernetes.io/projected/d180ebe8-2390-4a66-a6e0-1d02e256279a-kube-api-access-sjzm9\") pod \"controller-manager-86dfd77dcd-rv2mt\" (UID: \"d180ebe8-2390-4a66-a6e0-1d02e256279a\") " pod="openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.714605    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.714634    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d180ebe8-2390-4a66-a6e0-1d02e256279a-serving-cert\") pod \"controller-manager-86dfd77dcd-rv2mt\" (UID: \"d180ebe8-2390-4a66-a6e0-1d02e256279a\") " pod="openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.714653    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-system-router-certs\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.714691    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d4c66130-d966-4244-abec-e2aefba87726-audit-policies\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.714724    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-user-template-error\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.714745    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d180ebe8-2390-4a66-a6e0-1d02e256279a-proxy-ca-bundles\") pod \"controller-manager-86dfd77dcd-rv2mt\" (UID: \"d180ebe8-2390-4a66-a6e0-1d02e256279a\") " pod="openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.714762    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.714783    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d180ebe8-2390-4a66-a6e0-1d02e256279a-client-ca\") pod \"controller-manager-86dfd77dcd-rv2mt\" (UID: \"d180ebe8-2390-4a66-a6e0-1d02e256279a\") " pod="openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.715800    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-system-service-ca\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.716103    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d4c66130-d966-4244-abec-e2aefba87726-audit-policies\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.716285    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d180ebe8-2390-4a66-a6e0-1d02e256279a-config\") pod \"controller-manager-86dfd77dcd-rv2mt\" (UID: \"d180ebe8-2390-4a66-a6e0-1d02e256279a\") " pod="openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.716931    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-user-template-login\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.717306    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d180ebe8-2390-4a66-a6e0-1d02e256279a-proxy-ca-bundles\") pod \"controller-manager-86dfd77dcd-rv2mt\" (UID: \"d180ebe8-2390-4a66-a6e0-1d02e256279a\") " pod="openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.717401    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.717870    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.718013    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.718041    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.718340    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d180ebe8-2390-4a66-a6e0-1d02e256279a-client-ca\") pod \"controller-manager-86dfd77dcd-rv2mt\" (UID: \"d180ebe8-2390-4a66-a6e0-1d02e256279a\") " pod="openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.719714    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.728690    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-system-session\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.728817    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-system-router-certs\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.728931    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-user-template-error\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.729156    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d180ebe8-2390-4a66-a6e0-1d02e256279a-serving-cert\") pod \"controller-manager-86dfd77dcd-rv2mt\" (UID: \"d180ebe8-2390-4a66-a6e0-1d02e256279a\") " pod="openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.730046    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d4c66130-d966-4244-abec-e2aefba87726-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.731323    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjzm9\" (UniqueName: \"kubernetes.io/projected/d180ebe8-2390-4a66-a6e0-1d02e256279a-kube-api-access-sjzm9\") pod \"controller-manager-86dfd77dcd-rv2mt\" (UID: \"d180ebe8-2390-4a66-a6e0-1d02e256279a\") " pod="openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.735726    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vg6x\" (UniqueName: \"kubernetes.io/projected/d4c66130-d966-4244-abec-e2aefba87726-kube-api-access-4vg6x\") pod \"oauth-openshift-6768bc9c9c-k6zb8\" (UID: \"d4c66130-d966-4244-abec-e2aefba87726\") " pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.765655    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.775350    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.785668    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.803105    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-754f74fd94-6mnkj"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.813958    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.826635    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.976153    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.979779    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 20 15:44:47 crc kubenswrapper[4730]: I0320 15:44:47.982727    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 20 15:44:48 crc kubenswrapper[4730]: I0320 15:44:48.001447    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 20 15:44:48 crc kubenswrapper[4730]: I0320 15:44:48.215561    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 20 15:44:48 crc kubenswrapper[4730]: I0320 15:44:48.225280    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt"]
Mar 20 15:44:48 crc kubenswrapper[4730]: I0320 15:44:48.238193    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt" event={"ID":"d180ebe8-2390-4a66-a6e0-1d02e256279a","Type":"ContainerStarted","Data":"1f20c0c6cb3f095a9ebf090bce89d3cdbdc4242e46cb281771f335b8d4272464"}
Mar 20 15:44:48 crc kubenswrapper[4730]: I0320 15:44:48.266848    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"]
Mar 20 15:44:48 crc kubenswrapper[4730]: W0320 15:44:48.266964    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4c66130_d966_4244_abec_e2aefba87726.slice/crio-7789633566c7ef452d701f6956579ab64153ee92f2c67df787e8cfb7fba13d91 WatchSource:0}: Error finding container 7789633566c7ef452d701f6956579ab64153ee92f2c67df787e8cfb7fba13d91: Status 404 returned error can't find the container with id 7789633566c7ef452d701f6956579ab64153ee92f2c67df787e8cfb7fba13d91
Mar 20 15:44:48 crc kubenswrapper[4730]: I0320 15:44:48.292682    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 20 15:44:48 crc kubenswrapper[4730]: I0320 15:44:48.342394    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 20 15:44:48 crc kubenswrapper[4730]: I0320 15:44:48.343441    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 20 15:44:48 crc kubenswrapper[4730]: I0320 15:44:48.354512    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-754f74fd94-6mnkj"]
Mar 20 15:44:48 crc kubenswrapper[4730]: W0320 15:44:48.359416    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef9a1400_2cee_4019_b907_440c025638b6.slice/crio-dc3dc13f49503d3d06bcb012aeab946187b3c373989eaa5abd5e06eb7c51d310 WatchSource:0}: Error finding container dc3dc13f49503d3d06bcb012aeab946187b3c373989eaa5abd5e06eb7c51d310: Status 404 returned error can't find the container with id dc3dc13f49503d3d06bcb012aeab946187b3c373989eaa5abd5e06eb7c51d310
Mar 20 15:44:48 crc kubenswrapper[4730]: I0320 15:44:48.364862    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 20 15:44:48 crc kubenswrapper[4730]: I0320 15:44:48.416172    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 20 15:44:48 crc kubenswrapper[4730]: I0320 15:44:48.463783    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 20 15:44:48 crc kubenswrapper[4730]: I0320 15:44:48.520574    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 20 15:44:48 crc kubenswrapper[4730]: I0320 15:44:48.541866    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 20 15:44:48 crc kubenswrapper[4730]: I0320 15:44:48.567155    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 20 15:44:48 crc kubenswrapper[4730]: I0320 15:44:48.594951    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 20 15:44:48 crc kubenswrapper[4730]: I0320 15:44:48.871762    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 20 15:44:48 crc kubenswrapper[4730]: I0320 15:44:48.875511    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 20 15:44:48 crc kubenswrapper[4730]: I0320 15:44:48.899320    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 20 15:44:48 crc kubenswrapper[4730]: I0320 15:44:48.939859    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 20 15:44:48 crc kubenswrapper[4730]: I0320 15:44:48.948757    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.013993    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.062714    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.094339    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.173018    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.195336    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.246390    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt" event={"ID":"d180ebe8-2390-4a66-a6e0-1d02e256279a","Type":"ContainerStarted","Data":"79c37e5d53d635bd788a7db0e9cd6a218a8e4fb2962907f08447c07acfcacfe6"}
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.246433    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt"
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.249294    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-754f74fd94-6mnkj" event={"ID":"ef9a1400-2cee-4019-b907-440c025638b6","Type":"ContainerStarted","Data":"3ebe650c86674a5b830721481ad113ef2a916f54a0020fdc9ad70e3a263d5aa6"}
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.249340    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-754f74fd94-6mnkj" event={"ID":"ef9a1400-2cee-4019-b907-440c025638b6","Type":"ContainerStarted","Data":"dc3dc13f49503d3d06bcb012aeab946187b3c373989eaa5abd5e06eb7c51d310"}
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.249540    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-754f74fd94-6mnkj"
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.251467    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8" event={"ID":"d4c66130-d966-4244-abec-e2aefba87726","Type":"ContainerStarted","Data":"c6a09d9e8075c096b4adbd19be4b5a53f4a961a045a1826d210756002335688e"}
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.251503    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8" event={"ID":"d4c66130-d966-4244-abec-e2aefba87726","Type":"ContainerStarted","Data":"7789633566c7ef452d701f6956579ab64153ee92f2c67df787e8cfb7fba13d91"}
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.251883    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.253323    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt"
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.253942    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-754f74fd94-6mnkj"
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.257527    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8"
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.266997    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.271071    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.273798    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.285897    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt" podStartSLOduration=45.28588033 podStartE2EDuration="45.28588033s" podCreationTimestamp="2026-03-20 15:44:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:44:49.265435457 +0000 UTC m=+348.478806826" watchObservedRunningTime="2026-03-20 15:44:49.28588033 +0000 UTC m=+348.499251709"
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.287591    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6768bc9c9c-k6zb8" podStartSLOduration=51.287580811 podStartE2EDuration="51.287580811s" podCreationTimestamp="2026-03-20 15:43:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:44:49.283682103 +0000 UTC m=+348.497053492" watchObservedRunningTime="2026-03-20 15:44:49.287580811 +0000 UTC m=+348.500952180"
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.303624    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.327573    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-754f74fd94-6mnkj" podStartSLOduration=45.327553739 podStartE2EDuration="45.327553739s" podCreationTimestamp="2026-03-20 15:44:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:44:49.323955129 +0000 UTC m=+348.537326498" watchObservedRunningTime="2026-03-20 15:44:49.327553739 +0000 UTC m=+348.540925108"
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.444331    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.449818    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.481682    4730 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.512284    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.545027    4730 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.549221    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.568518    4730 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.568753    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://2bd02e0e36575d63682737a4f6e6a51da85e3c01b703e67cc6238582df76514f" gracePeriod=5
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.576921    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.602169    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.695217    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.713889    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.778229    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.880099    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 20 15:44:49 crc kubenswrapper[4730]: I0320 15:44:49.930895    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 20 15:44:50 crc kubenswrapper[4730]: I0320 15:44:50.155481    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 20 15:44:50 crc kubenswrapper[4730]: I0320 15:44:50.225091    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 20 15:44:50 crc kubenswrapper[4730]: I0320 15:44:50.346829    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 20 15:44:50 crc kubenswrapper[4730]: I0320 15:44:50.426172    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 20 15:44:50 crc kubenswrapper[4730]: I0320 15:44:50.517154    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 20 15:44:50 crc kubenswrapper[4730]: I0320 15:44:50.532624    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:44:50 crc kubenswrapper[4730]: I0320 15:44:50.540371    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 20 15:44:50 crc kubenswrapper[4730]: I0320 15:44:50.633879    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 20 15:44:50 crc kubenswrapper[4730]: I0320 15:44:50.683169    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 20 15:44:50 crc kubenswrapper[4730]: I0320 15:44:50.706899    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 20 15:44:50 crc kubenswrapper[4730]: I0320 15:44:50.753911    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 20 15:44:50 crc kubenswrapper[4730]: I0320 15:44:50.817891    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 20 15:44:51 crc kubenswrapper[4730]: I0320 15:44:51.005981    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 20 15:44:51 crc kubenswrapper[4730]: I0320 15:44:51.150387    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 20 15:44:51 crc kubenswrapper[4730]: I0320 15:44:51.159953    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 20 15:44:51 crc kubenswrapper[4730]: I0320 15:44:51.219079    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 20 15:44:51 crc kubenswrapper[4730]: I0320 15:44:51.486771    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 20 15:44:51 crc kubenswrapper[4730]: I0320 15:44:51.697226    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 20 15:44:51 crc kubenswrapper[4730]: I0320 15:44:51.735019    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 20 15:44:51 crc kubenswrapper[4730]: I0320 15:44:51.841454    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 20 15:44:51 crc kubenswrapper[4730]: I0320 15:44:51.947589    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 20 15:44:51 crc kubenswrapper[4730]: I0320 15:44:51.992048    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 20 15:44:51 crc kubenswrapper[4730]: I0320 15:44:51.992980    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 20 15:44:52 crc kubenswrapper[4730]: I0320 15:44:52.056100    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 20 15:44:52 crc kubenswrapper[4730]: I0320 15:44:52.066202    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 20 15:44:52 crc kubenswrapper[4730]: I0320 15:44:52.085935    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 20 15:44:52 crc kubenswrapper[4730]: I0320 15:44:52.087375    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 20 15:44:52 crc kubenswrapper[4730]: I0320 15:44:52.227355    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 20 15:44:52 crc kubenswrapper[4730]: I0320 15:44:52.502476    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 20 15:44:52 crc kubenswrapper[4730]: I0320 15:44:52.600682    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 20 15:44:52 crc kubenswrapper[4730]: I0320 15:44:52.659318    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 20 15:44:52 crc kubenswrapper[4730]: I0320 15:44:52.666340    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 20 15:44:52 crc kubenswrapper[4730]: I0320 15:44:52.961164    4730 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 20 15:44:53 crc kubenswrapper[4730]: I0320 15:44:53.007965    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 20 15:44:53 crc kubenswrapper[4730]: I0320 15:44:53.100363    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 20 15:44:53 crc kubenswrapper[4730]: I0320 15:44:53.138425    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 20 15:44:53 crc kubenswrapper[4730]: I0320 15:44:53.468657    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 20 15:44:53 crc kubenswrapper[4730]: I0320 15:44:53.627019    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 20 15:44:53 crc kubenswrapper[4730]: I0320 15:44:53.881044    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 20 15:44:53 crc kubenswrapper[4730]: I0320 15:44:53.950040    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 20 15:44:54 crc kubenswrapper[4730]: I0320 15:44:54.039569    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 20 15:44:54 crc kubenswrapper[4730]: I0320 15:44:54.270387    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 20 15:44:54 crc kubenswrapper[4730]: I0320 15:44:54.295597    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 20 15:44:54 crc kubenswrapper[4730]: I0320 15:44:54.569581    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 20 15:44:54 crc kubenswrapper[4730]: I0320 15:44:54.672006    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Mar 20 15:44:54 crc kubenswrapper[4730]: I0320 15:44:54.672444    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 15:44:54 crc kubenswrapper[4730]: I0320 15:44:54.738963    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 20 15:44:54 crc kubenswrapper[4730]: I0320 15:44:54.746967    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 20 15:44:54 crc kubenswrapper[4730]: I0320 15:44:54.747042    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 20 15:44:54 crc kubenswrapper[4730]: I0320 15:44:54.747107    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 20 15:44:54 crc kubenswrapper[4730]: I0320 15:44:54.747150    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 20 15:44:54 crc kubenswrapper[4730]: I0320 15:44:54.747196    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 20 15:44:54 crc kubenswrapper[4730]: I0320 15:44:54.747144    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 15:44:54 crc kubenswrapper[4730]: I0320 15:44:54.747226    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 15:44:54 crc kubenswrapper[4730]: I0320 15:44:54.747174    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 15:44:54 crc kubenswrapper[4730]: I0320 15:44:54.747461    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 15:44:54 crc kubenswrapper[4730]: I0320 15:44:54.747571    4730 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:54 crc kubenswrapper[4730]: I0320 15:44:54.747584    4730 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:54 crc kubenswrapper[4730]: I0320 15:44:54.747593    4730 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:54 crc kubenswrapper[4730]: I0320 15:44:54.747603    4730 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:54 crc kubenswrapper[4730]: I0320 15:44:54.755407    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 15:44:54 crc kubenswrapper[4730]: I0320 15:44:54.848069    4730 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 20 15:44:54 crc kubenswrapper[4730]: I0320 15:44:54.872179    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 20 15:44:55 crc kubenswrapper[4730]: I0320 15:44:55.285948    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Mar 20 15:44:55 crc kubenswrapper[4730]: I0320 15:44:55.286019    4730 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="2bd02e0e36575d63682737a4f6e6a51da85e3c01b703e67cc6238582df76514f" exitCode=137
Mar 20 15:44:55 crc kubenswrapper[4730]: I0320 15:44:55.286070    4730 scope.go:117] "RemoveContainer" containerID="2bd02e0e36575d63682737a4f6e6a51da85e3c01b703e67cc6238582df76514f"
Mar 20 15:44:55 crc kubenswrapper[4730]: I0320 15:44:55.286208    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 15:44:55 crc kubenswrapper[4730]: I0320 15:44:55.304233    4730 scope.go:117] "RemoveContainer" containerID="2bd02e0e36575d63682737a4f6e6a51da85e3c01b703e67cc6238582df76514f"
Mar 20 15:44:55 crc kubenswrapper[4730]: E0320 15:44:55.304704    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bd02e0e36575d63682737a4f6e6a51da85e3c01b703e67cc6238582df76514f\": container with ID starting with 2bd02e0e36575d63682737a4f6e6a51da85e3c01b703e67cc6238582df76514f not found: ID does not exist" containerID="2bd02e0e36575d63682737a4f6e6a51da85e3c01b703e67cc6238582df76514f"
Mar 20 15:44:55 crc kubenswrapper[4730]: I0320 15:44:55.304812    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bd02e0e36575d63682737a4f6e6a51da85e3c01b703e67cc6238582df76514f"} err="failed to get container status \"2bd02e0e36575d63682737a4f6e6a51da85e3c01b703e67cc6238582df76514f\": rpc error: code = NotFound desc = could not find container \"2bd02e0e36575d63682737a4f6e6a51da85e3c01b703e67cc6238582df76514f\": container with ID starting with 2bd02e0e36575d63682737a4f6e6a51da85e3c01b703e67cc6238582df76514f not found: ID does not exist"
Mar 20 15:44:55 crc kubenswrapper[4730]: I0320 15:44:55.540916    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Mar 20 15:44:55 crc kubenswrapper[4730]: I0320 15:44:55.541683    4730 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID=""
Mar 20 15:44:55 crc kubenswrapper[4730]: I0320 15:44:55.552853    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 20 15:44:55 crc kubenswrapper[4730]: I0320 15:44:55.553109    4730 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="e4577fc5-223a-423f-9bc6-c292e6be96b9"
Mar 20 15:44:55 crc kubenswrapper[4730]: I0320 15:44:55.557340    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 20 15:44:55 crc kubenswrapper[4730]: I0320 15:44:55.557388    4730 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="e4577fc5-223a-423f-9bc6-c292e6be96b9"
Mar 20 15:44:56 crc kubenswrapper[4730]: I0320 15:44:56.017499    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 20 15:45:02 crc kubenswrapper[4730]: I0320 15:45:02.513194    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567025-sp8pk"]
Mar 20 15:45:02 crc kubenswrapper[4730]: E0320 15:45:02.513751    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 20 15:45:02 crc kubenswrapper[4730]: I0320 15:45:02.513767    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 20 15:45:02 crc kubenswrapper[4730]: I0320 15:45:02.513859    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 20 15:45:02 crc kubenswrapper[4730]: I0320 15:45:02.514195    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-sp8pk"
Mar 20 15:45:02 crc kubenswrapper[4730]: I0320 15:45:02.517407    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 20 15:45:02 crc kubenswrapper[4730]: I0320 15:45:02.517519    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 20 15:45:02 crc kubenswrapper[4730]: I0320 15:45:02.525113    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567025-sp8pk"]
Mar 20 15:45:02 crc kubenswrapper[4730]: I0320 15:45:02.648605    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxpvp\" (UniqueName: \"kubernetes.io/projected/db3d4357-8143-45e9-ab45-e55f54735cbc-kube-api-access-pxpvp\") pod \"collect-profiles-29567025-sp8pk\" (UID: \"db3d4357-8143-45e9-ab45-e55f54735cbc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-sp8pk"
Mar 20 15:45:02 crc kubenswrapper[4730]: I0320 15:45:02.648755    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/db3d4357-8143-45e9-ab45-e55f54735cbc-secret-volume\") pod \"collect-profiles-29567025-sp8pk\" (UID: \"db3d4357-8143-45e9-ab45-e55f54735cbc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-sp8pk"
Mar 20 15:45:02 crc kubenswrapper[4730]: I0320 15:45:02.648779    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db3d4357-8143-45e9-ab45-e55f54735cbc-config-volume\") pod \"collect-profiles-29567025-sp8pk\" (UID: \"db3d4357-8143-45e9-ab45-e55f54735cbc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-sp8pk"
Mar 20 15:45:02 crc kubenswrapper[4730]: I0320 15:45:02.749909    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/db3d4357-8143-45e9-ab45-e55f54735cbc-secret-volume\") pod \"collect-profiles-29567025-sp8pk\" (UID: \"db3d4357-8143-45e9-ab45-e55f54735cbc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-sp8pk"
Mar 20 15:45:02 crc kubenswrapper[4730]: I0320 15:45:02.749963    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db3d4357-8143-45e9-ab45-e55f54735cbc-config-volume\") pod \"collect-profiles-29567025-sp8pk\" (UID: \"db3d4357-8143-45e9-ab45-e55f54735cbc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-sp8pk"
Mar 20 15:45:02 crc kubenswrapper[4730]: I0320 15:45:02.750021    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxpvp\" (UniqueName: \"kubernetes.io/projected/db3d4357-8143-45e9-ab45-e55f54735cbc-kube-api-access-pxpvp\") pod \"collect-profiles-29567025-sp8pk\" (UID: \"db3d4357-8143-45e9-ab45-e55f54735cbc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-sp8pk"
Mar 20 15:45:02 crc kubenswrapper[4730]: I0320 15:45:02.752021    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db3d4357-8143-45e9-ab45-e55f54735cbc-config-volume\") pod \"collect-profiles-29567025-sp8pk\" (UID: \"db3d4357-8143-45e9-ab45-e55f54735cbc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-sp8pk"
Mar 20 15:45:02 crc kubenswrapper[4730]: I0320 15:45:02.755681    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/db3d4357-8143-45e9-ab45-e55f54735cbc-secret-volume\") pod \"collect-profiles-29567025-sp8pk\" (UID: \"db3d4357-8143-45e9-ab45-e55f54735cbc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-sp8pk"
Mar 20 15:45:02 crc kubenswrapper[4730]: I0320 15:45:02.770077    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxpvp\" (UniqueName: \"kubernetes.io/projected/db3d4357-8143-45e9-ab45-e55f54735cbc-kube-api-access-pxpvp\") pod \"collect-profiles-29567025-sp8pk\" (UID: \"db3d4357-8143-45e9-ab45-e55f54735cbc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-sp8pk"
Mar 20 15:45:02 crc kubenswrapper[4730]: I0320 15:45:02.849491    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-sp8pk"
Mar 20 15:45:03 crc kubenswrapper[4730]: I0320 15:45:03.247332    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567025-sp8pk"]
Mar 20 15:45:03 crc kubenswrapper[4730]: W0320 15:45:03.252851    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb3d4357_8143_45e9_ab45_e55f54735cbc.slice/crio-40c162368944bc083f86ebaa26c5c852910422e4d4be629097632169724e2f9c WatchSource:0}: Error finding container 40c162368944bc083f86ebaa26c5c852910422e4d4be629097632169724e2f9c: Status 404 returned error can't find the container with id 40c162368944bc083f86ebaa26c5c852910422e4d4be629097632169724e2f9c
Mar 20 15:45:03 crc kubenswrapper[4730]: I0320 15:45:03.332552    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-sp8pk" event={"ID":"db3d4357-8143-45e9-ab45-e55f54735cbc","Type":"ContainerStarted","Data":"40c162368944bc083f86ebaa26c5c852910422e4d4be629097632169724e2f9c"}
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.116548    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt"]
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.117039    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt" podUID="d180ebe8-2390-4a66-a6e0-1d02e256279a" containerName="controller-manager" containerID="cri-o://79c37e5d53d635bd788a7db0e9cd6a218a8e4fb2962907f08447c07acfcacfe6" gracePeriod=30
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.213873    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-754f74fd94-6mnkj"]
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.214103    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-754f74fd94-6mnkj" podUID="ef9a1400-2cee-4019-b907-440c025638b6" containerName="route-controller-manager" containerID="cri-o://3ebe650c86674a5b830721481ad113ef2a916f54a0020fdc9ad70e3a263d5aa6" gracePeriod=30
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.344150    4730 generic.go:334] "Generic (PLEG): container finished" podID="db3d4357-8143-45e9-ab45-e55f54735cbc" containerID="7961ca89ce2a460b127b00611370ac925492414c79b33a7aef5d34aaea8acb7f" exitCode=0
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.344655    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-sp8pk" event={"ID":"db3d4357-8143-45e9-ab45-e55f54735cbc","Type":"ContainerDied","Data":"7961ca89ce2a460b127b00611370ac925492414c79b33a7aef5d34aaea8acb7f"}
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.346453    4730 generic.go:334] "Generic (PLEG): container finished" podID="d180ebe8-2390-4a66-a6e0-1d02e256279a" containerID="79c37e5d53d635bd788a7db0e9cd6a218a8e4fb2962907f08447c07acfcacfe6" exitCode=0
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.346486    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt" event={"ID":"d180ebe8-2390-4a66-a6e0-1d02e256279a","Type":"ContainerDied","Data":"79c37e5d53d635bd788a7db0e9cd6a218a8e4fb2962907f08447c07acfcacfe6"}
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.348210    4730 generic.go:334] "Generic (PLEG): container finished" podID="ef9a1400-2cee-4019-b907-440c025638b6" containerID="3ebe650c86674a5b830721481ad113ef2a916f54a0020fdc9ad70e3a263d5aa6" exitCode=0
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.348272    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-754f74fd94-6mnkj" event={"ID":"ef9a1400-2cee-4019-b907-440c025638b6","Type":"ContainerDied","Data":"3ebe650c86674a5b830721481ad113ef2a916f54a0020fdc9ad70e3a263d5aa6"}
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.657373    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-754f74fd94-6mnkj"
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.665408    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt"
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.775542    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef9a1400-2cee-4019-b907-440c025638b6-serving-cert\") pod \"ef9a1400-2cee-4019-b907-440c025638b6\" (UID: \"ef9a1400-2cee-4019-b907-440c025638b6\") "
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.775581    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d180ebe8-2390-4a66-a6e0-1d02e256279a-config\") pod \"d180ebe8-2390-4a66-a6e0-1d02e256279a\" (UID: \"d180ebe8-2390-4a66-a6e0-1d02e256279a\") "
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.775616    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpzmq\" (UniqueName: \"kubernetes.io/projected/ef9a1400-2cee-4019-b907-440c025638b6-kube-api-access-hpzmq\") pod \"ef9a1400-2cee-4019-b907-440c025638b6\" (UID: \"ef9a1400-2cee-4019-b907-440c025638b6\") "
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.775676    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d180ebe8-2390-4a66-a6e0-1d02e256279a-serving-cert\") pod \"d180ebe8-2390-4a66-a6e0-1d02e256279a\" (UID: \"d180ebe8-2390-4a66-a6e0-1d02e256279a\") "
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.775717    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef9a1400-2cee-4019-b907-440c025638b6-client-ca\") pod \"ef9a1400-2cee-4019-b907-440c025638b6\" (UID: \"ef9a1400-2cee-4019-b907-440c025638b6\") "
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.775743    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d180ebe8-2390-4a66-a6e0-1d02e256279a-client-ca\") pod \"d180ebe8-2390-4a66-a6e0-1d02e256279a\" (UID: \"d180ebe8-2390-4a66-a6e0-1d02e256279a\") "
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.775762    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjzm9\" (UniqueName: \"kubernetes.io/projected/d180ebe8-2390-4a66-a6e0-1d02e256279a-kube-api-access-sjzm9\") pod \"d180ebe8-2390-4a66-a6e0-1d02e256279a\" (UID: \"d180ebe8-2390-4a66-a6e0-1d02e256279a\") "
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.776323    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d180ebe8-2390-4a66-a6e0-1d02e256279a-client-ca" (OuterVolumeSpecName: "client-ca") pod "d180ebe8-2390-4a66-a6e0-1d02e256279a" (UID: "d180ebe8-2390-4a66-a6e0-1d02e256279a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.776419    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef9a1400-2cee-4019-b907-440c025638b6-config\") pod \"ef9a1400-2cee-4019-b907-440c025638b6\" (UID: \"ef9a1400-2cee-4019-b907-440c025638b6\") "
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.776445    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d180ebe8-2390-4a66-a6e0-1d02e256279a-proxy-ca-bundles\") pod \"d180ebe8-2390-4a66-a6e0-1d02e256279a\" (UID: \"d180ebe8-2390-4a66-a6e0-1d02e256279a\") "
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.776484    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d180ebe8-2390-4a66-a6e0-1d02e256279a-config" (OuterVolumeSpecName: "config") pod "d180ebe8-2390-4a66-a6e0-1d02e256279a" (UID: "d180ebe8-2390-4a66-a6e0-1d02e256279a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.776492    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef9a1400-2cee-4019-b907-440c025638b6-client-ca" (OuterVolumeSpecName: "client-ca") pod "ef9a1400-2cee-4019-b907-440c025638b6" (UID: "ef9a1400-2cee-4019-b907-440c025638b6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.776854    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef9a1400-2cee-4019-b907-440c025638b6-config" (OuterVolumeSpecName: "config") pod "ef9a1400-2cee-4019-b907-440c025638b6" (UID: "ef9a1400-2cee-4019-b907-440c025638b6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.777038    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d180ebe8-2390-4a66-a6e0-1d02e256279a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d180ebe8-2390-4a66-a6e0-1d02e256279a" (UID: "d180ebe8-2390-4a66-a6e0-1d02e256279a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.777089    4730 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d180ebe8-2390-4a66-a6e0-1d02e256279a-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.777101    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef9a1400-2cee-4019-b907-440c025638b6-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.777109    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d180ebe8-2390-4a66-a6e0-1d02e256279a-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.777118    4730 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef9a1400-2cee-4019-b907-440c025638b6-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.781821    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d180ebe8-2390-4a66-a6e0-1d02e256279a-kube-api-access-sjzm9" (OuterVolumeSpecName: "kube-api-access-sjzm9") pod "d180ebe8-2390-4a66-a6e0-1d02e256279a" (UID: "d180ebe8-2390-4a66-a6e0-1d02e256279a"). InnerVolumeSpecName "kube-api-access-sjzm9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.781863    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef9a1400-2cee-4019-b907-440c025638b6-kube-api-access-hpzmq" (OuterVolumeSpecName: "kube-api-access-hpzmq") pod "ef9a1400-2cee-4019-b907-440c025638b6" (UID: "ef9a1400-2cee-4019-b907-440c025638b6"). InnerVolumeSpecName "kube-api-access-hpzmq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.782395    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef9a1400-2cee-4019-b907-440c025638b6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ef9a1400-2cee-4019-b907-440c025638b6" (UID: "ef9a1400-2cee-4019-b907-440c025638b6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.785332    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d180ebe8-2390-4a66-a6e0-1d02e256279a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d180ebe8-2390-4a66-a6e0-1d02e256279a" (UID: "d180ebe8-2390-4a66-a6e0-1d02e256279a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.878578    4730 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef9a1400-2cee-4019-b907-440c025638b6-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.878624    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpzmq\" (UniqueName: \"kubernetes.io/projected/ef9a1400-2cee-4019-b907-440c025638b6-kube-api-access-hpzmq\") on node \"crc\" DevicePath \"\""
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.878637    4730 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d180ebe8-2390-4a66-a6e0-1d02e256279a-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.878647    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjzm9\" (UniqueName: \"kubernetes.io/projected/d180ebe8-2390-4a66-a6e0-1d02e256279a-kube-api-access-sjzm9\") on node \"crc\" DevicePath \"\""
Mar 20 15:45:04 crc kubenswrapper[4730]: I0320 15:45:04.878658    4730 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d180ebe8-2390-4a66-a6e0-1d02e256279a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.241993    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5855d88fdc-qmhg5"]
Mar 20 15:45:05 crc kubenswrapper[4730]: E0320 15:45:05.242265    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef9a1400-2cee-4019-b907-440c025638b6" containerName="route-controller-manager"
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.242281    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef9a1400-2cee-4019-b907-440c025638b6" containerName="route-controller-manager"
Mar 20 15:45:05 crc kubenswrapper[4730]: E0320 15:45:05.242298    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d180ebe8-2390-4a66-a6e0-1d02e256279a" containerName="controller-manager"
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.242305    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="d180ebe8-2390-4a66-a6e0-1d02e256279a" containerName="controller-manager"
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.242429    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="d180ebe8-2390-4a66-a6e0-1d02e256279a" containerName="controller-manager"
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.242443    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef9a1400-2cee-4019-b907-440c025638b6" containerName="route-controller-manager"
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.242838    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5855d88fdc-qmhg5"
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.258121    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5855d88fdc-qmhg5"]
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.358553    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-754f74fd94-6mnkj" event={"ID":"ef9a1400-2cee-4019-b907-440c025638b6","Type":"ContainerDied","Data":"dc3dc13f49503d3d06bcb012aeab946187b3c373989eaa5abd5e06eb7c51d310"}
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.358571    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-754f74fd94-6mnkj"
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.358619    4730 scope.go:117] "RemoveContainer" containerID="3ebe650c86674a5b830721481ad113ef2a916f54a0020fdc9ad70e3a263d5aa6"
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.360355    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt"
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.362316    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt" event={"ID":"d180ebe8-2390-4a66-a6e0-1d02e256279a","Type":"ContainerDied","Data":"1f20c0c6cb3f095a9ebf090bce89d3cdbdc4242e46cb281771f335b8d4272464"}
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.383064    4730 scope.go:117] "RemoveContainer" containerID="79c37e5d53d635bd788a7db0e9cd6a218a8e4fb2962907f08447c07acfcacfe6"
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.386179    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c4a9957b-61b8-4536-ae91-fe1f1c12575f-proxy-ca-bundles\") pod \"controller-manager-5855d88fdc-qmhg5\" (UID: \"c4a9957b-61b8-4536-ae91-fe1f1c12575f\") " pod="openshift-controller-manager/controller-manager-5855d88fdc-qmhg5"
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.386276    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4a9957b-61b8-4536-ae91-fe1f1c12575f-config\") pod \"controller-manager-5855d88fdc-qmhg5\" (UID: \"c4a9957b-61b8-4536-ae91-fe1f1c12575f\") " pod="openshift-controller-manager/controller-manager-5855d88fdc-qmhg5"
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.386310    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4a9957b-61b8-4536-ae91-fe1f1c12575f-serving-cert\") pod \"controller-manager-5855d88fdc-qmhg5\" (UID: \"c4a9957b-61b8-4536-ae91-fe1f1c12575f\") " pod="openshift-controller-manager/controller-manager-5855d88fdc-qmhg5"
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.386440    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c4a9957b-61b8-4536-ae91-fe1f1c12575f-client-ca\") pod \"controller-manager-5855d88fdc-qmhg5\" (UID: \"c4a9957b-61b8-4536-ae91-fe1f1c12575f\") " pod="openshift-controller-manager/controller-manager-5855d88fdc-qmhg5"
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.386478    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj57d\" (UniqueName: \"kubernetes.io/projected/c4a9957b-61b8-4536-ae91-fe1f1c12575f-kube-api-access-nj57d\") pod \"controller-manager-5855d88fdc-qmhg5\" (UID: \"c4a9957b-61b8-4536-ae91-fe1f1c12575f\") " pod="openshift-controller-manager/controller-manager-5855d88fdc-qmhg5"
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.397586    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt"]
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.404165    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-86dfd77dcd-rv2mt"]
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.411230    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-754f74fd94-6mnkj"]
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.419400    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-754f74fd94-6mnkj"]
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.489435    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c4a9957b-61b8-4536-ae91-fe1f1c12575f-proxy-ca-bundles\") pod \"controller-manager-5855d88fdc-qmhg5\" (UID: \"c4a9957b-61b8-4536-ae91-fe1f1c12575f\") " pod="openshift-controller-manager/controller-manager-5855d88fdc-qmhg5"
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.489481    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4a9957b-61b8-4536-ae91-fe1f1c12575f-config\") pod \"controller-manager-5855d88fdc-qmhg5\" (UID: \"c4a9957b-61b8-4536-ae91-fe1f1c12575f\") " pod="openshift-controller-manager/controller-manager-5855d88fdc-qmhg5"
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.489510    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4a9957b-61b8-4536-ae91-fe1f1c12575f-serving-cert\") pod \"controller-manager-5855d88fdc-qmhg5\" (UID: \"c4a9957b-61b8-4536-ae91-fe1f1c12575f\") " pod="openshift-controller-manager/controller-manager-5855d88fdc-qmhg5"
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.489570    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c4a9957b-61b8-4536-ae91-fe1f1c12575f-client-ca\") pod \"controller-manager-5855d88fdc-qmhg5\" (UID: \"c4a9957b-61b8-4536-ae91-fe1f1c12575f\") " pod="openshift-controller-manager/controller-manager-5855d88fdc-qmhg5"
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.489602    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj57d\" (UniqueName: \"kubernetes.io/projected/c4a9957b-61b8-4536-ae91-fe1f1c12575f-kube-api-access-nj57d\") pod \"controller-manager-5855d88fdc-qmhg5\" (UID: \"c4a9957b-61b8-4536-ae91-fe1f1c12575f\") " pod="openshift-controller-manager/controller-manager-5855d88fdc-qmhg5"
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.490848    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c4a9957b-61b8-4536-ae91-fe1f1c12575f-client-ca\") pod \"controller-manager-5855d88fdc-qmhg5\" (UID: \"c4a9957b-61b8-4536-ae91-fe1f1c12575f\") " pod="openshift-controller-manager/controller-manager-5855d88fdc-qmhg5"
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.492072    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4a9957b-61b8-4536-ae91-fe1f1c12575f-config\") pod \"controller-manager-5855d88fdc-qmhg5\" (UID: \"c4a9957b-61b8-4536-ae91-fe1f1c12575f\") " pod="openshift-controller-manager/controller-manager-5855d88fdc-qmhg5"
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.492451    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c4a9957b-61b8-4536-ae91-fe1f1c12575f-proxy-ca-bundles\") pod \"controller-manager-5855d88fdc-qmhg5\" (UID: \"c4a9957b-61b8-4536-ae91-fe1f1c12575f\") " pod="openshift-controller-manager/controller-manager-5855d88fdc-qmhg5"
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.494784    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4a9957b-61b8-4536-ae91-fe1f1c12575f-serving-cert\") pod \"controller-manager-5855d88fdc-qmhg5\" (UID: \"c4a9957b-61b8-4536-ae91-fe1f1c12575f\") " pod="openshift-controller-manager/controller-manager-5855d88fdc-qmhg5"
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.511167    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj57d\" (UniqueName: \"kubernetes.io/projected/c4a9957b-61b8-4536-ae91-fe1f1c12575f-kube-api-access-nj57d\") pod \"controller-manager-5855d88fdc-qmhg5\" (UID: \"c4a9957b-61b8-4536-ae91-fe1f1c12575f\") " pod="openshift-controller-manager/controller-manager-5855d88fdc-qmhg5"
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.541441    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d180ebe8-2390-4a66-a6e0-1d02e256279a" path="/var/lib/kubelet/pods/d180ebe8-2390-4a66-a6e0-1d02e256279a/volumes"
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.541996    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef9a1400-2cee-4019-b907-440c025638b6" path="/var/lib/kubelet/pods/ef9a1400-2cee-4019-b907-440c025638b6/volumes"
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.555922    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-sp8pk"
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.557145    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5855d88fdc-qmhg5"
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.693301    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/db3d4357-8143-45e9-ab45-e55f54735cbc-secret-volume\") pod \"db3d4357-8143-45e9-ab45-e55f54735cbc\" (UID: \"db3d4357-8143-45e9-ab45-e55f54735cbc\") "
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.693699    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxpvp\" (UniqueName: \"kubernetes.io/projected/db3d4357-8143-45e9-ab45-e55f54735cbc-kube-api-access-pxpvp\") pod \"db3d4357-8143-45e9-ab45-e55f54735cbc\" (UID: \"db3d4357-8143-45e9-ab45-e55f54735cbc\") "
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.693735    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db3d4357-8143-45e9-ab45-e55f54735cbc-config-volume\") pod \"db3d4357-8143-45e9-ab45-e55f54735cbc\" (UID: \"db3d4357-8143-45e9-ab45-e55f54735cbc\") "
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.694677    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db3d4357-8143-45e9-ab45-e55f54735cbc-config-volume" (OuterVolumeSpecName: "config-volume") pod "db3d4357-8143-45e9-ab45-e55f54735cbc" (UID: "db3d4357-8143-45e9-ab45-e55f54735cbc"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.698845    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db3d4357-8143-45e9-ab45-e55f54735cbc-kube-api-access-pxpvp" (OuterVolumeSpecName: "kube-api-access-pxpvp") pod "db3d4357-8143-45e9-ab45-e55f54735cbc" (UID: "db3d4357-8143-45e9-ab45-e55f54735cbc"). InnerVolumeSpecName "kube-api-access-pxpvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.702799    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db3d4357-8143-45e9-ab45-e55f54735cbc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "db3d4357-8143-45e9-ab45-e55f54735cbc" (UID: "db3d4357-8143-45e9-ab45-e55f54735cbc"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.794997    4730 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/db3d4357-8143-45e9-ab45-e55f54735cbc-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.795036    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxpvp\" (UniqueName: \"kubernetes.io/projected/db3d4357-8143-45e9-ab45-e55f54735cbc-kube-api-access-pxpvp\") on node \"crc\" DevicePath \"\""
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.795050    4730 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db3d4357-8143-45e9-ab45-e55f54735cbc-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 15:45:05 crc kubenswrapper[4730]: I0320 15:45:05.986788    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5855d88fdc-qmhg5"]
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.241988    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79bc85d457-rb8tc"]
Mar 20 15:45:06 crc kubenswrapper[4730]: E0320 15:45:06.242439    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db3d4357-8143-45e9-ab45-e55f54735cbc" containerName="collect-profiles"
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.242454    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="db3d4357-8143-45e9-ab45-e55f54735cbc" containerName="collect-profiles"
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.242575    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="db3d4357-8143-45e9-ab45-e55f54735cbc" containerName="collect-profiles"
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.242992    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-rb8tc"
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.245973    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.245986    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.246036    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.246194    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.246334    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.246729    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.253362    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79bc85d457-rb8tc"]
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.367981    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5855d88fdc-qmhg5" event={"ID":"c4a9957b-61b8-4536-ae91-fe1f1c12575f","Type":"ContainerStarted","Data":"272f9abedbf05198cbcea3f3c4c9c1a9c6f254ef663ed65fee9f8391c9210b16"}
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.368020    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5855d88fdc-qmhg5" event={"ID":"c4a9957b-61b8-4536-ae91-fe1f1c12575f","Type":"ContainerStarted","Data":"85975510d8507912acb9c3c9e341eb489a6bfcf769bf4be7878158b90105c935"}
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.368270    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5855d88fdc-qmhg5"
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.370261    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-sp8pk" event={"ID":"db3d4357-8143-45e9-ab45-e55f54735cbc","Type":"ContainerDied","Data":"40c162368944bc083f86ebaa26c5c852910422e4d4be629097632169724e2f9c"}
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.370303    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40c162368944bc083f86ebaa26c5c852910422e4d4be629097632169724e2f9c"
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.370319    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-sp8pk"
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.397839    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5855d88fdc-qmhg5"
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.403268    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5855d88fdc-qmhg5" podStartSLOduration=2.403231178 podStartE2EDuration="2.403231178s" podCreationTimestamp="2026-03-20 15:45:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:45:06.399486264 +0000 UTC m=+365.612857643" watchObservedRunningTime="2026-03-20 15:45:06.403231178 +0000 UTC m=+365.616602547"
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.405968    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b8a480c-69ef-49fc-83a5-4cd052de69f9-config\") pod \"route-controller-manager-79bc85d457-rb8tc\" (UID: \"5b8a480c-69ef-49fc-83a5-4cd052de69f9\") " pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-rb8tc"
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.406081    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5b8a480c-69ef-49fc-83a5-4cd052de69f9-client-ca\") pod \"route-controller-manager-79bc85d457-rb8tc\" (UID: \"5b8a480c-69ef-49fc-83a5-4cd052de69f9\") " pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-rb8tc"
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.406214    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b8a480c-69ef-49fc-83a5-4cd052de69f9-serving-cert\") pod \"route-controller-manager-79bc85d457-rb8tc\" (UID: \"5b8a480c-69ef-49fc-83a5-4cd052de69f9\") " pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-rb8tc"
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.406399    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzs7x\" (UniqueName: \"kubernetes.io/projected/5b8a480c-69ef-49fc-83a5-4cd052de69f9-kube-api-access-wzs7x\") pod \"route-controller-manager-79bc85d457-rb8tc\" (UID: \"5b8a480c-69ef-49fc-83a5-4cd052de69f9\") " pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-rb8tc"
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.508002    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzs7x\" (UniqueName: \"kubernetes.io/projected/5b8a480c-69ef-49fc-83a5-4cd052de69f9-kube-api-access-wzs7x\") pod \"route-controller-manager-79bc85d457-rb8tc\" (UID: \"5b8a480c-69ef-49fc-83a5-4cd052de69f9\") " pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-rb8tc"
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.508155    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b8a480c-69ef-49fc-83a5-4cd052de69f9-config\") pod \"route-controller-manager-79bc85d457-rb8tc\" (UID: \"5b8a480c-69ef-49fc-83a5-4cd052de69f9\") " pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-rb8tc"
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.508178    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5b8a480c-69ef-49fc-83a5-4cd052de69f9-client-ca\") pod \"route-controller-manager-79bc85d457-rb8tc\" (UID: \"5b8a480c-69ef-49fc-83a5-4cd052de69f9\") " pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-rb8tc"
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.508215    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b8a480c-69ef-49fc-83a5-4cd052de69f9-serving-cert\") pod \"route-controller-manager-79bc85d457-rb8tc\" (UID: \"5b8a480c-69ef-49fc-83a5-4cd052de69f9\") " pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-rb8tc"
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.509342    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5b8a480c-69ef-49fc-83a5-4cd052de69f9-client-ca\") pod \"route-controller-manager-79bc85d457-rb8tc\" (UID: \"5b8a480c-69ef-49fc-83a5-4cd052de69f9\") " pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-rb8tc"
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.509933    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b8a480c-69ef-49fc-83a5-4cd052de69f9-config\") pod \"route-controller-manager-79bc85d457-rb8tc\" (UID: \"5b8a480c-69ef-49fc-83a5-4cd052de69f9\") " pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-rb8tc"
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.511821    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b8a480c-69ef-49fc-83a5-4cd052de69f9-serving-cert\") pod \"route-controller-manager-79bc85d457-rb8tc\" (UID: \"5b8a480c-69ef-49fc-83a5-4cd052de69f9\") " pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-rb8tc"
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.526295    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzs7x\" (UniqueName: \"kubernetes.io/projected/5b8a480c-69ef-49fc-83a5-4cd052de69f9-kube-api-access-wzs7x\") pod \"route-controller-manager-79bc85d457-rb8tc\" (UID: \"5b8a480c-69ef-49fc-83a5-4cd052de69f9\") " pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-rb8tc"
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.562825    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-rb8tc"
Mar 20 15:45:06 crc kubenswrapper[4730]: I0320 15:45:06.954043    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79bc85d457-rb8tc"]
Mar 20 15:45:06 crc kubenswrapper[4730]: W0320 15:45:06.965736    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b8a480c_69ef_49fc_83a5_4cd052de69f9.slice/crio-86f0fd37bd61a9248f93bc6cef29889e8fda36907bdf0b815ae3f15147d600a1 WatchSource:0}: Error finding container 86f0fd37bd61a9248f93bc6cef29889e8fda36907bdf0b815ae3f15147d600a1: Status 404 returned error can't find the container with id 86f0fd37bd61a9248f93bc6cef29889e8fda36907bdf0b815ae3f15147d600a1
Mar 20 15:45:07 crc kubenswrapper[4730]: I0320 15:45:07.376521    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-rb8tc" event={"ID":"5b8a480c-69ef-49fc-83a5-4cd052de69f9","Type":"ContainerStarted","Data":"37e041be660c1bba7e1df6b27c21e4558d7b932a4f7c1f474df6579e15113027"}
Mar 20 15:45:07 crc kubenswrapper[4730]: I0320 15:45:07.376575    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-rb8tc" event={"ID":"5b8a480c-69ef-49fc-83a5-4cd052de69f9","Type":"ContainerStarted","Data":"86f0fd37bd61a9248f93bc6cef29889e8fda36907bdf0b815ae3f15147d600a1"}
Mar 20 15:45:07 crc kubenswrapper[4730]: I0320 15:45:07.395872    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-rb8tc" podStartSLOduration=3.395856479 podStartE2EDuration="3.395856479s" podCreationTimestamp="2026-03-20 15:45:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:45:07.391642591 +0000 UTC m=+366.605013960" watchObservedRunningTime="2026-03-20 15:45:07.395856479 +0000 UTC m=+366.609227848"
Mar 20 15:45:08 crc kubenswrapper[4730]: I0320 15:45:08.381998    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-rb8tc"
Mar 20 15:45:08 crc kubenswrapper[4730]: I0320 15:45:08.387290    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-rb8tc"
Mar 20 15:45:12 crc kubenswrapper[4730]: I0320 15:45:12.406164    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log"
Mar 20 15:45:12 crc kubenswrapper[4730]: I0320 15:45:12.408147    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 20 15:45:12 crc kubenswrapper[4730]: I0320 15:45:12.408847    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 20 15:45:12 crc kubenswrapper[4730]: I0320 15:45:12.408910    4730 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="e1f2bc557068d23d31fb94ba0b8755d440a6066e6a0d9e74613e4436d64e826e" exitCode=137
Mar 20 15:45:12 crc kubenswrapper[4730]: I0320 15:45:12.408948    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"e1f2bc557068d23d31fb94ba0b8755d440a6066e6a0d9e74613e4436d64e826e"}
Mar 20 15:45:12 crc kubenswrapper[4730]: I0320 15:45:12.408985    4730 scope.go:117] "RemoveContainer" containerID="1aee2dcf43ecf6df4a1615aa6e468921053ccb529d3c6dbc2c2ad641e264e606"
Mar 20 15:45:13 crc kubenswrapper[4730]: I0320 15:45:13.416652    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log"
Mar 20 15:45:13 crc kubenswrapper[4730]: I0320 15:45:13.418172    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 20 15:45:13 crc kubenswrapper[4730]: I0320 15:45:13.418235    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c7a4751e0ccbddb326de2ce02b8632520e3bc55d3267079a122c3d02126536b6"}
Mar 20 15:45:16 crc kubenswrapper[4730]: I0320 15:45:16.433777    4730 generic.go:334] "Generic (PLEG): container finished" podID="e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3" containerID="53a23df451c80721daf3b80414ff05d019adbd298f30cf30f417f8af1c2bafc2" exitCode=0
Mar 20 15:45:16 crc kubenswrapper[4730]: I0320 15:45:16.433929    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-klbh8" event={"ID":"e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3","Type":"ContainerDied","Data":"53a23df451c80721daf3b80414ff05d019adbd298f30cf30f417f8af1c2bafc2"}
Mar 20 15:45:16 crc kubenswrapper[4730]: I0320 15:45:16.434713    4730 scope.go:117] "RemoveContainer" containerID="53a23df451c80721daf3b80414ff05d019adbd298f30cf30f417f8af1c2bafc2"
Mar 20 15:45:17 crc kubenswrapper[4730]: I0320 15:45:17.291641    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 15:45:17 crc kubenswrapper[4730]: I0320 15:45:17.439693    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-klbh8" event={"ID":"e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3","Type":"ContainerStarted","Data":"753a6699245a34bcfcd2383bf2298b09146bcf7e23ec3ecf85f51c516941cd06"}
Mar 20 15:45:17 crc kubenswrapper[4730]: I0320 15:45:17.440537    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-klbh8"
Mar 20 15:45:17 crc kubenswrapper[4730]: I0320 15:45:17.442562    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-klbh8"
Mar 20 15:45:22 crc kubenswrapper[4730]: I0320 15:45:22.118713    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 15:45:22 crc kubenswrapper[4730]: I0320 15:45:22.129950    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 15:45:22 crc kubenswrapper[4730]: I0320 15:45:22.468339    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 15:45:29 crc kubenswrapper[4730]: I0320 15:45:29.798911    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5855d88fdc-qmhg5"]
Mar 20 15:45:29 crc kubenswrapper[4730]: I0320 15:45:29.799693    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5855d88fdc-qmhg5" podUID="c4a9957b-61b8-4536-ae91-fe1f1c12575f" containerName="controller-manager" containerID="cri-o://272f9abedbf05198cbcea3f3c4c9c1a9c6f254ef663ed65fee9f8391c9210b16" gracePeriod=30
Mar 20 15:45:29 crc kubenswrapper[4730]: I0320 15:45:29.803963    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79bc85d457-rb8tc"]
Mar 20 15:45:29 crc kubenswrapper[4730]: I0320 15:45:29.811298    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-rb8tc" podUID="5b8a480c-69ef-49fc-83a5-4cd052de69f9" containerName="route-controller-manager" containerID="cri-o://37e041be660c1bba7e1df6b27c21e4558d7b932a4f7c1f474df6579e15113027" gracePeriod=30
Mar 20 15:45:30 crc kubenswrapper[4730]: I0320 15:45:30.369078    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-rb8tc"
Mar 20 15:45:30 crc kubenswrapper[4730]: I0320 15:45:30.517397    4730 generic.go:334] "Generic (PLEG): container finished" podID="c4a9957b-61b8-4536-ae91-fe1f1c12575f" containerID="272f9abedbf05198cbcea3f3c4c9c1a9c6f254ef663ed65fee9f8391c9210b16" exitCode=0
Mar 20 15:45:30 crc kubenswrapper[4730]: I0320 15:45:30.517436    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5855d88fdc-qmhg5" event={"ID":"c4a9957b-61b8-4536-ae91-fe1f1c12575f","Type":"ContainerDied","Data":"272f9abedbf05198cbcea3f3c4c9c1a9c6f254ef663ed65fee9f8391c9210b16"}
Mar 20 15:45:30 crc kubenswrapper[4730]: I0320 15:45:30.518444    4730 generic.go:334] "Generic (PLEG): container finished" podID="5b8a480c-69ef-49fc-83a5-4cd052de69f9" containerID="37e041be660c1bba7e1df6b27c21e4558d7b932a4f7c1f474df6579e15113027" exitCode=0
Mar 20 15:45:30 crc kubenswrapper[4730]: I0320 15:45:30.518472    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-rb8tc" event={"ID":"5b8a480c-69ef-49fc-83a5-4cd052de69f9","Type":"ContainerDied","Data":"37e041be660c1bba7e1df6b27c21e4558d7b932a4f7c1f474df6579e15113027"}
Mar 20 15:45:30 crc kubenswrapper[4730]: I0320 15:45:30.518491    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-rb8tc" event={"ID":"5b8a480c-69ef-49fc-83a5-4cd052de69f9","Type":"ContainerDied","Data":"86f0fd37bd61a9248f93bc6cef29889e8fda36907bdf0b815ae3f15147d600a1"}
Mar 20 15:45:30 crc kubenswrapper[4730]: I0320 15:45:30.518493    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-rb8tc"
Mar 20 15:45:30 crc kubenswrapper[4730]: I0320 15:45:30.518576    4730 scope.go:117] "RemoveContainer" containerID="37e041be660c1bba7e1df6b27c21e4558d7b932a4f7c1f474df6579e15113027"
Mar 20 15:45:30 crc kubenswrapper[4730]: I0320 15:45:30.537024    4730 scope.go:117] "RemoveContainer" containerID="37e041be660c1bba7e1df6b27c21e4558d7b932a4f7c1f474df6579e15113027"
Mar 20 15:45:30 crc kubenswrapper[4730]: E0320 15:45:30.537482    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37e041be660c1bba7e1df6b27c21e4558d7b932a4f7c1f474df6579e15113027\": container with ID starting with 37e041be660c1bba7e1df6b27c21e4558d7b932a4f7c1f474df6579e15113027 not found: ID does not exist" containerID="37e041be660c1bba7e1df6b27c21e4558d7b932a4f7c1f474df6579e15113027"
Mar 20 15:45:30 crc kubenswrapper[4730]: I0320 15:45:30.537624    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37e041be660c1bba7e1df6b27c21e4558d7b932a4f7c1f474df6579e15113027"} err="failed to get container status \"37e041be660c1bba7e1df6b27c21e4558d7b932a4f7c1f474df6579e15113027\": rpc error: code = NotFound desc = could not find container \"37e041be660c1bba7e1df6b27c21e4558d7b932a4f7c1f474df6579e15113027\": container with ID starting with 37e041be660c1bba7e1df6b27c21e4558d7b932a4f7c1f474df6579e15113027 not found: ID does not exist"
Mar 20 15:45:30 crc kubenswrapper[4730]: I0320 15:45:30.557590    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b8a480c-69ef-49fc-83a5-4cd052de69f9-serving-cert\") pod \"5b8a480c-69ef-49fc-83a5-4cd052de69f9\" (UID: \"5b8a480c-69ef-49fc-83a5-4cd052de69f9\") "
Mar 20 15:45:30 crc kubenswrapper[4730]: I0320 15:45:30.557631    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b8a480c-69ef-49fc-83a5-4cd052de69f9-config\") pod \"5b8a480c-69ef-49fc-83a5-4cd052de69f9\" (UID: \"5b8a480c-69ef-49fc-83a5-4cd052de69f9\") "
Mar 20 15:45:30 crc kubenswrapper[4730]: I0320 15:45:30.557692    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzs7x\" (UniqueName: \"kubernetes.io/projected/5b8a480c-69ef-49fc-83a5-4cd052de69f9-kube-api-access-wzs7x\") pod \"5b8a480c-69ef-49fc-83a5-4cd052de69f9\" (UID: \"5b8a480c-69ef-49fc-83a5-4cd052de69f9\") "
Mar 20 15:45:30 crc kubenswrapper[4730]: I0320 15:45:30.557718    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5b8a480c-69ef-49fc-83a5-4cd052de69f9-client-ca\") pod \"5b8a480c-69ef-49fc-83a5-4cd052de69f9\" (UID: \"5b8a480c-69ef-49fc-83a5-4cd052de69f9\") "
Mar 20 15:45:30 crc kubenswrapper[4730]: I0320 15:45:30.558476    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b8a480c-69ef-49fc-83a5-4cd052de69f9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5b8a480c-69ef-49fc-83a5-4cd052de69f9" (UID: "5b8a480c-69ef-49fc-83a5-4cd052de69f9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:45:30 crc kubenswrapper[4730]: I0320 15:45:30.558520    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b8a480c-69ef-49fc-83a5-4cd052de69f9-config" (OuterVolumeSpecName: "config") pod "5b8a480c-69ef-49fc-83a5-4cd052de69f9" (UID: "5b8a480c-69ef-49fc-83a5-4cd052de69f9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:45:30 crc kubenswrapper[4730]: I0320 15:45:30.563689    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b8a480c-69ef-49fc-83a5-4cd052de69f9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5b8a480c-69ef-49fc-83a5-4cd052de69f9" (UID: "5b8a480c-69ef-49fc-83a5-4cd052de69f9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:45:30 crc kubenswrapper[4730]: I0320 15:45:30.564151    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b8a480c-69ef-49fc-83a5-4cd052de69f9-kube-api-access-wzs7x" (OuterVolumeSpecName: "kube-api-access-wzs7x") pod "5b8a480c-69ef-49fc-83a5-4cd052de69f9" (UID: "5b8a480c-69ef-49fc-83a5-4cd052de69f9"). InnerVolumeSpecName "kube-api-access-wzs7x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:45:30 crc kubenswrapper[4730]: I0320 15:45:30.659854    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzs7x\" (UniqueName: \"kubernetes.io/projected/5b8a480c-69ef-49fc-83a5-4cd052de69f9-kube-api-access-wzs7x\") on node \"crc\" DevicePath \"\""
Mar 20 15:45:30 crc kubenswrapper[4730]: I0320 15:45:30.659886    4730 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5b8a480c-69ef-49fc-83a5-4cd052de69f9-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:45:30 crc kubenswrapper[4730]: I0320 15:45:30.659896    4730 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b8a480c-69ef-49fc-83a5-4cd052de69f9-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:45:30 crc kubenswrapper[4730]: I0320 15:45:30.659905    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b8a480c-69ef-49fc-83a5-4cd052de69f9-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:45:30 crc kubenswrapper[4730]: I0320 15:45:30.841314    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79bc85d457-rb8tc"]
Mar 20 15:45:30 crc kubenswrapper[4730]: I0320 15:45:30.846358    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79bc85d457-rb8tc"]
Mar 20 15:45:30 crc kubenswrapper[4730]: I0320 15:45:30.950276    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5855d88fdc-qmhg5"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.065237    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4a9957b-61b8-4536-ae91-fe1f1c12575f-config\") pod \"c4a9957b-61b8-4536-ae91-fe1f1c12575f\" (UID: \"c4a9957b-61b8-4536-ae91-fe1f1c12575f\") "
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.065511    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c4a9957b-61b8-4536-ae91-fe1f1c12575f-client-ca\") pod \"c4a9957b-61b8-4536-ae91-fe1f1c12575f\" (UID: \"c4a9957b-61b8-4536-ae91-fe1f1c12575f\") "
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.065535    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nj57d\" (UniqueName: \"kubernetes.io/projected/c4a9957b-61b8-4536-ae91-fe1f1c12575f-kube-api-access-nj57d\") pod \"c4a9957b-61b8-4536-ae91-fe1f1c12575f\" (UID: \"c4a9957b-61b8-4536-ae91-fe1f1c12575f\") "
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.065556    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c4a9957b-61b8-4536-ae91-fe1f1c12575f-proxy-ca-bundles\") pod \"c4a9957b-61b8-4536-ae91-fe1f1c12575f\" (UID: \"c4a9957b-61b8-4536-ae91-fe1f1c12575f\") "
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.065646    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4a9957b-61b8-4536-ae91-fe1f1c12575f-serving-cert\") pod \"c4a9957b-61b8-4536-ae91-fe1f1c12575f\" (UID: \"c4a9957b-61b8-4536-ae91-fe1f1c12575f\") "
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.066467    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4a9957b-61b8-4536-ae91-fe1f1c12575f-config" (OuterVolumeSpecName: "config") pod "c4a9957b-61b8-4536-ae91-fe1f1c12575f" (UID: "c4a9957b-61b8-4536-ae91-fe1f1c12575f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.066504    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4a9957b-61b8-4536-ae91-fe1f1c12575f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c4a9957b-61b8-4536-ae91-fe1f1c12575f" (UID: "c4a9957b-61b8-4536-ae91-fe1f1c12575f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.066679    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4a9957b-61b8-4536-ae91-fe1f1c12575f-client-ca" (OuterVolumeSpecName: "client-ca") pod "c4a9957b-61b8-4536-ae91-fe1f1c12575f" (UID: "c4a9957b-61b8-4536-ae91-fe1f1c12575f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.068709    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4a9957b-61b8-4536-ae91-fe1f1c12575f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c4a9957b-61b8-4536-ae91-fe1f1c12575f" (UID: "c4a9957b-61b8-4536-ae91-fe1f1c12575f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.069394    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4a9957b-61b8-4536-ae91-fe1f1c12575f-kube-api-access-nj57d" (OuterVolumeSpecName: "kube-api-access-nj57d") pod "c4a9957b-61b8-4536-ae91-fe1f1c12575f" (UID: "c4a9957b-61b8-4536-ae91-fe1f1c12575f"). InnerVolumeSpecName "kube-api-access-nj57d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.166806    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4a9957b-61b8-4536-ae91-fe1f1c12575f-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.166864    4730 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c4a9957b-61b8-4536-ae91-fe1f1c12575f-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.166878    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nj57d\" (UniqueName: \"kubernetes.io/projected/c4a9957b-61b8-4536-ae91-fe1f1c12575f-kube-api-access-nj57d\") on node \"crc\" DevicePath \"\""
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.166893    4730 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c4a9957b-61b8-4536-ae91-fe1f1c12575f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.166906    4730 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4a9957b-61b8-4536-ae91-fe1f1c12575f-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.266436    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-887974f98-g5dnn"]
Mar 20 15:45:31 crc kubenswrapper[4730]: E0320 15:45:31.266735    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b8a480c-69ef-49fc-83a5-4cd052de69f9" containerName="route-controller-manager"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.266755    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b8a480c-69ef-49fc-83a5-4cd052de69f9" containerName="route-controller-manager"
Mar 20 15:45:31 crc kubenswrapper[4730]: E0320 15:45:31.266767    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4a9957b-61b8-4536-ae91-fe1f1c12575f" containerName="controller-manager"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.266776    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4a9957b-61b8-4536-ae91-fe1f1c12575f" containerName="controller-manager"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.266912    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b8a480c-69ef-49fc-83a5-4cd052de69f9" containerName="route-controller-manager"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.266925    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4a9957b-61b8-4536-ae91-fe1f1c12575f" containerName="controller-manager"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.267483    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-887974f98-g5dnn"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.268029    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6f367c6-14a0-4683-aff1-d6fbe6cc3a47-config\") pod \"route-controller-manager-887974f98-g5dnn\" (UID: \"f6f367c6-14a0-4683-aff1-d6fbe6cc3a47\") " pod="openshift-route-controller-manager/route-controller-manager-887974f98-g5dnn"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.268085    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6f367c6-14a0-4683-aff1-d6fbe6cc3a47-serving-cert\") pod \"route-controller-manager-887974f98-g5dnn\" (UID: \"f6f367c6-14a0-4683-aff1-d6fbe6cc3a47\") " pod="openshift-route-controller-manager/route-controller-manager-887974f98-g5dnn"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.268114    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbk4s\" (UniqueName: \"kubernetes.io/projected/f6f367c6-14a0-4683-aff1-d6fbe6cc3a47-kube-api-access-rbk4s\") pod \"route-controller-manager-887974f98-g5dnn\" (UID: \"f6f367c6-14a0-4683-aff1-d6fbe6cc3a47\") " pod="openshift-route-controller-manager/route-controller-manager-887974f98-g5dnn"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.268292    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6f367c6-14a0-4683-aff1-d6fbe6cc3a47-client-ca\") pod \"route-controller-manager-887974f98-g5dnn\" (UID: \"f6f367c6-14a0-4683-aff1-d6fbe6cc3a47\") " pod="openshift-route-controller-manager/route-controller-manager-887974f98-g5dnn"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.272100    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6677d585cb-h45gr"]
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.272945    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6677d585cb-h45gr"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.273622    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.273895    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.274216    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.274591    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.276020    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.276200    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.279654    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-887974f98-g5dnn"]
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.283767    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6677d585cb-h45gr"]
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.370075    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58959cc0-9ad2-48f9-83da-c33525b8919d-config\") pod \"controller-manager-6677d585cb-h45gr\" (UID: \"58959cc0-9ad2-48f9-83da-c33525b8919d\") " pod="openshift-controller-manager/controller-manager-6677d585cb-h45gr"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.370181    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6f367c6-14a0-4683-aff1-d6fbe6cc3a47-config\") pod \"route-controller-manager-887974f98-g5dnn\" (UID: \"f6f367c6-14a0-4683-aff1-d6fbe6cc3a47\") " pod="openshift-route-controller-manager/route-controller-manager-887974f98-g5dnn"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.370210    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58959cc0-9ad2-48f9-83da-c33525b8919d-client-ca\") pod \"controller-manager-6677d585cb-h45gr\" (UID: \"58959cc0-9ad2-48f9-83da-c33525b8919d\") " pod="openshift-controller-manager/controller-manager-6677d585cb-h45gr"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.370239    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6f367c6-14a0-4683-aff1-d6fbe6cc3a47-serving-cert\") pod \"route-controller-manager-887974f98-g5dnn\" (UID: \"f6f367c6-14a0-4683-aff1-d6fbe6cc3a47\") " pod="openshift-route-controller-manager/route-controller-manager-887974f98-g5dnn"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.370336    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbk4s\" (UniqueName: \"kubernetes.io/projected/f6f367c6-14a0-4683-aff1-d6fbe6cc3a47-kube-api-access-rbk4s\") pod \"route-controller-manager-887974f98-g5dnn\" (UID: \"f6f367c6-14a0-4683-aff1-d6fbe6cc3a47\") " pod="openshift-route-controller-manager/route-controller-manager-887974f98-g5dnn"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.370820    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58959cc0-9ad2-48f9-83da-c33525b8919d-serving-cert\") pod \"controller-manager-6677d585cb-h45gr\" (UID: \"58959cc0-9ad2-48f9-83da-c33525b8919d\") " pod="openshift-controller-manager/controller-manager-6677d585cb-h45gr"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.370875    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2ml4\" (UniqueName: \"kubernetes.io/projected/58959cc0-9ad2-48f9-83da-c33525b8919d-kube-api-access-k2ml4\") pod \"controller-manager-6677d585cb-h45gr\" (UID: \"58959cc0-9ad2-48f9-83da-c33525b8919d\") " pod="openshift-controller-manager/controller-manager-6677d585cb-h45gr"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.370896    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/58959cc0-9ad2-48f9-83da-c33525b8919d-proxy-ca-bundles\") pod \"controller-manager-6677d585cb-h45gr\" (UID: \"58959cc0-9ad2-48f9-83da-c33525b8919d\") " pod="openshift-controller-manager/controller-manager-6677d585cb-h45gr"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.370987    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6f367c6-14a0-4683-aff1-d6fbe6cc3a47-client-ca\") pod \"route-controller-manager-887974f98-g5dnn\" (UID: \"f6f367c6-14a0-4683-aff1-d6fbe6cc3a47\") " pod="openshift-route-controller-manager/route-controller-manager-887974f98-g5dnn"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.371727    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6f367c6-14a0-4683-aff1-d6fbe6cc3a47-client-ca\") pod \"route-controller-manager-887974f98-g5dnn\" (UID: \"f6f367c6-14a0-4683-aff1-d6fbe6cc3a47\") " pod="openshift-route-controller-manager/route-controller-manager-887974f98-g5dnn"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.371884    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6f367c6-14a0-4683-aff1-d6fbe6cc3a47-config\") pod \"route-controller-manager-887974f98-g5dnn\" (UID: \"f6f367c6-14a0-4683-aff1-d6fbe6cc3a47\") " pod="openshift-route-controller-manager/route-controller-manager-887974f98-g5dnn"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.376182    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6f367c6-14a0-4683-aff1-d6fbe6cc3a47-serving-cert\") pod \"route-controller-manager-887974f98-g5dnn\" (UID: \"f6f367c6-14a0-4683-aff1-d6fbe6cc3a47\") " pod="openshift-route-controller-manager/route-controller-manager-887974f98-g5dnn"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.389064    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbk4s\" (UniqueName: \"kubernetes.io/projected/f6f367c6-14a0-4683-aff1-d6fbe6cc3a47-kube-api-access-rbk4s\") pod \"route-controller-manager-887974f98-g5dnn\" (UID: \"f6f367c6-14a0-4683-aff1-d6fbe6cc3a47\") " pod="openshift-route-controller-manager/route-controller-manager-887974f98-g5dnn"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.472453    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2ml4\" (UniqueName: \"kubernetes.io/projected/58959cc0-9ad2-48f9-83da-c33525b8919d-kube-api-access-k2ml4\") pod \"controller-manager-6677d585cb-h45gr\" (UID: \"58959cc0-9ad2-48f9-83da-c33525b8919d\") " pod="openshift-controller-manager/controller-manager-6677d585cb-h45gr"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.472544    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/58959cc0-9ad2-48f9-83da-c33525b8919d-proxy-ca-bundles\") pod \"controller-manager-6677d585cb-h45gr\" (UID: \"58959cc0-9ad2-48f9-83da-c33525b8919d\") " pod="openshift-controller-manager/controller-manager-6677d585cb-h45gr"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.472679    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58959cc0-9ad2-48f9-83da-c33525b8919d-config\") pod \"controller-manager-6677d585cb-h45gr\" (UID: \"58959cc0-9ad2-48f9-83da-c33525b8919d\") " pod="openshift-controller-manager/controller-manager-6677d585cb-h45gr"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.472754    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58959cc0-9ad2-48f9-83da-c33525b8919d-client-ca\") pod \"controller-manager-6677d585cb-h45gr\" (UID: \"58959cc0-9ad2-48f9-83da-c33525b8919d\") " pod="openshift-controller-manager/controller-manager-6677d585cb-h45gr"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.472810    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58959cc0-9ad2-48f9-83da-c33525b8919d-serving-cert\") pod \"controller-manager-6677d585cb-h45gr\" (UID: \"58959cc0-9ad2-48f9-83da-c33525b8919d\") " pod="openshift-controller-manager/controller-manager-6677d585cb-h45gr"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.473715    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58959cc0-9ad2-48f9-83da-c33525b8919d-client-ca\") pod \"controller-manager-6677d585cb-h45gr\" (UID: \"58959cc0-9ad2-48f9-83da-c33525b8919d\") " pod="openshift-controller-manager/controller-manager-6677d585cb-h45gr"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.474399    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58959cc0-9ad2-48f9-83da-c33525b8919d-config\") pod \"controller-manager-6677d585cb-h45gr\" (UID: \"58959cc0-9ad2-48f9-83da-c33525b8919d\") " pod="openshift-controller-manager/controller-manager-6677d585cb-h45gr"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.475563    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/58959cc0-9ad2-48f9-83da-c33525b8919d-proxy-ca-bundles\") pod \"controller-manager-6677d585cb-h45gr\" (UID: \"58959cc0-9ad2-48f9-83da-c33525b8919d\") " pod="openshift-controller-manager/controller-manager-6677d585cb-h45gr"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.476627    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58959cc0-9ad2-48f9-83da-c33525b8919d-serving-cert\") pod \"controller-manager-6677d585cb-h45gr\" (UID: \"58959cc0-9ad2-48f9-83da-c33525b8919d\") " pod="openshift-controller-manager/controller-manager-6677d585cb-h45gr"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.488989    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2ml4\" (UniqueName: \"kubernetes.io/projected/58959cc0-9ad2-48f9-83da-c33525b8919d-kube-api-access-k2ml4\") pod \"controller-manager-6677d585cb-h45gr\" (UID: \"58959cc0-9ad2-48f9-83da-c33525b8919d\") " pod="openshift-controller-manager/controller-manager-6677d585cb-h45gr"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.525833    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5855d88fdc-qmhg5" event={"ID":"c4a9957b-61b8-4536-ae91-fe1f1c12575f","Type":"ContainerDied","Data":"85975510d8507912acb9c3c9e341eb489a6bfcf769bf4be7878158b90105c935"}
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.525893    4730 scope.go:117] "RemoveContainer" containerID="272f9abedbf05198cbcea3f3c4c9c1a9c6f254ef663ed65fee9f8391c9210b16"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.525925    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5855d88fdc-qmhg5"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.546196    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b8a480c-69ef-49fc-83a5-4cd052de69f9" path="/var/lib/kubelet/pods/5b8a480c-69ef-49fc-83a5-4cd052de69f9/volumes"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.568917    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5855d88fdc-qmhg5"]
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.572032    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5855d88fdc-qmhg5"]
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.592209    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-887974f98-g5dnn"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.617215    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6677d585cb-h45gr"
Mar 20 15:45:31 crc kubenswrapper[4730]: I0320 15:45:31.995958    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-887974f98-g5dnn"]
Mar 20 15:45:32 crc kubenswrapper[4730]: W0320 15:45:32.001764    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6f367c6_14a0_4683_aff1_d6fbe6cc3a47.slice/crio-4655d39dd1465fdb2c59ebf55c060fefff1bcbcd7ef3b0a27cf18646bd27a07f WatchSource:0}: Error finding container 4655d39dd1465fdb2c59ebf55c060fefff1bcbcd7ef3b0a27cf18646bd27a07f: Status 404 returned error can't find the container with id 4655d39dd1465fdb2c59ebf55c060fefff1bcbcd7ef3b0a27cf18646bd27a07f
Mar 20 15:45:32 crc kubenswrapper[4730]: I0320 15:45:32.077359    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6677d585cb-h45gr"]
Mar 20 15:45:32 crc kubenswrapper[4730]: W0320 15:45:32.090262    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58959cc0_9ad2_48f9_83da_c33525b8919d.slice/crio-192de1853e19e7668e1d1800d2ca4b5c1e7f274d95cbee877dd11e903eda6ae2 WatchSource:0}: Error finding container 192de1853e19e7668e1d1800d2ca4b5c1e7f274d95cbee877dd11e903eda6ae2: Status 404 returned error can't find the container with id 192de1853e19e7668e1d1800d2ca4b5c1e7f274d95cbee877dd11e903eda6ae2
Mar 20 15:45:32 crc kubenswrapper[4730]: I0320 15:45:32.534300    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6677d585cb-h45gr" event={"ID":"58959cc0-9ad2-48f9-83da-c33525b8919d","Type":"ContainerStarted","Data":"79603ae8fc3af2a5353e289038a22a13e7af42a5609045c61b779ef68aeeb5fe"}
Mar 20 15:45:32 crc kubenswrapper[4730]: I0320 15:45:32.534350    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6677d585cb-h45gr" event={"ID":"58959cc0-9ad2-48f9-83da-c33525b8919d","Type":"ContainerStarted","Data":"192de1853e19e7668e1d1800d2ca4b5c1e7f274d95cbee877dd11e903eda6ae2"}
Mar 20 15:45:32 crc kubenswrapper[4730]: I0320 15:45:32.534469    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6677d585cb-h45gr"
Mar 20 15:45:32 crc kubenswrapper[4730]: I0320 15:45:32.535592    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-887974f98-g5dnn" event={"ID":"f6f367c6-14a0-4683-aff1-d6fbe6cc3a47","Type":"ContainerStarted","Data":"093154fad0b7d76c4d87e1557430d65a3241a4160f21ee3af7aa8053222b0da7"}
Mar 20 15:45:32 crc kubenswrapper[4730]: I0320 15:45:32.535615    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-887974f98-g5dnn" event={"ID":"f6f367c6-14a0-4683-aff1-d6fbe6cc3a47","Type":"ContainerStarted","Data":"4655d39dd1465fdb2c59ebf55c060fefff1bcbcd7ef3b0a27cf18646bd27a07f"}
Mar 20 15:45:32 crc kubenswrapper[4730]: I0320 15:45:32.535802    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-887974f98-g5dnn"
Mar 20 15:45:32 crc kubenswrapper[4730]: I0320 15:45:32.540869    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6677d585cb-h45gr"
Mar 20 15:45:32 crc kubenswrapper[4730]: I0320 15:45:32.575740    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6677d585cb-h45gr" podStartSLOduration=3.57571926 podStartE2EDuration="3.57571926s" podCreationTimestamp="2026-03-20 15:45:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:45:32.555798633 +0000 UTC m=+391.769170002" watchObservedRunningTime="2026-03-20 15:45:32.57571926 +0000 UTC m=+391.789090629"
Mar 20 15:45:32 crc kubenswrapper[4730]: I0320 15:45:32.576467    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-887974f98-g5dnn" podStartSLOduration=3.576460313 podStartE2EDuration="3.576460313s" podCreationTimestamp="2026-03-20 15:45:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:45:32.571946695 +0000 UTC m=+391.785318074" watchObservedRunningTime="2026-03-20 15:45:32.576460313 +0000 UTC m=+391.789831682"
Mar 20 15:45:32 crc kubenswrapper[4730]: I0320 15:45:32.813353    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-887974f98-g5dnn"
Mar 20 15:45:33 crc kubenswrapper[4730]: I0320 15:45:33.550880    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4a9957b-61b8-4536-ae91-fe1f1c12575f" path="/var/lib/kubelet/pods/c4a9957b-61b8-4536-ae91-fe1f1c12575f/volumes"
Mar 20 15:45:48 crc kubenswrapper[4730]: I0320 15:45:48.592705    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qmxvf"]
Mar 20 15:45:48 crc kubenswrapper[4730]: I0320 15:45:48.593581    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qmxvf" podUID="ab6c90a0-1bc1-476d-8526-d1fe438163e3" containerName="registry-server" containerID="cri-o://9a52c050b4986758df8c76456386e12221d78e4ea6fa2b1c10d15807ad001b19" gracePeriod=2
Mar 20 15:45:49 crc kubenswrapper[4730]: I0320 15:45:49.632851    4730 generic.go:334] "Generic (PLEG): container finished" podID="ab6c90a0-1bc1-476d-8526-d1fe438163e3" containerID="9a52c050b4986758df8c76456386e12221d78e4ea6fa2b1c10d15807ad001b19" exitCode=0
Mar 20 15:45:49 crc kubenswrapper[4730]: I0320 15:45:49.633015    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qmxvf" event={"ID":"ab6c90a0-1bc1-476d-8526-d1fe438163e3","Type":"ContainerDied","Data":"9a52c050b4986758df8c76456386e12221d78e4ea6fa2b1c10d15807ad001b19"}
Mar 20 15:45:49 crc kubenswrapper[4730]: I0320 15:45:49.777171    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qmxvf"
Mar 20 15:45:49 crc kubenswrapper[4730]: I0320 15:45:49.922964    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c86g2\" (UniqueName: \"kubernetes.io/projected/ab6c90a0-1bc1-476d-8526-d1fe438163e3-kube-api-access-c86g2\") pod \"ab6c90a0-1bc1-476d-8526-d1fe438163e3\" (UID: \"ab6c90a0-1bc1-476d-8526-d1fe438163e3\") "
Mar 20 15:45:49 crc kubenswrapper[4730]: I0320 15:45:49.923114    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab6c90a0-1bc1-476d-8526-d1fe438163e3-utilities\") pod \"ab6c90a0-1bc1-476d-8526-d1fe438163e3\" (UID: \"ab6c90a0-1bc1-476d-8526-d1fe438163e3\") "
Mar 20 15:45:49 crc kubenswrapper[4730]: I0320 15:45:49.923172    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab6c90a0-1bc1-476d-8526-d1fe438163e3-catalog-content\") pod \"ab6c90a0-1bc1-476d-8526-d1fe438163e3\" (UID: \"ab6c90a0-1bc1-476d-8526-d1fe438163e3\") "
Mar 20 15:45:49 crc kubenswrapper[4730]: I0320 15:45:49.924185    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab6c90a0-1bc1-476d-8526-d1fe438163e3-utilities" (OuterVolumeSpecName: "utilities") pod "ab6c90a0-1bc1-476d-8526-d1fe438163e3" (UID: "ab6c90a0-1bc1-476d-8526-d1fe438163e3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:45:49 crc kubenswrapper[4730]: I0320 15:45:49.932975    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab6c90a0-1bc1-476d-8526-d1fe438163e3-kube-api-access-c86g2" (OuterVolumeSpecName: "kube-api-access-c86g2") pod "ab6c90a0-1bc1-476d-8526-d1fe438163e3" (UID: "ab6c90a0-1bc1-476d-8526-d1fe438163e3"). InnerVolumeSpecName "kube-api-access-c86g2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:45:50 crc kubenswrapper[4730]: I0320 15:45:50.024314    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab6c90a0-1bc1-476d-8526-d1fe438163e3-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 15:45:50 crc kubenswrapper[4730]: I0320 15:45:50.024375    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c86g2\" (UniqueName: \"kubernetes.io/projected/ab6c90a0-1bc1-476d-8526-d1fe438163e3-kube-api-access-c86g2\") on node \"crc\" DevicePath \"\""
Mar 20 15:45:50 crc kubenswrapper[4730]: I0320 15:45:50.043558    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab6c90a0-1bc1-476d-8526-d1fe438163e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ab6c90a0-1bc1-476d-8526-d1fe438163e3" (UID: "ab6c90a0-1bc1-476d-8526-d1fe438163e3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:45:50 crc kubenswrapper[4730]: I0320 15:45:50.125612    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab6c90a0-1bc1-476d-8526-d1fe438163e3-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 15:45:50 crc kubenswrapper[4730]: I0320 15:45:50.641401    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qmxvf" event={"ID":"ab6c90a0-1bc1-476d-8526-d1fe438163e3","Type":"ContainerDied","Data":"3663f63ee1b2ff284c18b50ae774d9b77f54d7f929ad2803b36f2f39057f8d54"}
Mar 20 15:45:50 crc kubenswrapper[4730]: I0320 15:45:50.641787    4730 scope.go:117] "RemoveContainer" containerID="9a52c050b4986758df8c76456386e12221d78e4ea6fa2b1c10d15807ad001b19"
Mar 20 15:45:50 crc kubenswrapper[4730]: I0320 15:45:50.641508    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qmxvf"
Mar 20 15:45:50 crc kubenswrapper[4730]: I0320 15:45:50.658793    4730 scope.go:117] "RemoveContainer" containerID="1d25cea84c8b33aa09a01d2a67ef03e54e2640ab453060480220fbbf97ebde61"
Mar 20 15:45:50 crc kubenswrapper[4730]: I0320 15:45:50.675697    4730 scope.go:117] "RemoveContainer" containerID="6a84b3881231514a341a3dd596d04f1b46d9c8c10a2296046ee0ddb6c55675a3"
Mar 20 15:45:50 crc kubenswrapper[4730]: I0320 15:45:50.707864    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qmxvf"]
Mar 20 15:45:50 crc kubenswrapper[4730]: I0320 15:45:50.710981    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qmxvf"]
Mar 20 15:45:51 crc kubenswrapper[4730]: I0320 15:45:51.539620    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab6c90a0-1bc1-476d-8526-d1fe438163e3" path="/var/lib/kubelet/pods/ab6c90a0-1bc1-476d-8526-d1fe438163e3/volumes"
Mar 20 15:46:00 crc kubenswrapper[4730]: I0320 15:46:00.170895    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567026-x7pgl"]
Mar 20 15:46:00 crc kubenswrapper[4730]: E0320 15:46:00.171682    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab6c90a0-1bc1-476d-8526-d1fe438163e3" containerName="extract-content"
Mar 20 15:46:00 crc kubenswrapper[4730]: I0320 15:46:00.171699    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab6c90a0-1bc1-476d-8526-d1fe438163e3" containerName="extract-content"
Mar 20 15:46:00 crc kubenswrapper[4730]: E0320 15:46:00.171709    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab6c90a0-1bc1-476d-8526-d1fe438163e3" containerName="registry-server"
Mar 20 15:46:00 crc kubenswrapper[4730]: I0320 15:46:00.171716    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab6c90a0-1bc1-476d-8526-d1fe438163e3" containerName="registry-server"
Mar 20 15:46:00 crc kubenswrapper[4730]: E0320 15:46:00.171729    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab6c90a0-1bc1-476d-8526-d1fe438163e3" containerName="extract-utilities"
Mar 20 15:46:00 crc kubenswrapper[4730]: I0320 15:46:00.171737    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab6c90a0-1bc1-476d-8526-d1fe438163e3" containerName="extract-utilities"
Mar 20 15:46:00 crc kubenswrapper[4730]: I0320 15:46:00.171850    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab6c90a0-1bc1-476d-8526-d1fe438163e3" containerName="registry-server"
Mar 20 15:46:00 crc kubenswrapper[4730]: I0320 15:46:00.172290    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567026-x7pgl"
Mar 20 15:46:00 crc kubenswrapper[4730]: I0320 15:46:00.174405    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 15:46:00 crc kubenswrapper[4730]: I0320 15:46:00.174742    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 15:46:00 crc kubenswrapper[4730]: I0320 15:46:00.179447    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567026-x7pgl"]
Mar 20 15:46:00 crc kubenswrapper[4730]: I0320 15:46:00.191683    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 15:46:00 crc kubenswrapper[4730]: I0320 15:46:00.284067    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqprb\" (UniqueName: \"kubernetes.io/projected/2acd72a0-988c-4c58-a7b4-c139ee0f6ef1-kube-api-access-vqprb\") pod \"auto-csr-approver-29567026-x7pgl\" (UID: \"2acd72a0-988c-4c58-a7b4-c139ee0f6ef1\") " pod="openshift-infra/auto-csr-approver-29567026-x7pgl"
Mar 20 15:46:00 crc kubenswrapper[4730]: I0320 15:46:00.385413    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqprb\" (UniqueName: \"kubernetes.io/projected/2acd72a0-988c-4c58-a7b4-c139ee0f6ef1-kube-api-access-vqprb\") pod \"auto-csr-approver-29567026-x7pgl\" (UID: \"2acd72a0-988c-4c58-a7b4-c139ee0f6ef1\") " pod="openshift-infra/auto-csr-approver-29567026-x7pgl"
Mar 20 15:46:00 crc kubenswrapper[4730]: I0320 15:46:00.403979    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqprb\" (UniqueName: \"kubernetes.io/projected/2acd72a0-988c-4c58-a7b4-c139ee0f6ef1-kube-api-access-vqprb\") pod \"auto-csr-approver-29567026-x7pgl\" (UID: \"2acd72a0-988c-4c58-a7b4-c139ee0f6ef1\") " pod="openshift-infra/auto-csr-approver-29567026-x7pgl"
Mar 20 15:46:00 crc kubenswrapper[4730]: I0320 15:46:00.495689    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567026-x7pgl"
Mar 20 15:46:00 crc kubenswrapper[4730]: I0320 15:46:00.900358    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567026-x7pgl"]
Mar 20 15:46:00 crc kubenswrapper[4730]: W0320 15:46:00.904424    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2acd72a0_988c_4c58_a7b4_c139ee0f6ef1.slice/crio-09c0170823fe5be12229dc19d56bbd7300e9039e431e3bc9f49d576558f3e4ad WatchSource:0}: Error finding container 09c0170823fe5be12229dc19d56bbd7300e9039e431e3bc9f49d576558f3e4ad: Status 404 returned error can't find the container with id 09c0170823fe5be12229dc19d56bbd7300e9039e431e3bc9f49d576558f3e4ad
Mar 20 15:46:01 crc kubenswrapper[4730]: I0320 15:46:01.696561    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567026-x7pgl" event={"ID":"2acd72a0-988c-4c58-a7b4-c139ee0f6ef1","Type":"ContainerStarted","Data":"09c0170823fe5be12229dc19d56bbd7300e9039e431e3bc9f49d576558f3e4ad"}
Mar 20 15:46:02 crc kubenswrapper[4730]: I0320 15:46:02.702719    4730 generic.go:334] "Generic (PLEG): container finished" podID="2acd72a0-988c-4c58-a7b4-c139ee0f6ef1" containerID="b5ebe6b01434979e266e3872ff5405b028a732d1dd5830a3d6f3ad270518946a" exitCode=0
Mar 20 15:46:02 crc kubenswrapper[4730]: I0320 15:46:02.702971    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567026-x7pgl" event={"ID":"2acd72a0-988c-4c58-a7b4-c139ee0f6ef1","Type":"ContainerDied","Data":"b5ebe6b01434979e266e3872ff5405b028a732d1dd5830a3d6f3ad270518946a"}
Mar 20 15:46:04 crc kubenswrapper[4730]: I0320 15:46:04.008834    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567026-x7pgl"
Mar 20 15:46:04 crc kubenswrapper[4730]: I0320 15:46:04.119309    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6677d585cb-h45gr"]
Mar 20 15:46:04 crc kubenswrapper[4730]: I0320 15:46:04.122304    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6677d585cb-h45gr" podUID="58959cc0-9ad2-48f9-83da-c33525b8919d" containerName="controller-manager" containerID="cri-o://79603ae8fc3af2a5353e289038a22a13e7af42a5609045c61b779ef68aeeb5fe" gracePeriod=30
Mar 20 15:46:04 crc kubenswrapper[4730]: I0320 15:46:04.128582    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqprb\" (UniqueName: \"kubernetes.io/projected/2acd72a0-988c-4c58-a7b4-c139ee0f6ef1-kube-api-access-vqprb\") pod \"2acd72a0-988c-4c58-a7b4-c139ee0f6ef1\" (UID: \"2acd72a0-988c-4c58-a7b4-c139ee0f6ef1\") "
Mar 20 15:46:04 crc kubenswrapper[4730]: I0320 15:46:04.136552    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2acd72a0-988c-4c58-a7b4-c139ee0f6ef1-kube-api-access-vqprb" (OuterVolumeSpecName: "kube-api-access-vqprb") pod "2acd72a0-988c-4c58-a7b4-c139ee0f6ef1" (UID: "2acd72a0-988c-4c58-a7b4-c139ee0f6ef1"). InnerVolumeSpecName "kube-api-access-vqprb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:46:04 crc kubenswrapper[4730]: I0320 15:46:04.230793    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqprb\" (UniqueName: \"kubernetes.io/projected/2acd72a0-988c-4c58-a7b4-c139ee0f6ef1-kube-api-access-vqprb\") on node \"crc\" DevicePath \"\""
Mar 20 15:46:04 crc kubenswrapper[4730]: I0320 15:46:04.591814    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6677d585cb-h45gr"
Mar 20 15:46:04 crc kubenswrapper[4730]: I0320 15:46:04.711356    4730 generic.go:334] "Generic (PLEG): container finished" podID="58959cc0-9ad2-48f9-83da-c33525b8919d" containerID="79603ae8fc3af2a5353e289038a22a13e7af42a5609045c61b779ef68aeeb5fe" exitCode=0
Mar 20 15:46:04 crc kubenswrapper[4730]: I0320 15:46:04.711425    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6677d585cb-h45gr" event={"ID":"58959cc0-9ad2-48f9-83da-c33525b8919d","Type":"ContainerDied","Data":"79603ae8fc3af2a5353e289038a22a13e7af42a5609045c61b779ef68aeeb5fe"}
Mar 20 15:46:04 crc kubenswrapper[4730]: I0320 15:46:04.711431    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6677d585cb-h45gr"
Mar 20 15:46:04 crc kubenswrapper[4730]: I0320 15:46:04.711452    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6677d585cb-h45gr" event={"ID":"58959cc0-9ad2-48f9-83da-c33525b8919d","Type":"ContainerDied","Data":"192de1853e19e7668e1d1800d2ca4b5c1e7f274d95cbee877dd11e903eda6ae2"}
Mar 20 15:46:04 crc kubenswrapper[4730]: I0320 15:46:04.711470    4730 scope.go:117] "RemoveContainer" containerID="79603ae8fc3af2a5353e289038a22a13e7af42a5609045c61b779ef68aeeb5fe"
Mar 20 15:46:04 crc kubenswrapper[4730]: I0320 15:46:04.713163    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567026-x7pgl" event={"ID":"2acd72a0-988c-4c58-a7b4-c139ee0f6ef1","Type":"ContainerDied","Data":"09c0170823fe5be12229dc19d56bbd7300e9039e431e3bc9f49d576558f3e4ad"}
Mar 20 15:46:04 crc kubenswrapper[4730]: I0320 15:46:04.713218    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09c0170823fe5be12229dc19d56bbd7300e9039e431e3bc9f49d576558f3e4ad"
Mar 20 15:46:04 crc kubenswrapper[4730]: I0320 15:46:04.713281    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567026-x7pgl"
Mar 20 15:46:04 crc kubenswrapper[4730]: I0320 15:46:04.724531    4730 scope.go:117] "RemoveContainer" containerID="79603ae8fc3af2a5353e289038a22a13e7af42a5609045c61b779ef68aeeb5fe"
Mar 20 15:46:04 crc kubenswrapper[4730]: E0320 15:46:04.725000    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79603ae8fc3af2a5353e289038a22a13e7af42a5609045c61b779ef68aeeb5fe\": container with ID starting with 79603ae8fc3af2a5353e289038a22a13e7af42a5609045c61b779ef68aeeb5fe not found: ID does not exist" containerID="79603ae8fc3af2a5353e289038a22a13e7af42a5609045c61b779ef68aeeb5fe"
Mar 20 15:46:04 crc kubenswrapper[4730]: I0320 15:46:04.725036    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79603ae8fc3af2a5353e289038a22a13e7af42a5609045c61b779ef68aeeb5fe"} err="failed to get container status \"79603ae8fc3af2a5353e289038a22a13e7af42a5609045c61b779ef68aeeb5fe\": rpc error: code = NotFound desc = could not find container \"79603ae8fc3af2a5353e289038a22a13e7af42a5609045c61b779ef68aeeb5fe\": container with ID starting with 79603ae8fc3af2a5353e289038a22a13e7af42a5609045c61b779ef68aeeb5fe not found: ID does not exist"
Mar 20 15:46:04 crc kubenswrapper[4730]: I0320 15:46:04.736361    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2ml4\" (UniqueName: \"kubernetes.io/projected/58959cc0-9ad2-48f9-83da-c33525b8919d-kube-api-access-k2ml4\") pod \"58959cc0-9ad2-48f9-83da-c33525b8919d\" (UID: \"58959cc0-9ad2-48f9-83da-c33525b8919d\") "
Mar 20 15:46:04 crc kubenswrapper[4730]: I0320 15:46:04.736408    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58959cc0-9ad2-48f9-83da-c33525b8919d-serving-cert\") pod \"58959cc0-9ad2-48f9-83da-c33525b8919d\" (UID: \"58959cc0-9ad2-48f9-83da-c33525b8919d\") "
Mar 20 15:46:04 crc kubenswrapper[4730]: I0320 15:46:04.736451    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58959cc0-9ad2-48f9-83da-c33525b8919d-config\") pod \"58959cc0-9ad2-48f9-83da-c33525b8919d\" (UID: \"58959cc0-9ad2-48f9-83da-c33525b8919d\") "
Mar 20 15:46:04 crc kubenswrapper[4730]: I0320 15:46:04.736513    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58959cc0-9ad2-48f9-83da-c33525b8919d-client-ca\") pod \"58959cc0-9ad2-48f9-83da-c33525b8919d\" (UID: \"58959cc0-9ad2-48f9-83da-c33525b8919d\") "
Mar 20 15:46:04 crc kubenswrapper[4730]: I0320 15:46:04.736536    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/58959cc0-9ad2-48f9-83da-c33525b8919d-proxy-ca-bundles\") pod \"58959cc0-9ad2-48f9-83da-c33525b8919d\" (UID: \"58959cc0-9ad2-48f9-83da-c33525b8919d\") "
Mar 20 15:46:04 crc kubenswrapper[4730]: I0320 15:46:04.737309    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58959cc0-9ad2-48f9-83da-c33525b8919d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "58959cc0-9ad2-48f9-83da-c33525b8919d" (UID: "58959cc0-9ad2-48f9-83da-c33525b8919d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:46:04 crc kubenswrapper[4730]: I0320 15:46:04.737321    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58959cc0-9ad2-48f9-83da-c33525b8919d-config" (OuterVolumeSpecName: "config") pod "58959cc0-9ad2-48f9-83da-c33525b8919d" (UID: "58959cc0-9ad2-48f9-83da-c33525b8919d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:46:04 crc kubenswrapper[4730]: I0320 15:46:04.737605    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58959cc0-9ad2-48f9-83da-c33525b8919d-client-ca" (OuterVolumeSpecName: "client-ca") pod "58959cc0-9ad2-48f9-83da-c33525b8919d" (UID: "58959cc0-9ad2-48f9-83da-c33525b8919d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:46:04 crc kubenswrapper[4730]: I0320 15:46:04.739203    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58959cc0-9ad2-48f9-83da-c33525b8919d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "58959cc0-9ad2-48f9-83da-c33525b8919d" (UID: "58959cc0-9ad2-48f9-83da-c33525b8919d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:46:04 crc kubenswrapper[4730]: I0320 15:46:04.739346    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58959cc0-9ad2-48f9-83da-c33525b8919d-kube-api-access-k2ml4" (OuterVolumeSpecName: "kube-api-access-k2ml4") pod "58959cc0-9ad2-48f9-83da-c33525b8919d" (UID: "58959cc0-9ad2-48f9-83da-c33525b8919d"). InnerVolumeSpecName "kube-api-access-k2ml4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:46:04 crc kubenswrapper[4730]: I0320 15:46:04.837912    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2ml4\" (UniqueName: \"kubernetes.io/projected/58959cc0-9ad2-48f9-83da-c33525b8919d-kube-api-access-k2ml4\") on node \"crc\" DevicePath \"\""
Mar 20 15:46:04 crc kubenswrapper[4730]: I0320 15:46:04.837961    4730 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58959cc0-9ad2-48f9-83da-c33525b8919d-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:46:04 crc kubenswrapper[4730]: I0320 15:46:04.837980    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58959cc0-9ad2-48f9-83da-c33525b8919d-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:46:04 crc kubenswrapper[4730]: I0320 15:46:04.837998    4730 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58959cc0-9ad2-48f9-83da-c33525b8919d-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:46:04 crc kubenswrapper[4730]: I0320 15:46:04.838014    4730 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/58959cc0-9ad2-48f9-83da-c33525b8919d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.038466    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6677d585cb-h45gr"]
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.043412    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6677d585cb-h45gr"]
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.288451    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5855d88fdc-jtgvw"]
Mar 20 15:46:05 crc kubenswrapper[4730]: E0320 15:46:05.288825    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2acd72a0-988c-4c58-a7b4-c139ee0f6ef1" containerName="oc"
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.288836    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="2acd72a0-988c-4c58-a7b4-c139ee0f6ef1" containerName="oc"
Mar 20 15:46:05 crc kubenswrapper[4730]: E0320 15:46:05.288852    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58959cc0-9ad2-48f9-83da-c33525b8919d" containerName="controller-manager"
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.288859    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="58959cc0-9ad2-48f9-83da-c33525b8919d" containerName="controller-manager"
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.288950    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="58959cc0-9ad2-48f9-83da-c33525b8919d" containerName="controller-manager"
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.288960    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="2acd72a0-988c-4c58-a7b4-c139ee0f6ef1" containerName="oc"
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.289293    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5855d88fdc-jtgvw"
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.291129    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.291326    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.291949    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.292459    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.292578    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.292719    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.298564    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.302377    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5855d88fdc-jtgvw"]
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.443102    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/337bfa51-7f51-418d-9eb8-2fdd55260cf5-serving-cert\") pod \"controller-manager-5855d88fdc-jtgvw\" (UID: \"337bfa51-7f51-418d-9eb8-2fdd55260cf5\") " pod="openshift-controller-manager/controller-manager-5855d88fdc-jtgvw"
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.443224    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wlq8\" (UniqueName: \"kubernetes.io/projected/337bfa51-7f51-418d-9eb8-2fdd55260cf5-kube-api-access-5wlq8\") pod \"controller-manager-5855d88fdc-jtgvw\" (UID: \"337bfa51-7f51-418d-9eb8-2fdd55260cf5\") " pod="openshift-controller-manager/controller-manager-5855d88fdc-jtgvw"
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.443270    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/337bfa51-7f51-418d-9eb8-2fdd55260cf5-config\") pod \"controller-manager-5855d88fdc-jtgvw\" (UID: \"337bfa51-7f51-418d-9eb8-2fdd55260cf5\") " pod="openshift-controller-manager/controller-manager-5855d88fdc-jtgvw"
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.443298    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/337bfa51-7f51-418d-9eb8-2fdd55260cf5-proxy-ca-bundles\") pod \"controller-manager-5855d88fdc-jtgvw\" (UID: \"337bfa51-7f51-418d-9eb8-2fdd55260cf5\") " pod="openshift-controller-manager/controller-manager-5855d88fdc-jtgvw"
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.443343    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/337bfa51-7f51-418d-9eb8-2fdd55260cf5-client-ca\") pod \"controller-manager-5855d88fdc-jtgvw\" (UID: \"337bfa51-7f51-418d-9eb8-2fdd55260cf5\") " pod="openshift-controller-manager/controller-manager-5855d88fdc-jtgvw"
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.540871    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58959cc0-9ad2-48f9-83da-c33525b8919d" path="/var/lib/kubelet/pods/58959cc0-9ad2-48f9-83da-c33525b8919d/volumes"
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.543812    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/337bfa51-7f51-418d-9eb8-2fdd55260cf5-client-ca\") pod \"controller-manager-5855d88fdc-jtgvw\" (UID: \"337bfa51-7f51-418d-9eb8-2fdd55260cf5\") " pod="openshift-controller-manager/controller-manager-5855d88fdc-jtgvw"
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.543875    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/337bfa51-7f51-418d-9eb8-2fdd55260cf5-serving-cert\") pod \"controller-manager-5855d88fdc-jtgvw\" (UID: \"337bfa51-7f51-418d-9eb8-2fdd55260cf5\") " pod="openshift-controller-manager/controller-manager-5855d88fdc-jtgvw"
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.544587    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/337bfa51-7f51-418d-9eb8-2fdd55260cf5-client-ca\") pod \"controller-manager-5855d88fdc-jtgvw\" (UID: \"337bfa51-7f51-418d-9eb8-2fdd55260cf5\") " pod="openshift-controller-manager/controller-manager-5855d88fdc-jtgvw"
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.544784    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wlq8\" (UniqueName: \"kubernetes.io/projected/337bfa51-7f51-418d-9eb8-2fdd55260cf5-kube-api-access-5wlq8\") pod \"controller-manager-5855d88fdc-jtgvw\" (UID: \"337bfa51-7f51-418d-9eb8-2fdd55260cf5\") " pod="openshift-controller-manager/controller-manager-5855d88fdc-jtgvw"
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.544859    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/337bfa51-7f51-418d-9eb8-2fdd55260cf5-config\") pod \"controller-manager-5855d88fdc-jtgvw\" (UID: \"337bfa51-7f51-418d-9eb8-2fdd55260cf5\") " pod="openshift-controller-manager/controller-manager-5855d88fdc-jtgvw"
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.544928    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/337bfa51-7f51-418d-9eb8-2fdd55260cf5-proxy-ca-bundles\") pod \"controller-manager-5855d88fdc-jtgvw\" (UID: \"337bfa51-7f51-418d-9eb8-2fdd55260cf5\") " pod="openshift-controller-manager/controller-manager-5855d88fdc-jtgvw"
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.545894    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/337bfa51-7f51-418d-9eb8-2fdd55260cf5-proxy-ca-bundles\") pod \"controller-manager-5855d88fdc-jtgvw\" (UID: \"337bfa51-7f51-418d-9eb8-2fdd55260cf5\") " pod="openshift-controller-manager/controller-manager-5855d88fdc-jtgvw"
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.546900    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/337bfa51-7f51-418d-9eb8-2fdd55260cf5-config\") pod \"controller-manager-5855d88fdc-jtgvw\" (UID: \"337bfa51-7f51-418d-9eb8-2fdd55260cf5\") " pod="openshift-controller-manager/controller-manager-5855d88fdc-jtgvw"
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.551527    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/337bfa51-7f51-418d-9eb8-2fdd55260cf5-serving-cert\") pod \"controller-manager-5855d88fdc-jtgvw\" (UID: \"337bfa51-7f51-418d-9eb8-2fdd55260cf5\") " pod="openshift-controller-manager/controller-manager-5855d88fdc-jtgvw"
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.562318    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wlq8\" (UniqueName: \"kubernetes.io/projected/337bfa51-7f51-418d-9eb8-2fdd55260cf5-kube-api-access-5wlq8\") pod \"controller-manager-5855d88fdc-jtgvw\" (UID: \"337bfa51-7f51-418d-9eb8-2fdd55260cf5\") " pod="openshift-controller-manager/controller-manager-5855d88fdc-jtgvw"
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.608080    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5855d88fdc-jtgvw"
Mar 20 15:46:05 crc kubenswrapper[4730]: I0320 15:46:05.989003    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5855d88fdc-jtgvw"]
Mar 20 15:46:06 crc kubenswrapper[4730]: I0320 15:46:06.727939    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5855d88fdc-jtgvw" event={"ID":"337bfa51-7f51-418d-9eb8-2fdd55260cf5","Type":"ContainerStarted","Data":"1399da652fc3160c02ab5708d02305c04eebaa5861159c7d0f94f72f435ddb86"}
Mar 20 15:46:06 crc kubenswrapper[4730]: I0320 15:46:06.727978    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5855d88fdc-jtgvw" event={"ID":"337bfa51-7f51-418d-9eb8-2fdd55260cf5","Type":"ContainerStarted","Data":"83b3e0120b9b7984bb2a6622bbeb9fa9048be783e7d0806d9fd1a8e72797a307"}
Mar 20 15:46:06 crc kubenswrapper[4730]: I0320 15:46:06.729328    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5855d88fdc-jtgvw"
Mar 20 15:46:06 crc kubenswrapper[4730]: I0320 15:46:06.734087    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5855d88fdc-jtgvw"
Mar 20 15:46:06 crc kubenswrapper[4730]: I0320 15:46:06.748494    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5855d88fdc-jtgvw" podStartSLOduration=2.748476839 podStartE2EDuration="2.748476839s" podCreationTimestamp="2026-03-20 15:46:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:46:06.744687035 +0000 UTC m=+425.958058404" watchObservedRunningTime="2026-03-20 15:46:06.748476839 +0000 UTC m=+425.961848208"
Mar 20 15:46:15 crc kubenswrapper[4730]: I0320 15:46:15.477411    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mp2s2"]
Mar 20 15:46:15 crc kubenswrapper[4730]: I0320 15:46:15.478774    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mp2s2"
Mar 20 15:46:15 crc kubenswrapper[4730]: I0320 15:46:15.499625    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mp2s2"]
Mar 20 15:46:15 crc kubenswrapper[4730]: I0320 15:46:15.664587    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cafb408c-0cda-4634-9e71-1d727ff9a7f2-registry-tls\") pod \"image-registry-66df7c8f76-mp2s2\" (UID: \"cafb408c-0cda-4634-9e71-1d727ff9a7f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-mp2s2"
Mar 20 15:46:15 crc kubenswrapper[4730]: I0320 15:46:15.664627    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cafb408c-0cda-4634-9e71-1d727ff9a7f2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mp2s2\" (UID: \"cafb408c-0cda-4634-9e71-1d727ff9a7f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-mp2s2"
Mar 20 15:46:15 crc kubenswrapper[4730]: I0320 15:46:15.664663    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mp2s2\" (UID: \"cafb408c-0cda-4634-9e71-1d727ff9a7f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-mp2s2"
Mar 20 15:46:15 crc kubenswrapper[4730]: I0320 15:46:15.664685    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4pk9\" (UniqueName: \"kubernetes.io/projected/cafb408c-0cda-4634-9e71-1d727ff9a7f2-kube-api-access-p4pk9\") pod \"image-registry-66df7c8f76-mp2s2\" (UID: \"cafb408c-0cda-4634-9e71-1d727ff9a7f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-mp2s2"
Mar 20 15:46:15 crc kubenswrapper[4730]: I0320 15:46:15.664751    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cafb408c-0cda-4634-9e71-1d727ff9a7f2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mp2s2\" (UID: \"cafb408c-0cda-4634-9e71-1d727ff9a7f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-mp2s2"
Mar 20 15:46:15 crc kubenswrapper[4730]: I0320 15:46:15.664767    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cafb408c-0cda-4634-9e71-1d727ff9a7f2-trusted-ca\") pod \"image-registry-66df7c8f76-mp2s2\" (UID: \"cafb408c-0cda-4634-9e71-1d727ff9a7f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-mp2s2"
Mar 20 15:46:15 crc kubenswrapper[4730]: I0320 15:46:15.664813    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cafb408c-0cda-4634-9e71-1d727ff9a7f2-bound-sa-token\") pod \"image-registry-66df7c8f76-mp2s2\" (UID: \"cafb408c-0cda-4634-9e71-1d727ff9a7f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-mp2s2"
Mar 20 15:46:15 crc kubenswrapper[4730]: I0320 15:46:15.664844    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cafb408c-0cda-4634-9e71-1d727ff9a7f2-registry-certificates\") pod \"image-registry-66df7c8f76-mp2s2\" (UID: \"cafb408c-0cda-4634-9e71-1d727ff9a7f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-mp2s2"
Mar 20 15:46:15 crc kubenswrapper[4730]: I0320 15:46:15.682968    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mp2s2\" (UID: \"cafb408c-0cda-4634-9e71-1d727ff9a7f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-mp2s2"
Mar 20 15:46:15 crc kubenswrapper[4730]: I0320 15:46:15.765688    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cafb408c-0cda-4634-9e71-1d727ff9a7f2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mp2s2\" (UID: \"cafb408c-0cda-4634-9e71-1d727ff9a7f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-mp2s2"
Mar 20 15:46:15 crc kubenswrapper[4730]: I0320 15:46:15.765999    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cafb408c-0cda-4634-9e71-1d727ff9a7f2-trusted-ca\") pod \"image-registry-66df7c8f76-mp2s2\" (UID: \"cafb408c-0cda-4634-9e71-1d727ff9a7f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-mp2s2"
Mar 20 15:46:15 crc kubenswrapper[4730]: I0320 15:46:15.766077    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cafb408c-0cda-4634-9e71-1d727ff9a7f2-bound-sa-token\") pod \"image-registry-66df7c8f76-mp2s2\" (UID: \"cafb408c-0cda-4634-9e71-1d727ff9a7f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-mp2s2"
Mar 20 15:46:15 crc kubenswrapper[4730]: I0320 15:46:15.766114    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cafb408c-0cda-4634-9e71-1d727ff9a7f2-registry-certificates\") pod \"image-registry-66df7c8f76-mp2s2\" (UID: \"cafb408c-0cda-4634-9e71-1d727ff9a7f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-mp2s2"
Mar 20 15:46:15 crc kubenswrapper[4730]: I0320 15:46:15.766144    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cafb408c-0cda-4634-9e71-1d727ff9a7f2-registry-tls\") pod \"image-registry-66df7c8f76-mp2s2\" (UID: \"cafb408c-0cda-4634-9e71-1d727ff9a7f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-mp2s2"
Mar 20 15:46:15 crc kubenswrapper[4730]: I0320 15:46:15.766160    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cafb408c-0cda-4634-9e71-1d727ff9a7f2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mp2s2\" (UID: \"cafb408c-0cda-4634-9e71-1d727ff9a7f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-mp2s2"
Mar 20 15:46:15 crc kubenswrapper[4730]: I0320 15:46:15.766183    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4pk9\" (UniqueName: \"kubernetes.io/projected/cafb408c-0cda-4634-9e71-1d727ff9a7f2-kube-api-access-p4pk9\") pod \"image-registry-66df7c8f76-mp2s2\" (UID: \"cafb408c-0cda-4634-9e71-1d727ff9a7f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-mp2s2"
Mar 20 15:46:15 crc kubenswrapper[4730]: I0320 15:46:15.766665    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cafb408c-0cda-4634-9e71-1d727ff9a7f2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mp2s2\" (UID: \"cafb408c-0cda-4634-9e71-1d727ff9a7f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-mp2s2"
Mar 20 15:46:15 crc kubenswrapper[4730]: I0320 15:46:15.767428    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cafb408c-0cda-4634-9e71-1d727ff9a7f2-registry-certificates\") pod \"image-registry-66df7c8f76-mp2s2\" (UID: \"cafb408c-0cda-4634-9e71-1d727ff9a7f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-mp2s2"
Mar 20 15:46:15 crc kubenswrapper[4730]: I0320 15:46:15.767964    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cafb408c-0cda-4634-9e71-1d727ff9a7f2-trusted-ca\") pod \"image-registry-66df7c8f76-mp2s2\" (UID: \"cafb408c-0cda-4634-9e71-1d727ff9a7f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-mp2s2"
Mar 20 15:46:15 crc kubenswrapper[4730]: I0320 15:46:15.772126    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cafb408c-0cda-4634-9e71-1d727ff9a7f2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mp2s2\" (UID: \"cafb408c-0cda-4634-9e71-1d727ff9a7f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-mp2s2"
Mar 20 15:46:15 crc kubenswrapper[4730]: I0320 15:46:15.774773    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cafb408c-0cda-4634-9e71-1d727ff9a7f2-registry-tls\") pod \"image-registry-66df7c8f76-mp2s2\" (UID: \"cafb408c-0cda-4634-9e71-1d727ff9a7f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-mp2s2"
Mar 20 15:46:15 crc kubenswrapper[4730]: I0320 15:46:15.784411    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4pk9\" (UniqueName: \"kubernetes.io/projected/cafb408c-0cda-4634-9e71-1d727ff9a7f2-kube-api-access-p4pk9\") pod \"image-registry-66df7c8f76-mp2s2\" (UID: \"cafb408c-0cda-4634-9e71-1d727ff9a7f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-mp2s2"
Mar 20 15:46:15 crc kubenswrapper[4730]: I0320 15:46:15.784727    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cafb408c-0cda-4634-9e71-1d727ff9a7f2-bound-sa-token\") pod \"image-registry-66df7c8f76-mp2s2\" (UID: \"cafb408c-0cda-4634-9e71-1d727ff9a7f2\") " pod="openshift-image-registry/image-registry-66df7c8f76-mp2s2"
Mar 20 15:46:15 crc kubenswrapper[4730]: I0320 15:46:15.793783    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mp2s2"
Mar 20 15:46:16 crc kubenswrapper[4730]: I0320 15:46:16.172984    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mp2s2"]
Mar 20 15:46:16 crc kubenswrapper[4730]: I0320 15:46:16.775017    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mp2s2" event={"ID":"cafb408c-0cda-4634-9e71-1d727ff9a7f2","Type":"ContainerStarted","Data":"e14b1c56a67ba8393a624f2b94275d94ad4af04c29bd58a6458288abf9a52d00"}
Mar 20 15:46:16 crc kubenswrapper[4730]: I0320 15:46:16.775062    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mp2s2" event={"ID":"cafb408c-0cda-4634-9e71-1d727ff9a7f2","Type":"ContainerStarted","Data":"6c9dd7c1503e298e4f613eae2ccd4a88dfeecc7a2497efab7bca0fd3df564762"}
Mar 20 15:46:16 crc kubenswrapper[4730]: I0320 15:46:16.775151    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-mp2s2"
Mar 20 15:46:16 crc kubenswrapper[4730]: I0320 15:46:16.796357    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-mp2s2" podStartSLOduration=1.7963366729999999 podStartE2EDuration="1.796336673s" podCreationTimestamp="2026-03-20 15:46:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:46:16.795684734 +0000 UTC m=+436.009056113" watchObservedRunningTime="2026-03-20 15:46:16.796336673 +0000 UTC m=+436.009708042"
Mar 20 15:46:24 crc kubenswrapper[4730]: I0320 15:46:24.145334    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-887974f98-g5dnn"]
Mar 20 15:46:24 crc kubenswrapper[4730]: I0320 15:46:24.146425    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-887974f98-g5dnn" podUID="f6f367c6-14a0-4683-aff1-d6fbe6cc3a47" containerName="route-controller-manager" containerID="cri-o://093154fad0b7d76c4d87e1557430d65a3241a4160f21ee3af7aa8053222b0da7" gracePeriod=30
Mar 20 15:46:24 crc kubenswrapper[4730]: I0320 15:46:24.610985    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-887974f98-g5dnn"
Mar 20 15:46:24 crc kubenswrapper[4730]: I0320 15:46:24.785174    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbk4s\" (UniqueName: \"kubernetes.io/projected/f6f367c6-14a0-4683-aff1-d6fbe6cc3a47-kube-api-access-rbk4s\") pod \"f6f367c6-14a0-4683-aff1-d6fbe6cc3a47\" (UID: \"f6f367c6-14a0-4683-aff1-d6fbe6cc3a47\") "
Mar 20 15:46:24 crc kubenswrapper[4730]: I0320 15:46:24.785280    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6f367c6-14a0-4683-aff1-d6fbe6cc3a47-serving-cert\") pod \"f6f367c6-14a0-4683-aff1-d6fbe6cc3a47\" (UID: \"f6f367c6-14a0-4683-aff1-d6fbe6cc3a47\") "
Mar 20 15:46:24 crc kubenswrapper[4730]: I0320 15:46:24.785311    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6f367c6-14a0-4683-aff1-d6fbe6cc3a47-config\") pod \"f6f367c6-14a0-4683-aff1-d6fbe6cc3a47\" (UID: \"f6f367c6-14a0-4683-aff1-d6fbe6cc3a47\") "
Mar 20 15:46:24 crc kubenswrapper[4730]: I0320 15:46:24.785372    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6f367c6-14a0-4683-aff1-d6fbe6cc3a47-client-ca\") pod \"f6f367c6-14a0-4683-aff1-d6fbe6cc3a47\" (UID: \"f6f367c6-14a0-4683-aff1-d6fbe6cc3a47\") "
Mar 20 15:46:24 crc kubenswrapper[4730]: I0320 15:46:24.786079    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6f367c6-14a0-4683-aff1-d6fbe6cc3a47-config" (OuterVolumeSpecName: "config") pod "f6f367c6-14a0-4683-aff1-d6fbe6cc3a47" (UID: "f6f367c6-14a0-4683-aff1-d6fbe6cc3a47"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:46:24 crc kubenswrapper[4730]: I0320 15:46:24.786108    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6f367c6-14a0-4683-aff1-d6fbe6cc3a47-client-ca" (OuterVolumeSpecName: "client-ca") pod "f6f367c6-14a0-4683-aff1-d6fbe6cc3a47" (UID: "f6f367c6-14a0-4683-aff1-d6fbe6cc3a47"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:46:24 crc kubenswrapper[4730]: I0320 15:46:24.792016    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6f367c6-14a0-4683-aff1-d6fbe6cc3a47-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f6f367c6-14a0-4683-aff1-d6fbe6cc3a47" (UID: "f6f367c6-14a0-4683-aff1-d6fbe6cc3a47"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:46:24 crc kubenswrapper[4730]: I0320 15:46:24.793127    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6f367c6-14a0-4683-aff1-d6fbe6cc3a47-kube-api-access-rbk4s" (OuterVolumeSpecName: "kube-api-access-rbk4s") pod "f6f367c6-14a0-4683-aff1-d6fbe6cc3a47" (UID: "f6f367c6-14a0-4683-aff1-d6fbe6cc3a47"). InnerVolumeSpecName "kube-api-access-rbk4s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:46:24 crc kubenswrapper[4730]: I0320 15:46:24.815746    4730 generic.go:334] "Generic (PLEG): container finished" podID="f6f367c6-14a0-4683-aff1-d6fbe6cc3a47" containerID="093154fad0b7d76c4d87e1557430d65a3241a4160f21ee3af7aa8053222b0da7" exitCode=0
Mar 20 15:46:24 crc kubenswrapper[4730]: I0320 15:46:24.815822    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-887974f98-g5dnn" event={"ID":"f6f367c6-14a0-4683-aff1-d6fbe6cc3a47","Type":"ContainerDied","Data":"093154fad0b7d76c4d87e1557430d65a3241a4160f21ee3af7aa8053222b0da7"}
Mar 20 15:46:24 crc kubenswrapper[4730]: I0320 15:46:24.815849    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-887974f98-g5dnn" event={"ID":"f6f367c6-14a0-4683-aff1-d6fbe6cc3a47","Type":"ContainerDied","Data":"4655d39dd1465fdb2c59ebf55c060fefff1bcbcd7ef3b0a27cf18646bd27a07f"}
Mar 20 15:46:24 crc kubenswrapper[4730]: I0320 15:46:24.815851    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-887974f98-g5dnn"
Mar 20 15:46:24 crc kubenswrapper[4730]: I0320 15:46:24.815866    4730 scope.go:117] "RemoveContainer" containerID="093154fad0b7d76c4d87e1557430d65a3241a4160f21ee3af7aa8053222b0da7"
Mar 20 15:46:24 crc kubenswrapper[4730]: I0320 15:46:24.853871    4730 scope.go:117] "RemoveContainer" containerID="093154fad0b7d76c4d87e1557430d65a3241a4160f21ee3af7aa8053222b0da7"
Mar 20 15:46:24 crc kubenswrapper[4730]: E0320 15:46:24.854412    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"093154fad0b7d76c4d87e1557430d65a3241a4160f21ee3af7aa8053222b0da7\": container with ID starting with 093154fad0b7d76c4d87e1557430d65a3241a4160f21ee3af7aa8053222b0da7 not found: ID does not exist" containerID="093154fad0b7d76c4d87e1557430d65a3241a4160f21ee3af7aa8053222b0da7"
Mar 20 15:46:24 crc kubenswrapper[4730]: I0320 15:46:24.854454    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"093154fad0b7d76c4d87e1557430d65a3241a4160f21ee3af7aa8053222b0da7"} err="failed to get container status \"093154fad0b7d76c4d87e1557430d65a3241a4160f21ee3af7aa8053222b0da7\": rpc error: code = NotFound desc = could not find container \"093154fad0b7d76c4d87e1557430d65a3241a4160f21ee3af7aa8053222b0da7\": container with ID starting with 093154fad0b7d76c4d87e1557430d65a3241a4160f21ee3af7aa8053222b0da7 not found: ID does not exist"
Mar 20 15:46:24 crc kubenswrapper[4730]: I0320 15:46:24.871703    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-887974f98-g5dnn"]
Mar 20 15:46:24 crc kubenswrapper[4730]: I0320 15:46:24.875059    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-887974f98-g5dnn"]
Mar 20 15:46:24 crc kubenswrapper[4730]: I0320 15:46:24.886366    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbk4s\" (UniqueName: \"kubernetes.io/projected/f6f367c6-14a0-4683-aff1-d6fbe6cc3a47-kube-api-access-rbk4s\") on node \"crc\" DevicePath \"\""
Mar 20 15:46:24 crc kubenswrapper[4730]: I0320 15:46:24.886395    4730 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6f367c6-14a0-4683-aff1-d6fbe6cc3a47-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:46:24 crc kubenswrapper[4730]: I0320 15:46:24.886403    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6f367c6-14a0-4683-aff1-d6fbe6cc3a47-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:46:24 crc kubenswrapper[4730]: I0320 15:46:24.886412    4730 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6f367c6-14a0-4683-aff1-d6fbe6cc3a47-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.315534    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79bc85d457-d42jn"]
Mar 20 15:46:25 crc kubenswrapper[4730]: E0320 15:46:25.316225    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6f367c6-14a0-4683-aff1-d6fbe6cc3a47" containerName="route-controller-manager"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.316242    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f367c6-14a0-4683-aff1-d6fbe6cc3a47" containerName="route-controller-manager"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.316480    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6f367c6-14a0-4683-aff1-d6fbe6cc3a47" containerName="route-controller-manager"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.317036    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-d42jn"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.319863    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.321726    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.321992    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.322157    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.322390    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.322733    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.330568    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79bc85d457-d42jn"]
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.494905    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba77c88b-7fd7-4b1b-be2c-9708ac3f766d-client-ca\") pod \"route-controller-manager-79bc85d457-d42jn\" (UID: \"ba77c88b-7fd7-4b1b-be2c-9708ac3f766d\") " pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-d42jn"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.494969    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba77c88b-7fd7-4b1b-be2c-9708ac3f766d-serving-cert\") pod \"route-controller-manager-79bc85d457-d42jn\" (UID: \"ba77c88b-7fd7-4b1b-be2c-9708ac3f766d\") " pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-d42jn"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.494999    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba77c88b-7fd7-4b1b-be2c-9708ac3f766d-config\") pod \"route-controller-manager-79bc85d457-d42jn\" (UID: \"ba77c88b-7fd7-4b1b-be2c-9708ac3f766d\") " pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-d42jn"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.495022    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l4jp\" (UniqueName: \"kubernetes.io/projected/ba77c88b-7fd7-4b1b-be2c-9708ac3f766d-kube-api-access-2l4jp\") pod \"route-controller-manager-79bc85d457-d42jn\" (UID: \"ba77c88b-7fd7-4b1b-be2c-9708ac3f766d\") " pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-d42jn"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.543229    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6f367c6-14a0-4683-aff1-d6fbe6cc3a47" path="/var/lib/kubelet/pods/f6f367c6-14a0-4683-aff1-d6fbe6cc3a47/volumes"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.596194    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l4jp\" (UniqueName: \"kubernetes.io/projected/ba77c88b-7fd7-4b1b-be2c-9708ac3f766d-kube-api-access-2l4jp\") pod \"route-controller-manager-79bc85d457-d42jn\" (UID: \"ba77c88b-7fd7-4b1b-be2c-9708ac3f766d\") " pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-d42jn"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.596356    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba77c88b-7fd7-4b1b-be2c-9708ac3f766d-client-ca\") pod \"route-controller-manager-79bc85d457-d42jn\" (UID: \"ba77c88b-7fd7-4b1b-be2c-9708ac3f766d\") " pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-d42jn"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.596374    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba77c88b-7fd7-4b1b-be2c-9708ac3f766d-serving-cert\") pod \"route-controller-manager-79bc85d457-d42jn\" (UID: \"ba77c88b-7fd7-4b1b-be2c-9708ac3f766d\") " pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-d42jn"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.596398    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba77c88b-7fd7-4b1b-be2c-9708ac3f766d-config\") pod \"route-controller-manager-79bc85d457-d42jn\" (UID: \"ba77c88b-7fd7-4b1b-be2c-9708ac3f766d\") " pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-d42jn"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.597560    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba77c88b-7fd7-4b1b-be2c-9708ac3f766d-config\") pod \"route-controller-manager-79bc85d457-d42jn\" (UID: \"ba77c88b-7fd7-4b1b-be2c-9708ac3f766d\") " pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-d42jn"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.599982    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba77c88b-7fd7-4b1b-be2c-9708ac3f766d-client-ca\") pod \"route-controller-manager-79bc85d457-d42jn\" (UID: \"ba77c88b-7fd7-4b1b-be2c-9708ac3f766d\") " pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-d42jn"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.608827    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba77c88b-7fd7-4b1b-be2c-9708ac3f766d-serving-cert\") pod \"route-controller-manager-79bc85d457-d42jn\" (UID: \"ba77c88b-7fd7-4b1b-be2c-9708ac3f766d\") " pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-d42jn"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.615862    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l4jp\" (UniqueName: \"kubernetes.io/projected/ba77c88b-7fd7-4b1b-be2c-9708ac3f766d-kube-api-access-2l4jp\") pod \"route-controller-manager-79bc85d457-d42jn\" (UID: \"ba77c88b-7fd7-4b1b-be2c-9708ac3f766d\") " pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-d42jn"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.636242    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-d42jn"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.698108    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.698329    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.698370    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.698414    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.700964    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.702855    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.703044    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.703430    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.833638    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.833638    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.833756    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.901774    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs\") pod \"network-metrics-daemon-2prfn\" (UID: \"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\") " pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.906961    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a-metrics-certs\") pod \"network-metrics-daemon-2prfn\" (UID: \"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a\") " pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.936694    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 20 15:46:25 crc kubenswrapper[4730]: I0320 15:46:25.944598    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2prfn"
Mar 20 15:46:26 crc kubenswrapper[4730]: I0320 15:46:26.046384    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79bc85d457-d42jn"]
Mar 20 15:46:26 crc kubenswrapper[4730]: W0320 15:46:26.277913    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-8f66443105590edc6962d38c8fe805088c5010ec68eee05fcd7144569423c615 WatchSource:0}: Error finding container 8f66443105590edc6962d38c8fe805088c5010ec68eee05fcd7144569423c615: Status 404 returned error can't find the container with id 8f66443105590edc6962d38c8fe805088c5010ec68eee05fcd7144569423c615
Mar 20 15:46:26 crc kubenswrapper[4730]: I0320 15:46:26.366755    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2prfn"]
Mar 20 15:46:26 crc kubenswrapper[4730]: I0320 15:46:26.841938    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2prfn" event={"ID":"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a","Type":"ContainerStarted","Data":"042e670d22ec3cd255c89c175c1ae13c5142eb845549769f30fc53f425605ef0"}
Mar 20 15:46:26 crc kubenswrapper[4730]: I0320 15:46:26.842328    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2prfn" event={"ID":"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a","Type":"ContainerStarted","Data":"dd95ab3ea3f4a45814d13ed92e8765afd7e941fbc36557369f7c8ce736f5429c"}
Mar 20 15:46:26 crc kubenswrapper[4730]: I0320 15:46:26.842346    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2prfn" event={"ID":"db9ee3bf-d96d-4ae8-bb5c-f299469bfc0a","Type":"ContainerStarted","Data":"509bc94cb9394c5da06389965f36ec02d42d5e991cb66ac94831363750ca9f36"}
Mar 20 15:46:26 crc kubenswrapper[4730]: I0320 15:46:26.843735    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-d42jn" event={"ID":"ba77c88b-7fd7-4b1b-be2c-9708ac3f766d","Type":"ContainerStarted","Data":"5faf01e0b07690fd70093656deaf9b89f77f92f64bf5fe9609eda77952c15dea"}
Mar 20 15:46:26 crc kubenswrapper[4730]: I0320 15:46:26.843782    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-d42jn" event={"ID":"ba77c88b-7fd7-4b1b-be2c-9708ac3f766d","Type":"ContainerStarted","Data":"d18434bb33a755f466d4b04274990f2ecebf84177e2a328df838834244320e7a"}
Mar 20 15:46:26 crc kubenswrapper[4730]: I0320 15:46:26.843894    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-d42jn"
Mar 20 15:46:26 crc kubenswrapper[4730]: I0320 15:46:26.844895    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"28f49df9dd8811e43a03c84f32b3d00f487e1c46caa9286b20491d1e279e755c"}
Mar 20 15:46:26 crc kubenswrapper[4730]: I0320 15:46:26.844934    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c636efc01d0c49b1ce86f53c0064c221598c4bb08e67586d3f9ae1e2618f89f0"}
Mar 20 15:46:26 crc kubenswrapper[4730]: I0320 15:46:26.846074    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"80e9abf5f5b71f4dc7d897ab2c626cb54651f486709ea593fe5e4e2481bf3824"}
Mar 20 15:46:26 crc kubenswrapper[4730]: I0320 15:46:26.846107    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"153e9d4ad565cf26447b51dae0425e7dee85181ab21dc23255dab338040c1560"}
Mar 20 15:46:26 crc kubenswrapper[4730]: I0320 15:46:26.847671    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"16b135ea8083cb8cb5ba34ee60d710019169f125d3c216aacd5dce21f8be39d9"}
Mar 20 15:46:26 crc kubenswrapper[4730]: I0320 15:46:26.847717    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"8f66443105590edc6962d38c8fe805088c5010ec68eee05fcd7144569423c615"}
Mar 20 15:46:26 crc kubenswrapper[4730]: I0320 15:46:26.847907    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:46:26 crc kubenswrapper[4730]: I0320 15:46:26.849026    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-d42jn"
Mar 20 15:46:26 crc kubenswrapper[4730]: I0320 15:46:26.874327    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-2prfn" podStartSLOduration=394.874306495 podStartE2EDuration="6m34.874306495s" podCreationTimestamp="2026-03-20 15:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:46:26.858050026 +0000 UTC m=+446.071421395" watchObservedRunningTime="2026-03-20 15:46:26.874306495 +0000 UTC m=+446.087677874"
Mar 20 15:46:26 crc kubenswrapper[4730]: I0320 15:46:26.908294    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-79bc85d457-d42jn" podStartSLOduration=2.908271127 podStartE2EDuration="2.908271127s" podCreationTimestamp="2026-03-20 15:46:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:46:26.904181684 +0000 UTC m=+446.117553073" watchObservedRunningTime="2026-03-20 15:46:26.908271127 +0000 UTC m=+446.121642496"
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.107215    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rlnqc"]
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.108044    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rlnqc" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" containerName="registry-server" containerID="cri-o://3303b366b010494b00cff91f0adf58b15d0be7946981888a990192d9cd69b3fa" gracePeriod=30
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.114498    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mbtfk"]
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.114744    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mbtfk" podUID="d5addb8e-1dbc-41a2-8330-8a97251bd52f" containerName="registry-server" containerID="cri-o://b0b7cf1aa8683df6582d7fa32a0ee12665587f8843d63db7cc45648643eb352c" gracePeriod=30
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.118228    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-klbh8"]
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.118432    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-klbh8" podUID="e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3" containerName="marketplace-operator" containerID="cri-o://753a6699245a34bcfcd2383bf2298b09146bcf7e23ec3ecf85f51c516941cd06" gracePeriod=30
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.133312    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-flpw2"]
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.133617    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-flpw2" podUID="5a347883-e4f7-4fcd-8920-59519533cf43" containerName="registry-server" containerID="cri-o://8a137f491e123e26bfe8e53249675fb8ec9405c7b00ec70ee4673e9a88e5d6bf" gracePeriod=30
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.146105    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-b842f"]
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.147128    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-b842f"
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.152033    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8rptq"]
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.152342    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8rptq" podUID="558b00fd-2589-4842-8cba-db0cffe8c826" containerName="registry-server" containerID="cri-o://be8ad8b5a0faeab783b7ccafb3517fc045687f3f5ccf91534d7b0f3ee31c621e" gracePeriod=30
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.155566    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-b842f"]
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.270424    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b3eaa81f-92a9-49fa-aca0-1e8e35920f20-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-b842f\" (UID: \"b3eaa81f-92a9-49fa-aca0-1e8e35920f20\") " pod="openshift-marketplace/marketplace-operator-79b997595-b842f"
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.270468    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdwqt\" (UniqueName: \"kubernetes.io/projected/b3eaa81f-92a9-49fa-aca0-1e8e35920f20-kube-api-access-jdwqt\") pod \"marketplace-operator-79b997595-b842f\" (UID: \"b3eaa81f-92a9-49fa-aca0-1e8e35920f20\") " pod="openshift-marketplace/marketplace-operator-79b997595-b842f"
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.270518    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3eaa81f-92a9-49fa-aca0-1e8e35920f20-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-b842f\" (UID: \"b3eaa81f-92a9-49fa-aca0-1e8e35920f20\") " pod="openshift-marketplace/marketplace-operator-79b997595-b842f"
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.371224    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b3eaa81f-92a9-49fa-aca0-1e8e35920f20-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-b842f\" (UID: \"b3eaa81f-92a9-49fa-aca0-1e8e35920f20\") " pod="openshift-marketplace/marketplace-operator-79b997595-b842f"
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.371283    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdwqt\" (UniqueName: \"kubernetes.io/projected/b3eaa81f-92a9-49fa-aca0-1e8e35920f20-kube-api-access-jdwqt\") pod \"marketplace-operator-79b997595-b842f\" (UID: \"b3eaa81f-92a9-49fa-aca0-1e8e35920f20\") " pod="openshift-marketplace/marketplace-operator-79b997595-b842f"
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.374948    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3eaa81f-92a9-49fa-aca0-1e8e35920f20-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-b842f\" (UID: \"b3eaa81f-92a9-49fa-aca0-1e8e35920f20\") " pod="openshift-marketplace/marketplace-operator-79b997595-b842f"
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.376109    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3eaa81f-92a9-49fa-aca0-1e8e35920f20-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-b842f\" (UID: \"b3eaa81f-92a9-49fa-aca0-1e8e35920f20\") " pod="openshift-marketplace/marketplace-operator-79b997595-b842f"
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.376680    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b3eaa81f-92a9-49fa-aca0-1e8e35920f20-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-b842f\" (UID: \"b3eaa81f-92a9-49fa-aca0-1e8e35920f20\") " pod="openshift-marketplace/marketplace-operator-79b997595-b842f"
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.386386    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdwqt\" (UniqueName: \"kubernetes.io/projected/b3eaa81f-92a9-49fa-aca0-1e8e35920f20-kube-api-access-jdwqt\") pod \"marketplace-operator-79b997595-b842f\" (UID: \"b3eaa81f-92a9-49fa-aca0-1e8e35920f20\") " pod="openshift-marketplace/marketplace-operator-79b997595-b842f"
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.470652    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-b842f"
Mar 20 15:46:30 crc kubenswrapper[4730]: E0320 15:46:30.496770    4730 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of be8ad8b5a0faeab783b7ccafb3517fc045687f3f5ccf91534d7b0f3ee31c621e is running failed: container process not found" containerID="be8ad8b5a0faeab783b7ccafb3517fc045687f3f5ccf91534d7b0f3ee31c621e" cmd=["grpc_health_probe","-addr=:50051"]
Mar 20 15:46:30 crc kubenswrapper[4730]: E0320 15:46:30.497481    4730 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of be8ad8b5a0faeab783b7ccafb3517fc045687f3f5ccf91534d7b0f3ee31c621e is running failed: container process not found" containerID="be8ad8b5a0faeab783b7ccafb3517fc045687f3f5ccf91534d7b0f3ee31c621e" cmd=["grpc_health_probe","-addr=:50051"]
Mar 20 15:46:30 crc kubenswrapper[4730]: E0320 15:46:30.497827    4730 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of be8ad8b5a0faeab783b7ccafb3517fc045687f3f5ccf91534d7b0f3ee31c621e is running failed: container process not found" containerID="be8ad8b5a0faeab783b7ccafb3517fc045687f3f5ccf91534d7b0f3ee31c621e" cmd=["grpc_health_probe","-addr=:50051"]
Mar 20 15:46:30 crc kubenswrapper[4730]: E0320 15:46:30.497859    4730 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of be8ad8b5a0faeab783b7ccafb3517fc045687f3f5ccf91534d7b0f3ee31c621e is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-8rptq" podUID="558b00fd-2589-4842-8cba-db0cffe8c826" containerName="registry-server"
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.713765    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-klbh8"
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.780210    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3-marketplace-operator-metrics\") pod \"e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3\" (UID: \"e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3\") "
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.780282    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxflj\" (UniqueName: \"kubernetes.io/projected/e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3-kube-api-access-jxflj\") pod \"e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3\" (UID: \"e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3\") "
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.780335    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3-marketplace-trusted-ca\") pod \"e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3\" (UID: \"e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3\") "
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.780996    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3" (UID: "e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.784642    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3" (UID: "e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.785235    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3-kube-api-access-jxflj" (OuterVolumeSpecName: "kube-api-access-jxflj") pod "e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3" (UID: "e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3"). InnerVolumeSpecName "kube-api-access-jxflj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.813007    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rlnqc"
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.871494    4730 generic.go:334] "Generic (PLEG): container finished" podID="e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3" containerID="753a6699245a34bcfcd2383bf2298b09146bcf7e23ec3ecf85f51c516941cd06" exitCode=0
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.871566    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-klbh8" event={"ID":"e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3","Type":"ContainerDied","Data":"753a6699245a34bcfcd2383bf2298b09146bcf7e23ec3ecf85f51c516941cd06"}
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.871602    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-klbh8" event={"ID":"e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3","Type":"ContainerDied","Data":"759bd2ea27103b987add0fca450b8a256b74867157aa94299acbb52889decc8f"}
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.871623    4730 scope.go:117] "RemoveContainer" containerID="753a6699245a34bcfcd2383bf2298b09146bcf7e23ec3ecf85f51c516941cd06"
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.871717    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-klbh8"
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.876809    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8rptq"
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.878692    4730 generic.go:334] "Generic (PLEG): container finished" podID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" containerID="3303b366b010494b00cff91f0adf58b15d0be7946981888a990192d9cd69b3fa" exitCode=0
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.878744    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlnqc" event={"ID":"e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98","Type":"ContainerDied","Data":"3303b366b010494b00cff91f0adf58b15d0be7946981888a990192d9cd69b3fa"}
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.878763    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlnqc" event={"ID":"e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98","Type":"ContainerDied","Data":"3040ee2962d5bbb55e0748bc4e87b0b5953d222ba1e8e8b464bf1b4d1408cbfd"}
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.878817    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rlnqc"
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.880860    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgx8f\" (UniqueName: \"kubernetes.io/projected/558b00fd-2589-4842-8cba-db0cffe8c826-kube-api-access-kgx8f\") pod \"558b00fd-2589-4842-8cba-db0cffe8c826\" (UID: \"558b00fd-2589-4842-8cba-db0cffe8c826\") "
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.880914    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98-utilities\") pod \"e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98\" (UID: \"e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98\") "
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.880938    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/558b00fd-2589-4842-8cba-db0cffe8c826-catalog-content\") pod \"558b00fd-2589-4842-8cba-db0cffe8c826\" (UID: \"558b00fd-2589-4842-8cba-db0cffe8c826\") "
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.880975    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgmhp\" (UniqueName: \"kubernetes.io/projected/e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98-kube-api-access-rgmhp\") pod \"e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98\" (UID: \"e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98\") "
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.881181    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mbtfk"
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.881534    4730 generic.go:334] "Generic (PLEG): container finished" podID="558b00fd-2589-4842-8cba-db0cffe8c826" containerID="be8ad8b5a0faeab783b7ccafb3517fc045687f3f5ccf91534d7b0f3ee31c621e" exitCode=0
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.881578    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8rptq" event={"ID":"558b00fd-2589-4842-8cba-db0cffe8c826","Type":"ContainerDied","Data":"be8ad8b5a0faeab783b7ccafb3517fc045687f3f5ccf91534d7b0f3ee31c621e"}
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.881599    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8rptq" event={"ID":"558b00fd-2589-4842-8cba-db0cffe8c826","Type":"ContainerDied","Data":"8b528dc3e6323a70e8b05e1cb0a0d95967e9a6d57d83e5d00d37458aa2621e38"}
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.881722    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8rptq"
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.882219    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98-utilities" (OuterVolumeSpecName: "utilities") pod "e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" (UID: "e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.882377    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/558b00fd-2589-4842-8cba-db0cffe8c826-utilities\") pod \"558b00fd-2589-4842-8cba-db0cffe8c826\" (UID: \"558b00fd-2589-4842-8cba-db0cffe8c826\") "
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.882439    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98-catalog-content\") pod \"e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98\" (UID: \"e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98\") "
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.883962    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/558b00fd-2589-4842-8cba-db0cffe8c826-utilities" (OuterVolumeSpecName: "utilities") pod "558b00fd-2589-4842-8cba-db0cffe8c826" (UID: "558b00fd-2589-4842-8cba-db0cffe8c826"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.884097    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98-kube-api-access-rgmhp" (OuterVolumeSpecName: "kube-api-access-rgmhp") pod "e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" (UID: "e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98"). InnerVolumeSpecName "kube-api-access-rgmhp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.885284    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/558b00fd-2589-4842-8cba-db0cffe8c826-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.885310    4730 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.885324    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.885323    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/558b00fd-2589-4842-8cba-db0cffe8c826-kube-api-access-kgx8f" (OuterVolumeSpecName: "kube-api-access-kgx8f") pod "558b00fd-2589-4842-8cba-db0cffe8c826" (UID: "558b00fd-2589-4842-8cba-db0cffe8c826"). InnerVolumeSpecName "kube-api-access-kgx8f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.885336    4730 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.885354    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgmhp\" (UniqueName: \"kubernetes.io/projected/e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98-kube-api-access-rgmhp\") on node \"crc\" DevicePath \"\""
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.885367    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxflj\" (UniqueName: \"kubernetes.io/projected/e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3-kube-api-access-jxflj\") on node \"crc\" DevicePath \"\""
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.891346    4730 generic.go:334] "Generic (PLEG): container finished" podID="d5addb8e-1dbc-41a2-8330-8a97251bd52f" containerID="b0b7cf1aa8683df6582d7fa32a0ee12665587f8843d63db7cc45648643eb352c" exitCode=0
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.891424    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbtfk" event={"ID":"d5addb8e-1dbc-41a2-8330-8a97251bd52f","Type":"ContainerDied","Data":"b0b7cf1aa8683df6582d7fa32a0ee12665587f8843d63db7cc45648643eb352c"}
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.891495    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mbtfk"
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.895662    4730 generic.go:334] "Generic (PLEG): container finished" podID="5a347883-e4f7-4fcd-8920-59519533cf43" containerID="8a137f491e123e26bfe8e53249675fb8ec9405c7b00ec70ee4673e9a88e5d6bf" exitCode=0
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.895700    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-flpw2" event={"ID":"5a347883-e4f7-4fcd-8920-59519533cf43","Type":"ContainerDied","Data":"8a137f491e123e26bfe8e53249675fb8ec9405c7b00ec70ee4673e9a88e5d6bf"}
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.919560    4730 scope.go:117] "RemoveContainer" containerID="53a23df451c80721daf3b80414ff05d019adbd298f30cf30f417f8af1c2bafc2"
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.923958    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-klbh8"]
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.938214    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-flpw2"
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.945179    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-klbh8"]
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.947371    4730 scope.go:117] "RemoveContainer" containerID="753a6699245a34bcfcd2383bf2298b09146bcf7e23ec3ecf85f51c516941cd06"
Mar 20 15:46:30 crc kubenswrapper[4730]: E0320 15:46:30.949040    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"753a6699245a34bcfcd2383bf2298b09146bcf7e23ec3ecf85f51c516941cd06\": container with ID starting with 753a6699245a34bcfcd2383bf2298b09146bcf7e23ec3ecf85f51c516941cd06 not found: ID does not exist" containerID="753a6699245a34bcfcd2383bf2298b09146bcf7e23ec3ecf85f51c516941cd06"
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.949067    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"753a6699245a34bcfcd2383bf2298b09146bcf7e23ec3ecf85f51c516941cd06"} err="failed to get container status \"753a6699245a34bcfcd2383bf2298b09146bcf7e23ec3ecf85f51c516941cd06\": rpc error: code = NotFound desc = could not find container \"753a6699245a34bcfcd2383bf2298b09146bcf7e23ec3ecf85f51c516941cd06\": container with ID starting with 753a6699245a34bcfcd2383bf2298b09146bcf7e23ec3ecf85f51c516941cd06 not found: ID does not exist"
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.949086    4730 scope.go:117] "RemoveContainer" containerID="53a23df451c80721daf3b80414ff05d019adbd298f30cf30f417f8af1c2bafc2"
Mar 20 15:46:30 crc kubenswrapper[4730]: E0320 15:46:30.949390    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53a23df451c80721daf3b80414ff05d019adbd298f30cf30f417f8af1c2bafc2\": container with ID starting with 53a23df451c80721daf3b80414ff05d019adbd298f30cf30f417f8af1c2bafc2 not found: ID does not exist" containerID="53a23df451c80721daf3b80414ff05d019adbd298f30cf30f417f8af1c2bafc2"
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.949427    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53a23df451c80721daf3b80414ff05d019adbd298f30cf30f417f8af1c2bafc2"} err="failed to get container status \"53a23df451c80721daf3b80414ff05d019adbd298f30cf30f417f8af1c2bafc2\": rpc error: code = NotFound desc = could not find container \"53a23df451c80721daf3b80414ff05d019adbd298f30cf30f417f8af1c2bafc2\": container with ID starting with 53a23df451c80721daf3b80414ff05d019adbd298f30cf30f417f8af1c2bafc2 not found: ID does not exist"
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.949451    4730 scope.go:117] "RemoveContainer" containerID="3303b366b010494b00cff91f0adf58b15d0be7946981888a990192d9cd69b3fa"
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.973153    4730 scope.go:117] "RemoveContainer" containerID="2c3e34ad9ea0b6c3222cf006f08a02a03e69e35e189c65669c0748e767b79f75"
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.975504    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" (UID: "e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.985721    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5addb8e-1dbc-41a2-8330-8a97251bd52f-catalog-content\") pod \"d5addb8e-1dbc-41a2-8330-8a97251bd52f\" (UID: \"d5addb8e-1dbc-41a2-8330-8a97251bd52f\") "
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.986071    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rd6js\" (UniqueName: \"kubernetes.io/projected/d5addb8e-1dbc-41a2-8330-8a97251bd52f-kube-api-access-rd6js\") pod \"d5addb8e-1dbc-41a2-8330-8a97251bd52f\" (UID: \"d5addb8e-1dbc-41a2-8330-8a97251bd52f\") "
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.986204    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a347883-e4f7-4fcd-8920-59519533cf43-catalog-content\") pod \"5a347883-e4f7-4fcd-8920-59519533cf43\" (UID: \"5a347883-e4f7-4fcd-8920-59519533cf43\") "
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.986409    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5addb8e-1dbc-41a2-8330-8a97251bd52f-utilities\") pod \"d5addb8e-1dbc-41a2-8330-8a97251bd52f\" (UID: \"d5addb8e-1dbc-41a2-8330-8a97251bd52f\") "
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.986525    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a347883-e4f7-4fcd-8920-59519533cf43-utilities\") pod \"5a347883-e4f7-4fcd-8920-59519533cf43\" (UID: \"5a347883-e4f7-4fcd-8920-59519533cf43\") "
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.986632    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hs46z\" (UniqueName: \"kubernetes.io/projected/5a347883-e4f7-4fcd-8920-59519533cf43-kube-api-access-hs46z\") pod \"5a347883-e4f7-4fcd-8920-59519533cf43\" (UID: \"5a347883-e4f7-4fcd-8920-59519533cf43\") "
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.987173    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.987399    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgx8f\" (UniqueName: \"kubernetes.io/projected/558b00fd-2589-4842-8cba-db0cffe8c826-kube-api-access-kgx8f\") on node \"crc\" DevicePath \"\""
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.987739    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5addb8e-1dbc-41a2-8330-8a97251bd52f-utilities" (OuterVolumeSpecName: "utilities") pod "d5addb8e-1dbc-41a2-8330-8a97251bd52f" (UID: "d5addb8e-1dbc-41a2-8330-8a97251bd52f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.987829    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a347883-e4f7-4fcd-8920-59519533cf43-utilities" (OuterVolumeSpecName: "utilities") pod "5a347883-e4f7-4fcd-8920-59519533cf43" (UID: "5a347883-e4f7-4fcd-8920-59519533cf43"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.990520    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5addb8e-1dbc-41a2-8330-8a97251bd52f-kube-api-access-rd6js" (OuterVolumeSpecName: "kube-api-access-rd6js") pod "d5addb8e-1dbc-41a2-8330-8a97251bd52f" (UID: "d5addb8e-1dbc-41a2-8330-8a97251bd52f"). InnerVolumeSpecName "kube-api-access-rd6js". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.990872    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a347883-e4f7-4fcd-8920-59519533cf43-kube-api-access-hs46z" (OuterVolumeSpecName: "kube-api-access-hs46z") pod "5a347883-e4f7-4fcd-8920-59519533cf43" (UID: "5a347883-e4f7-4fcd-8920-59519533cf43"). InnerVolumeSpecName "kube-api-access-hs46z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:46:30 crc kubenswrapper[4730]: I0320 15:46:30.991826    4730 scope.go:117] "RemoveContainer" containerID="0936786d2af592781681255288d0bc8bfa0e6ea172412747ed1615c053176e9a"
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.005419    4730 scope.go:117] "RemoveContainer" containerID="3303b366b010494b00cff91f0adf58b15d0be7946981888a990192d9cd69b3fa"
Mar 20 15:46:31 crc kubenswrapper[4730]: E0320 15:46:31.005864    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3303b366b010494b00cff91f0adf58b15d0be7946981888a990192d9cd69b3fa\": container with ID starting with 3303b366b010494b00cff91f0adf58b15d0be7946981888a990192d9cd69b3fa not found: ID does not exist" containerID="3303b366b010494b00cff91f0adf58b15d0be7946981888a990192d9cd69b3fa"
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.005906    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3303b366b010494b00cff91f0adf58b15d0be7946981888a990192d9cd69b3fa"} err="failed to get container status \"3303b366b010494b00cff91f0adf58b15d0be7946981888a990192d9cd69b3fa\": rpc error: code = NotFound desc = could not find container \"3303b366b010494b00cff91f0adf58b15d0be7946981888a990192d9cd69b3fa\": container with ID starting with 3303b366b010494b00cff91f0adf58b15d0be7946981888a990192d9cd69b3fa not found: ID does not exist"
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.005960    4730 scope.go:117] "RemoveContainer" containerID="2c3e34ad9ea0b6c3222cf006f08a02a03e69e35e189c65669c0748e767b79f75"
Mar 20 15:46:31 crc kubenswrapper[4730]: E0320 15:46:31.007721    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c3e34ad9ea0b6c3222cf006f08a02a03e69e35e189c65669c0748e767b79f75\": container with ID starting with 2c3e34ad9ea0b6c3222cf006f08a02a03e69e35e189c65669c0748e767b79f75 not found: ID does not exist" containerID="2c3e34ad9ea0b6c3222cf006f08a02a03e69e35e189c65669c0748e767b79f75"
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.009225    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c3e34ad9ea0b6c3222cf006f08a02a03e69e35e189c65669c0748e767b79f75"} err="failed to get container status \"2c3e34ad9ea0b6c3222cf006f08a02a03e69e35e189c65669c0748e767b79f75\": rpc error: code = NotFound desc = could not find container \"2c3e34ad9ea0b6c3222cf006f08a02a03e69e35e189c65669c0748e767b79f75\": container with ID starting with 2c3e34ad9ea0b6c3222cf006f08a02a03e69e35e189c65669c0748e767b79f75 not found: ID does not exist"
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.009274    4730 scope.go:117] "RemoveContainer" containerID="0936786d2af592781681255288d0bc8bfa0e6ea172412747ed1615c053176e9a"
Mar 20 15:46:31 crc kubenswrapper[4730]: E0320 15:46:31.010775    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0936786d2af592781681255288d0bc8bfa0e6ea172412747ed1615c053176e9a\": container with ID starting with 0936786d2af592781681255288d0bc8bfa0e6ea172412747ed1615c053176e9a not found: ID does not exist" containerID="0936786d2af592781681255288d0bc8bfa0e6ea172412747ed1615c053176e9a"
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.010821    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0936786d2af592781681255288d0bc8bfa0e6ea172412747ed1615c053176e9a"} err="failed to get container status \"0936786d2af592781681255288d0bc8bfa0e6ea172412747ed1615c053176e9a\": rpc error: code = NotFound desc = could not find container \"0936786d2af592781681255288d0bc8bfa0e6ea172412747ed1615c053176e9a\": container with ID starting with 0936786d2af592781681255288d0bc8bfa0e6ea172412747ed1615c053176e9a not found: ID does not exist"
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.010854    4730 scope.go:117] "RemoveContainer" containerID="be8ad8b5a0faeab783b7ccafb3517fc045687f3f5ccf91534d7b0f3ee31c621e"
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.024033    4730 scope.go:117] "RemoveContainer" containerID="0b9feaef40e353d64a848dba5e34276e42725c50bac6122fd4b5265fc07ad6a1"
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.029866    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a347883-e4f7-4fcd-8920-59519533cf43-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a347883-e4f7-4fcd-8920-59519533cf43" (UID: "5a347883-e4f7-4fcd-8920-59519533cf43"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.041604    4730 scope.go:117] "RemoveContainer" containerID="c2ed9a8424613f58ef80e56f4945d0a2f01a1e4ab20cce8fcafa4c0b7fc30d73"
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.044315    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/558b00fd-2589-4842-8cba-db0cffe8c826-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "558b00fd-2589-4842-8cba-db0cffe8c826" (UID: "558b00fd-2589-4842-8cba-db0cffe8c826"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.057279    4730 scope.go:117] "RemoveContainer" containerID="be8ad8b5a0faeab783b7ccafb3517fc045687f3f5ccf91534d7b0f3ee31c621e"
Mar 20 15:46:31 crc kubenswrapper[4730]: E0320 15:46:31.057793    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be8ad8b5a0faeab783b7ccafb3517fc045687f3f5ccf91534d7b0f3ee31c621e\": container with ID starting with be8ad8b5a0faeab783b7ccafb3517fc045687f3f5ccf91534d7b0f3ee31c621e not found: ID does not exist" containerID="be8ad8b5a0faeab783b7ccafb3517fc045687f3f5ccf91534d7b0f3ee31c621e"
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.057845    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be8ad8b5a0faeab783b7ccafb3517fc045687f3f5ccf91534d7b0f3ee31c621e"} err="failed to get container status \"be8ad8b5a0faeab783b7ccafb3517fc045687f3f5ccf91534d7b0f3ee31c621e\": rpc error: code = NotFound desc = could not find container \"be8ad8b5a0faeab783b7ccafb3517fc045687f3f5ccf91534d7b0f3ee31c621e\": container with ID starting with be8ad8b5a0faeab783b7ccafb3517fc045687f3f5ccf91534d7b0f3ee31c621e not found: ID does not exist"
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.057880    4730 scope.go:117] "RemoveContainer" containerID="0b9feaef40e353d64a848dba5e34276e42725c50bac6122fd4b5265fc07ad6a1"
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.058148    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5addb8e-1dbc-41a2-8330-8a97251bd52f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5addb8e-1dbc-41a2-8330-8a97251bd52f" (UID: "d5addb8e-1dbc-41a2-8330-8a97251bd52f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:46:31 crc kubenswrapper[4730]: E0320 15:46:31.058285    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b9feaef40e353d64a848dba5e34276e42725c50bac6122fd4b5265fc07ad6a1\": container with ID starting with 0b9feaef40e353d64a848dba5e34276e42725c50bac6122fd4b5265fc07ad6a1 not found: ID does not exist" containerID="0b9feaef40e353d64a848dba5e34276e42725c50bac6122fd4b5265fc07ad6a1"
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.058315    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b9feaef40e353d64a848dba5e34276e42725c50bac6122fd4b5265fc07ad6a1"} err="failed to get container status \"0b9feaef40e353d64a848dba5e34276e42725c50bac6122fd4b5265fc07ad6a1\": rpc error: code = NotFound desc = could not find container \"0b9feaef40e353d64a848dba5e34276e42725c50bac6122fd4b5265fc07ad6a1\": container with ID starting with 0b9feaef40e353d64a848dba5e34276e42725c50bac6122fd4b5265fc07ad6a1 not found: ID does not exist"
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.058334    4730 scope.go:117] "RemoveContainer" containerID="c2ed9a8424613f58ef80e56f4945d0a2f01a1e4ab20cce8fcafa4c0b7fc30d73"
Mar 20 15:46:31 crc kubenswrapper[4730]: E0320 15:46:31.058751    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2ed9a8424613f58ef80e56f4945d0a2f01a1e4ab20cce8fcafa4c0b7fc30d73\": container with ID starting with c2ed9a8424613f58ef80e56f4945d0a2f01a1e4ab20cce8fcafa4c0b7fc30d73 not found: ID does not exist" containerID="c2ed9a8424613f58ef80e56f4945d0a2f01a1e4ab20cce8fcafa4c0b7fc30d73"
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.058783    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2ed9a8424613f58ef80e56f4945d0a2f01a1e4ab20cce8fcafa4c0b7fc30d73"} err="failed to get container status \"c2ed9a8424613f58ef80e56f4945d0a2f01a1e4ab20cce8fcafa4c0b7fc30d73\": rpc error: code = NotFound desc = could not find container \"c2ed9a8424613f58ef80e56f4945d0a2f01a1e4ab20cce8fcafa4c0b7fc30d73\": container with ID starting with c2ed9a8424613f58ef80e56f4945d0a2f01a1e4ab20cce8fcafa4c0b7fc30d73 not found: ID does not exist"
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.058810    4730 scope.go:117] "RemoveContainer" containerID="b0b7cf1aa8683df6582d7fa32a0ee12665587f8843d63db7cc45648643eb352c"
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.073265    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-b842f"]
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.073401    4730 scope.go:117] "RemoveContainer" containerID="027ff3ee79dd3768bc7352d26b5e9a7647079a2f17aa58047546ce0332c5b335"
Mar 20 15:46:31 crc kubenswrapper[4730]: W0320 15:46:31.077383    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3eaa81f_92a9_49fa_aca0_1e8e35920f20.slice/crio-f2983f2e7c0e6fdb3a360dbaf18785290bfa97fa39dfa4dd90909de1062ea9c1 WatchSource:0}: Error finding container f2983f2e7c0e6fdb3a360dbaf18785290bfa97fa39dfa4dd90909de1062ea9c1: Status 404 returned error can't find the container with id f2983f2e7c0e6fdb3a360dbaf18785290bfa97fa39dfa4dd90909de1062ea9c1
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.091468    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5addb8e-1dbc-41a2-8330-8a97251bd52f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.091523    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/558b00fd-2589-4842-8cba-db0cffe8c826-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.091538    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rd6js\" (UniqueName: \"kubernetes.io/projected/d5addb8e-1dbc-41a2-8330-8a97251bd52f-kube-api-access-rd6js\") on node \"crc\" DevicePath \"\""
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.091549    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a347883-e4f7-4fcd-8920-59519533cf43-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.091559    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5addb8e-1dbc-41a2-8330-8a97251bd52f-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.091567    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a347883-e4f7-4fcd-8920-59519533cf43-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.091575    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hs46z\" (UniqueName: \"kubernetes.io/projected/5a347883-e4f7-4fcd-8920-59519533cf43-kube-api-access-hs46z\") on node \"crc\" DevicePath \"\""
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.092081    4730 scope.go:117] "RemoveContainer" containerID="1ce763ed176ec4f4dede58163ade6fda497d7444c6c6f195c24a524a711de167"
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.209988    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rlnqc"]
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.222269    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rlnqc"]
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.225601    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8rptq"]
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.232271    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8rptq"]
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.235464    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mbtfk"]
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.238268    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mbtfk"]
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.546120    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="558b00fd-2589-4842-8cba-db0cffe8c826" path="/var/lib/kubelet/pods/558b00fd-2589-4842-8cba-db0cffe8c826/volumes"
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.547029    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5addb8e-1dbc-41a2-8330-8a97251bd52f" path="/var/lib/kubelet/pods/d5addb8e-1dbc-41a2-8330-8a97251bd52f/volumes"
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.547744    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3" path="/var/lib/kubelet/pods/e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3/volumes"
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.548671    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" path="/var/lib/kubelet/pods/e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98/volumes"
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.903420    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-b842f" event={"ID":"b3eaa81f-92a9-49fa-aca0-1e8e35920f20","Type":"ContainerStarted","Data":"23ffe972b9481dc0923d0a873a956bf4171c6c5123d4b682bb2650dec152f60f"}
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.903460    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-b842f" event={"ID":"b3eaa81f-92a9-49fa-aca0-1e8e35920f20","Type":"ContainerStarted","Data":"f2983f2e7c0e6fdb3a360dbaf18785290bfa97fa39dfa4dd90909de1062ea9c1"}
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.903626    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-b842f"
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.907332    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-flpw2" event={"ID":"5a347883-e4f7-4fcd-8920-59519533cf43","Type":"ContainerDied","Data":"9a27ed5cf68d2bc6928d37904a276f94c614d0e58deaf1534a9994ffbccaa224"}
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.907381    4730 scope.go:117] "RemoveContainer" containerID="8a137f491e123e26bfe8e53249675fb8ec9405c7b00ec70ee4673e9a88e5d6bf"
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.907530    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-flpw2"
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.908404    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-b842f"
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.930832    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-b842f" podStartSLOduration=1.930811286 podStartE2EDuration="1.930811286s" podCreationTimestamp="2026-03-20 15:46:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:46:31.927056243 +0000 UTC m=+451.140427622" watchObservedRunningTime="2026-03-20 15:46:31.930811286 +0000 UTC m=+451.144182665"
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.931115    4730 scope.go:117] "RemoveContainer" containerID="9a83b1f8dc654ef4f4276c65729d5eabf19cc5bf1944836a69eeb1d195139aba"
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.945643    4730 scope.go:117] "RemoveContainer" containerID="1e2d0f7b622d4a27e1b76b3b32f61e354d6ed5f7ddeb8e6368356819c35fc74f"
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.965169    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-flpw2"]
Mar 20 15:46:31 crc kubenswrapper[4730]: I0320 15:46:31.975278    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-flpw2"]
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.514885    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vk6rc"]
Mar 20 15:46:32 crc kubenswrapper[4730]: E0320 15:46:32.515389    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a347883-e4f7-4fcd-8920-59519533cf43" containerName="extract-utilities"
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.515405    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a347883-e4f7-4fcd-8920-59519533cf43" containerName="extract-utilities"
Mar 20 15:46:32 crc kubenswrapper[4730]: E0320 15:46:32.515416    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3" containerName="marketplace-operator"
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.515424    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3" containerName="marketplace-operator"
Mar 20 15:46:32 crc kubenswrapper[4730]: E0320 15:46:32.515433    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5addb8e-1dbc-41a2-8330-8a97251bd52f" containerName="registry-server"
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.515441    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5addb8e-1dbc-41a2-8330-8a97251bd52f" containerName="registry-server"
Mar 20 15:46:32 crc kubenswrapper[4730]: E0320 15:46:32.515449    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="558b00fd-2589-4842-8cba-db0cffe8c826" containerName="extract-content"
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.515456    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="558b00fd-2589-4842-8cba-db0cffe8c826" containerName="extract-content"
Mar 20 15:46:32 crc kubenswrapper[4730]: E0320 15:46:32.515469    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a347883-e4f7-4fcd-8920-59519533cf43" containerName="extract-content"
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.515478    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a347883-e4f7-4fcd-8920-59519533cf43" containerName="extract-content"
Mar 20 15:46:32 crc kubenswrapper[4730]: E0320 15:46:32.515486    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" containerName="extract-content"
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.515493    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" containerName="extract-content"
Mar 20 15:46:32 crc kubenswrapper[4730]: E0320 15:46:32.515501    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="558b00fd-2589-4842-8cba-db0cffe8c826" containerName="registry-server"
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.515508    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="558b00fd-2589-4842-8cba-db0cffe8c826" containerName="registry-server"
Mar 20 15:46:32 crc kubenswrapper[4730]: E0320 15:46:32.515517    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" containerName="registry-server"
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.515524    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" containerName="registry-server"
Mar 20 15:46:32 crc kubenswrapper[4730]: E0320 15:46:32.515536    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" containerName="extract-utilities"
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.515544    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" containerName="extract-utilities"
Mar 20 15:46:32 crc kubenswrapper[4730]: E0320 15:46:32.515553    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5addb8e-1dbc-41a2-8330-8a97251bd52f" containerName="extract-utilities"
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.515560    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5addb8e-1dbc-41a2-8330-8a97251bd52f" containerName="extract-utilities"
Mar 20 15:46:32 crc kubenswrapper[4730]: E0320 15:46:32.515570    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5addb8e-1dbc-41a2-8330-8a97251bd52f" containerName="extract-content"
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.515577    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5addb8e-1dbc-41a2-8330-8a97251bd52f" containerName="extract-content"
Mar 20 15:46:32 crc kubenswrapper[4730]: E0320 15:46:32.515586    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="558b00fd-2589-4842-8cba-db0cffe8c826" containerName="extract-utilities"
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.515593    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="558b00fd-2589-4842-8cba-db0cffe8c826" containerName="extract-utilities"
Mar 20 15:46:32 crc kubenswrapper[4730]: E0320 15:46:32.515600    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a347883-e4f7-4fcd-8920-59519533cf43" containerName="registry-server"
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.515605    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a347883-e4f7-4fcd-8920-59519533cf43" containerName="registry-server"
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.515703    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a347883-e4f7-4fcd-8920-59519533cf43" containerName="registry-server"
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.515714    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5addb8e-1dbc-41a2-8330-8a97251bd52f" containerName="registry-server"
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.515727    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3" containerName="marketplace-operator"
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.515735    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="558b00fd-2589-4842-8cba-db0cffe8c826" containerName="registry-server"
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.515742    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3" containerName="marketplace-operator"
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.515752    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1e9bea0-2eab-4ac3-ae73-6ad7bf4d7a98" containerName="registry-server"
Mar 20 15:46:32 crc kubenswrapper[4730]: E0320 15:46:32.515835    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3" containerName="marketplace-operator"
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.515843    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ea8f10-d0d5-4b2b-9cc1-58eb2d785af3" containerName="marketplace-operator"
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.516943    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vk6rc"
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.520180    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.531422    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vk6rc"]
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.607967    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbqqz\" (UniqueName: \"kubernetes.io/projected/70d03566-9776-4dcc-84b5-17281f8ae66e-kube-api-access-nbqqz\") pod \"redhat-operators-vk6rc\" (UID: \"70d03566-9776-4dcc-84b5-17281f8ae66e\") " pod="openshift-marketplace/redhat-operators-vk6rc"
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.608134    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70d03566-9776-4dcc-84b5-17281f8ae66e-utilities\") pod \"redhat-operators-vk6rc\" (UID: \"70d03566-9776-4dcc-84b5-17281f8ae66e\") " pod="openshift-marketplace/redhat-operators-vk6rc"
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.608170    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70d03566-9776-4dcc-84b5-17281f8ae66e-catalog-content\") pod \"redhat-operators-vk6rc\" (UID: \"70d03566-9776-4dcc-84b5-17281f8ae66e\") " pod="openshift-marketplace/redhat-operators-vk6rc"
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.709726    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70d03566-9776-4dcc-84b5-17281f8ae66e-utilities\") pod \"redhat-operators-vk6rc\" (UID: \"70d03566-9776-4dcc-84b5-17281f8ae66e\") " pod="openshift-marketplace/redhat-operators-vk6rc"
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.709790    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70d03566-9776-4dcc-84b5-17281f8ae66e-catalog-content\") pod \"redhat-operators-vk6rc\" (UID: \"70d03566-9776-4dcc-84b5-17281f8ae66e\") " pod="openshift-marketplace/redhat-operators-vk6rc"
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.709841    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbqqz\" (UniqueName: \"kubernetes.io/projected/70d03566-9776-4dcc-84b5-17281f8ae66e-kube-api-access-nbqqz\") pod \"redhat-operators-vk6rc\" (UID: \"70d03566-9776-4dcc-84b5-17281f8ae66e\") " pod="openshift-marketplace/redhat-operators-vk6rc"
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.710369    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70d03566-9776-4dcc-84b5-17281f8ae66e-catalog-content\") pod \"redhat-operators-vk6rc\" (UID: \"70d03566-9776-4dcc-84b5-17281f8ae66e\") " pod="openshift-marketplace/redhat-operators-vk6rc"
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.710364    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70d03566-9776-4dcc-84b5-17281f8ae66e-utilities\") pod \"redhat-operators-vk6rc\" (UID: \"70d03566-9776-4dcc-84b5-17281f8ae66e\") " pod="openshift-marketplace/redhat-operators-vk6rc"
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.729455    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbqqz\" (UniqueName: \"kubernetes.io/projected/70d03566-9776-4dcc-84b5-17281f8ae66e-kube-api-access-nbqqz\") pod \"redhat-operators-vk6rc\" (UID: \"70d03566-9776-4dcc-84b5-17281f8ae66e\") " pod="openshift-marketplace/redhat-operators-vk6rc"
Mar 20 15:46:32 crc kubenswrapper[4730]: I0320 15:46:32.831385    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vk6rc"
Mar 20 15:46:33 crc kubenswrapper[4730]: I0320 15:46:33.261360    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vk6rc"]
Mar 20 15:46:33 crc kubenswrapper[4730]: I0320 15:46:33.541382    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a347883-e4f7-4fcd-8920-59519533cf43" path="/var/lib/kubelet/pods/5a347883-e4f7-4fcd-8920-59519533cf43/volumes"
Mar 20 15:46:33 crc kubenswrapper[4730]: I0320 15:46:33.925061    4730 generic.go:334] "Generic (PLEG): container finished" podID="70d03566-9776-4dcc-84b5-17281f8ae66e" containerID="c7a40db6c3e39bcdc6ed4f1aa1615de26a75d2d529d29d0be891dcfb62c4c11c" exitCode=0
Mar 20 15:46:33 crc kubenswrapper[4730]: I0320 15:46:33.925147    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vk6rc" event={"ID":"70d03566-9776-4dcc-84b5-17281f8ae66e","Type":"ContainerDied","Data":"c7a40db6c3e39bcdc6ed4f1aa1615de26a75d2d529d29d0be891dcfb62c4c11c"}
Mar 20 15:46:33 crc kubenswrapper[4730]: I0320 15:46:33.925172    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vk6rc" event={"ID":"70d03566-9776-4dcc-84b5-17281f8ae66e","Type":"ContainerStarted","Data":"89095373adcfb1a8fa95226ed1d5a210203e8b1527f394de91dacd42089f46d4"}
Mar 20 15:46:34 crc kubenswrapper[4730]: I0320 15:46:34.317698    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rkhd6"]
Mar 20 15:46:34 crc kubenswrapper[4730]: I0320 15:46:34.318607    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rkhd6"
Mar 20 15:46:34 crc kubenswrapper[4730]: I0320 15:46:34.321064    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 20 15:46:34 crc kubenswrapper[4730]: I0320 15:46:34.329561    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6e8fab3-7ebb-4b3f-af2c-fcc299e01381-catalog-content\") pod \"certified-operators-rkhd6\" (UID: \"d6e8fab3-7ebb-4b3f-af2c-fcc299e01381\") " pod="openshift-marketplace/certified-operators-rkhd6"
Mar 20 15:46:34 crc kubenswrapper[4730]: I0320 15:46:34.329628    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6bc6\" (UniqueName: \"kubernetes.io/projected/d6e8fab3-7ebb-4b3f-af2c-fcc299e01381-kube-api-access-z6bc6\") pod \"certified-operators-rkhd6\" (UID: \"d6e8fab3-7ebb-4b3f-af2c-fcc299e01381\") " pod="openshift-marketplace/certified-operators-rkhd6"
Mar 20 15:46:34 crc kubenswrapper[4730]: I0320 15:46:34.329683    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6e8fab3-7ebb-4b3f-af2c-fcc299e01381-utilities\") pod \"certified-operators-rkhd6\" (UID: \"d6e8fab3-7ebb-4b3f-af2c-fcc299e01381\") " pod="openshift-marketplace/certified-operators-rkhd6"
Mar 20 15:46:34 crc kubenswrapper[4730]: I0320 15:46:34.335122    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rkhd6"]
Mar 20 15:46:34 crc kubenswrapper[4730]: I0320 15:46:34.430607    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6e8fab3-7ebb-4b3f-af2c-fcc299e01381-catalog-content\") pod \"certified-operators-rkhd6\" (UID: \"d6e8fab3-7ebb-4b3f-af2c-fcc299e01381\") " pod="openshift-marketplace/certified-operators-rkhd6"
Mar 20 15:46:34 crc kubenswrapper[4730]: I0320 15:46:34.430653    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6bc6\" (UniqueName: \"kubernetes.io/projected/d6e8fab3-7ebb-4b3f-af2c-fcc299e01381-kube-api-access-z6bc6\") pod \"certified-operators-rkhd6\" (UID: \"d6e8fab3-7ebb-4b3f-af2c-fcc299e01381\") " pod="openshift-marketplace/certified-operators-rkhd6"
Mar 20 15:46:34 crc kubenswrapper[4730]: I0320 15:46:34.430697    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6e8fab3-7ebb-4b3f-af2c-fcc299e01381-utilities\") pod \"certified-operators-rkhd6\" (UID: \"d6e8fab3-7ebb-4b3f-af2c-fcc299e01381\") " pod="openshift-marketplace/certified-operators-rkhd6"
Mar 20 15:46:34 crc kubenswrapper[4730]: I0320 15:46:34.431103    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6e8fab3-7ebb-4b3f-af2c-fcc299e01381-catalog-content\") pod \"certified-operators-rkhd6\" (UID: \"d6e8fab3-7ebb-4b3f-af2c-fcc299e01381\") " pod="openshift-marketplace/certified-operators-rkhd6"
Mar 20 15:46:34 crc kubenswrapper[4730]: I0320 15:46:34.431118    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6e8fab3-7ebb-4b3f-af2c-fcc299e01381-utilities\") pod \"certified-operators-rkhd6\" (UID: \"d6e8fab3-7ebb-4b3f-af2c-fcc299e01381\") " pod="openshift-marketplace/certified-operators-rkhd6"
Mar 20 15:46:34 crc kubenswrapper[4730]: I0320 15:46:34.447738    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6bc6\" (UniqueName: \"kubernetes.io/projected/d6e8fab3-7ebb-4b3f-af2c-fcc299e01381-kube-api-access-z6bc6\") pod \"certified-operators-rkhd6\" (UID: \"d6e8fab3-7ebb-4b3f-af2c-fcc299e01381\") " pod="openshift-marketplace/certified-operators-rkhd6"
Mar 20 15:46:34 crc kubenswrapper[4730]: I0320 15:46:34.641014    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rkhd6"
Mar 20 15:46:34 crc kubenswrapper[4730]: I0320 15:46:34.915816    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bbtzz"]
Mar 20 15:46:34 crc kubenswrapper[4730]: I0320 15:46:34.917775    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bbtzz"
Mar 20 15:46:34 crc kubenswrapper[4730]: I0320 15:46:34.920215    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 20 15:46:34 crc kubenswrapper[4730]: I0320 15:46:34.928775    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bbtzz"]
Mar 20 15:46:34 crc kubenswrapper[4730]: I0320 15:46:34.935828    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8a22a9f-2975-485c-99f7-05e6b934e0a1-catalog-content\") pod \"community-operators-bbtzz\" (UID: \"d8a22a9f-2975-485c-99f7-05e6b934e0a1\") " pod="openshift-marketplace/community-operators-bbtzz"
Mar 20 15:46:34 crc kubenswrapper[4730]: I0320 15:46:34.935898    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw6p9\" (UniqueName: \"kubernetes.io/projected/d8a22a9f-2975-485c-99f7-05e6b934e0a1-kube-api-access-zw6p9\") pod \"community-operators-bbtzz\" (UID: \"d8a22a9f-2975-485c-99f7-05e6b934e0a1\") " pod="openshift-marketplace/community-operators-bbtzz"
Mar 20 15:46:34 crc kubenswrapper[4730]: I0320 15:46:34.935952    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8a22a9f-2975-485c-99f7-05e6b934e0a1-utilities\") pod \"community-operators-bbtzz\" (UID: \"d8a22a9f-2975-485c-99f7-05e6b934e0a1\") " pod="openshift-marketplace/community-operators-bbtzz"
Mar 20 15:46:35 crc kubenswrapper[4730]: I0320 15:46:35.037072    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8a22a9f-2975-485c-99f7-05e6b934e0a1-catalog-content\") pod \"community-operators-bbtzz\" (UID: \"d8a22a9f-2975-485c-99f7-05e6b934e0a1\") " pod="openshift-marketplace/community-operators-bbtzz"
Mar 20 15:46:35 crc kubenswrapper[4730]: I0320 15:46:35.037160    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw6p9\" (UniqueName: \"kubernetes.io/projected/d8a22a9f-2975-485c-99f7-05e6b934e0a1-kube-api-access-zw6p9\") pod \"community-operators-bbtzz\" (UID: \"d8a22a9f-2975-485c-99f7-05e6b934e0a1\") " pod="openshift-marketplace/community-operators-bbtzz"
Mar 20 15:46:35 crc kubenswrapper[4730]: I0320 15:46:35.037228    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8a22a9f-2975-485c-99f7-05e6b934e0a1-utilities\") pod \"community-operators-bbtzz\" (UID: \"d8a22a9f-2975-485c-99f7-05e6b934e0a1\") " pod="openshift-marketplace/community-operators-bbtzz"
Mar 20 15:46:35 crc kubenswrapper[4730]: I0320 15:46:35.038044    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8a22a9f-2975-485c-99f7-05e6b934e0a1-utilities\") pod \"community-operators-bbtzz\" (UID: \"d8a22a9f-2975-485c-99f7-05e6b934e0a1\") " pod="openshift-marketplace/community-operators-bbtzz"
Mar 20 15:46:35 crc kubenswrapper[4730]: I0320 15:46:35.038044    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8a22a9f-2975-485c-99f7-05e6b934e0a1-catalog-content\") pod \"community-operators-bbtzz\" (UID: \"d8a22a9f-2975-485c-99f7-05e6b934e0a1\") " pod="openshift-marketplace/community-operators-bbtzz"
Mar 20 15:46:35 crc kubenswrapper[4730]: I0320 15:46:35.055862    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rkhd6"]
Mar 20 15:46:35 crc kubenswrapper[4730]: I0320 15:46:35.061312    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw6p9\" (UniqueName: \"kubernetes.io/projected/d8a22a9f-2975-485c-99f7-05e6b934e0a1-kube-api-access-zw6p9\") pod \"community-operators-bbtzz\" (UID: \"d8a22a9f-2975-485c-99f7-05e6b934e0a1\") " pod="openshift-marketplace/community-operators-bbtzz"
Mar 20 15:46:35 crc kubenswrapper[4730]: W0320 15:46:35.069314    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6e8fab3_7ebb_4b3f_af2c_fcc299e01381.slice/crio-b3a96be23f133a45e47eabf32d63cbb3b3f6505ef19ae7141a386229ffe971c0 WatchSource:0}: Error finding container b3a96be23f133a45e47eabf32d63cbb3b3f6505ef19ae7141a386229ffe971c0: Status 404 returned error can't find the container with id b3a96be23f133a45e47eabf32d63cbb3b3f6505ef19ae7141a386229ffe971c0
Mar 20 15:46:35 crc kubenswrapper[4730]: I0320 15:46:35.236757    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bbtzz"
Mar 20 15:46:35 crc kubenswrapper[4730]: I0320 15:46:35.605220    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bbtzz"]
Mar 20 15:46:35 crc kubenswrapper[4730]: W0320 15:46:35.623314    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8a22a9f_2975_485c_99f7_05e6b934e0a1.slice/crio-663eae788e97fc6ca48a1450b50b94b2e512a4a113ab307d9069b46bc8aa6e1b WatchSource:0}: Error finding container 663eae788e97fc6ca48a1450b50b94b2e512a4a113ab307d9069b46bc8aa6e1b: Status 404 returned error can't find the container with id 663eae788e97fc6ca48a1450b50b94b2e512a4a113ab307d9069b46bc8aa6e1b
Mar 20 15:46:35 crc kubenswrapper[4730]: I0320 15:46:35.799314    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-mp2s2"
Mar 20 15:46:35 crc kubenswrapper[4730]: I0320 15:46:35.867197    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bdpg6"]
Mar 20 15:46:35 crc kubenswrapper[4730]: I0320 15:46:35.938008    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vk6rc" event={"ID":"70d03566-9776-4dcc-84b5-17281f8ae66e","Type":"ContainerStarted","Data":"73da7c88a758b29a10ab785c5df8fff5977b17912b7a7aa61adcd6a4298dd476"}
Mar 20 15:46:35 crc kubenswrapper[4730]: I0320 15:46:35.940428    4730 generic.go:334] "Generic (PLEG): container finished" podID="d6e8fab3-7ebb-4b3f-af2c-fcc299e01381" containerID="73ab3b31d9dda4fc31017dd8ae10b58885314deb0ed5858dbad85902b8789321" exitCode=0
Mar 20 15:46:35 crc kubenswrapper[4730]: I0320 15:46:35.940568    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rkhd6" event={"ID":"d6e8fab3-7ebb-4b3f-af2c-fcc299e01381","Type":"ContainerDied","Data":"73ab3b31d9dda4fc31017dd8ae10b58885314deb0ed5858dbad85902b8789321"}
Mar 20 15:46:35 crc kubenswrapper[4730]: I0320 15:46:35.940616    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rkhd6" event={"ID":"d6e8fab3-7ebb-4b3f-af2c-fcc299e01381","Type":"ContainerStarted","Data":"b3a96be23f133a45e47eabf32d63cbb3b3f6505ef19ae7141a386229ffe971c0"}
Mar 20 15:46:35 crc kubenswrapper[4730]: I0320 15:46:35.942776    4730 generic.go:334] "Generic (PLEG): container finished" podID="d8a22a9f-2975-485c-99f7-05e6b934e0a1" containerID="bbc862e72da6db4159a1b63077867ba68600c99429b0bebda558060a0ceca52f" exitCode=0
Mar 20 15:46:35 crc kubenswrapper[4730]: I0320 15:46:35.942860    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbtzz" event={"ID":"d8a22a9f-2975-485c-99f7-05e6b934e0a1","Type":"ContainerDied","Data":"bbc862e72da6db4159a1b63077867ba68600c99429b0bebda558060a0ceca52f"}
Mar 20 15:46:35 crc kubenswrapper[4730]: I0320 15:46:35.942933    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbtzz" event={"ID":"d8a22a9f-2975-485c-99f7-05e6b934e0a1","Type":"ContainerStarted","Data":"663eae788e97fc6ca48a1450b50b94b2e512a4a113ab307d9069b46bc8aa6e1b"}
Mar 20 15:46:36 crc kubenswrapper[4730]: I0320 15:46:36.714620    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7vhhm"]
Mar 20 15:46:36 crc kubenswrapper[4730]: I0320 15:46:36.716507    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7vhhm"
Mar 20 15:46:36 crc kubenswrapper[4730]: I0320 15:46:36.718714    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 20 15:46:36 crc kubenswrapper[4730]: I0320 15:46:36.724649    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7vhhm"]
Mar 20 15:46:36 crc kubenswrapper[4730]: I0320 15:46:36.860663    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cae6da2c-50d0-460f-b29c-5b3e3df439c5-utilities\") pod \"redhat-marketplace-7vhhm\" (UID: \"cae6da2c-50d0-460f-b29c-5b3e3df439c5\") " pod="openshift-marketplace/redhat-marketplace-7vhhm"
Mar 20 15:46:36 crc kubenswrapper[4730]: I0320 15:46:36.861418    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cae6da2c-50d0-460f-b29c-5b3e3df439c5-catalog-content\") pod \"redhat-marketplace-7vhhm\" (UID: \"cae6da2c-50d0-460f-b29c-5b3e3df439c5\") " pod="openshift-marketplace/redhat-marketplace-7vhhm"
Mar 20 15:46:36 crc kubenswrapper[4730]: I0320 15:46:36.861590    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqtlx\" (UniqueName: \"kubernetes.io/projected/cae6da2c-50d0-460f-b29c-5b3e3df439c5-kube-api-access-lqtlx\") pod \"redhat-marketplace-7vhhm\" (UID: \"cae6da2c-50d0-460f-b29c-5b3e3df439c5\") " pod="openshift-marketplace/redhat-marketplace-7vhhm"
Mar 20 15:46:36 crc kubenswrapper[4730]: I0320 15:46:36.949906    4730 generic.go:334] "Generic (PLEG): container finished" podID="70d03566-9776-4dcc-84b5-17281f8ae66e" containerID="73da7c88a758b29a10ab785c5df8fff5977b17912b7a7aa61adcd6a4298dd476" exitCode=0
Mar 20 15:46:36 crc kubenswrapper[4730]: I0320 15:46:36.949948    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vk6rc" event={"ID":"70d03566-9776-4dcc-84b5-17281f8ae66e","Type":"ContainerDied","Data":"73da7c88a758b29a10ab785c5df8fff5977b17912b7a7aa61adcd6a4298dd476"}
Mar 20 15:46:36 crc kubenswrapper[4730]: I0320 15:46:36.962742    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqtlx\" (UniqueName: \"kubernetes.io/projected/cae6da2c-50d0-460f-b29c-5b3e3df439c5-kube-api-access-lqtlx\") pod \"redhat-marketplace-7vhhm\" (UID: \"cae6da2c-50d0-460f-b29c-5b3e3df439c5\") " pod="openshift-marketplace/redhat-marketplace-7vhhm"
Mar 20 15:46:36 crc kubenswrapper[4730]: I0320 15:46:36.962893    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cae6da2c-50d0-460f-b29c-5b3e3df439c5-utilities\") pod \"redhat-marketplace-7vhhm\" (UID: \"cae6da2c-50d0-460f-b29c-5b3e3df439c5\") " pod="openshift-marketplace/redhat-marketplace-7vhhm"
Mar 20 15:46:36 crc kubenswrapper[4730]: I0320 15:46:36.962984    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cae6da2c-50d0-460f-b29c-5b3e3df439c5-catalog-content\") pod \"redhat-marketplace-7vhhm\" (UID: \"cae6da2c-50d0-460f-b29c-5b3e3df439c5\") " pod="openshift-marketplace/redhat-marketplace-7vhhm"
Mar 20 15:46:36 crc kubenswrapper[4730]: I0320 15:46:36.963363    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cae6da2c-50d0-460f-b29c-5b3e3df439c5-utilities\") pod \"redhat-marketplace-7vhhm\" (UID: \"cae6da2c-50d0-460f-b29c-5b3e3df439c5\") " pod="openshift-marketplace/redhat-marketplace-7vhhm"
Mar 20 15:46:36 crc kubenswrapper[4730]: I0320 15:46:36.963586    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cae6da2c-50d0-460f-b29c-5b3e3df439c5-catalog-content\") pod \"redhat-marketplace-7vhhm\" (UID: \"cae6da2c-50d0-460f-b29c-5b3e3df439c5\") " pod="openshift-marketplace/redhat-marketplace-7vhhm"
Mar 20 15:46:36 crc kubenswrapper[4730]: I0320 15:46:36.982841    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqtlx\" (UniqueName: \"kubernetes.io/projected/cae6da2c-50d0-460f-b29c-5b3e3df439c5-kube-api-access-lqtlx\") pod \"redhat-marketplace-7vhhm\" (UID: \"cae6da2c-50d0-460f-b29c-5b3e3df439c5\") " pod="openshift-marketplace/redhat-marketplace-7vhhm"
Mar 20 15:46:37 crc kubenswrapper[4730]: I0320 15:46:37.036320    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7vhhm"
Mar 20 15:46:37 crc kubenswrapper[4730]: I0320 15:46:37.467564    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7vhhm"]
Mar 20 15:46:37 crc kubenswrapper[4730]: W0320 15:46:37.477535    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcae6da2c_50d0_460f_b29c_5b3e3df439c5.slice/crio-37af390bdb5af33c7366157b9007569fb0d12bfaeb477babb66c301c550167b2 WatchSource:0}: Error finding container 37af390bdb5af33c7366157b9007569fb0d12bfaeb477babb66c301c550167b2: Status 404 returned error can't find the container with id 37af390bdb5af33c7366157b9007569fb0d12bfaeb477babb66c301c550167b2
Mar 20 15:46:37 crc kubenswrapper[4730]: I0320 15:46:37.956372    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vk6rc" event={"ID":"70d03566-9776-4dcc-84b5-17281f8ae66e","Type":"ContainerStarted","Data":"a02ec7173c2df409bf4c46faf3534e004b5113e620387bf44510db2eb3565d69"}
Mar 20 15:46:37 crc kubenswrapper[4730]: I0320 15:46:37.958461    4730 generic.go:334] "Generic (PLEG): container finished" podID="d6e8fab3-7ebb-4b3f-af2c-fcc299e01381" containerID="637bfd25c305324e8a040583ebdd939dd62f4fda7f8a9fc68a08ab56da1cc550" exitCode=0
Mar 20 15:46:37 crc kubenswrapper[4730]: I0320 15:46:37.958642    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rkhd6" event={"ID":"d6e8fab3-7ebb-4b3f-af2c-fcc299e01381","Type":"ContainerDied","Data":"637bfd25c305324e8a040583ebdd939dd62f4fda7f8a9fc68a08ab56da1cc550"}
Mar 20 15:46:37 crc kubenswrapper[4730]: I0320 15:46:37.959815    4730 generic.go:334] "Generic (PLEG): container finished" podID="cae6da2c-50d0-460f-b29c-5b3e3df439c5" containerID="30191a20b8e2b8605c55ff9074a4a3e62d06ac6642f2c8bccc6466ce0fcb479b" exitCode=0
Mar 20 15:46:37 crc kubenswrapper[4730]: I0320 15:46:37.959851    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vhhm" event={"ID":"cae6da2c-50d0-460f-b29c-5b3e3df439c5","Type":"ContainerDied","Data":"30191a20b8e2b8605c55ff9074a4a3e62d06ac6642f2c8bccc6466ce0fcb479b"}
Mar 20 15:46:37 crc kubenswrapper[4730]: I0320 15:46:37.959863    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vhhm" event={"ID":"cae6da2c-50d0-460f-b29c-5b3e3df439c5","Type":"ContainerStarted","Data":"37af390bdb5af33c7366157b9007569fb0d12bfaeb477babb66c301c550167b2"}
Mar 20 15:46:37 crc kubenswrapper[4730]: I0320 15:46:37.966975    4730 generic.go:334] "Generic (PLEG): container finished" podID="d8a22a9f-2975-485c-99f7-05e6b934e0a1" containerID="179bf7e47b9d226171d3a00da200d75f48331106eaf22adaada32e89e66497ee" exitCode=0
Mar 20 15:46:37 crc kubenswrapper[4730]: I0320 15:46:37.967024    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbtzz" event={"ID":"d8a22a9f-2975-485c-99f7-05e6b934e0a1","Type":"ContainerDied","Data":"179bf7e47b9d226171d3a00da200d75f48331106eaf22adaada32e89e66497ee"}
Mar 20 15:46:37 crc kubenswrapper[4730]: I0320 15:46:37.973498    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vk6rc" podStartSLOduration=2.386061558 podStartE2EDuration="5.973486161s" podCreationTimestamp="2026-03-20 15:46:32 +0000 UTC" firstStartedPulling="2026-03-20 15:46:33.92677269 +0000 UTC m=+453.140144059" lastFinishedPulling="2026-03-20 15:46:37.514197293 +0000 UTC m=+456.727568662" observedRunningTime="2026-03-20 15:46:37.973211443 +0000 UTC m=+457.186582812" watchObservedRunningTime="2026-03-20 15:46:37.973486161 +0000 UTC m=+457.186857520"
Mar 20 15:46:38 crc kubenswrapper[4730]: I0320 15:46:38.977202    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rkhd6" event={"ID":"d6e8fab3-7ebb-4b3f-af2c-fcc299e01381","Type":"ContainerStarted","Data":"88c5f5d777fda8e8b64d06a7a00c4fc6f83af7fca08eca4140f037c5ad6fa0d7"}
Mar 20 15:46:38 crc kubenswrapper[4730]: I0320 15:46:38.979421    4730 generic.go:334] "Generic (PLEG): container finished" podID="cae6da2c-50d0-460f-b29c-5b3e3df439c5" containerID="7f56571bd135700900c30c39a5d9d179d5a25296048600cd392a928e99d1e34f" exitCode=0
Mar 20 15:46:38 crc kubenswrapper[4730]: I0320 15:46:38.979483    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vhhm" event={"ID":"cae6da2c-50d0-460f-b29c-5b3e3df439c5","Type":"ContainerDied","Data":"7f56571bd135700900c30c39a5d9d179d5a25296048600cd392a928e99d1e34f"}
Mar 20 15:46:38 crc kubenswrapper[4730]: I0320 15:46:38.987520    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbtzz" event={"ID":"d8a22a9f-2975-485c-99f7-05e6b934e0a1","Type":"ContainerStarted","Data":"1ce1392e9cef60e8f158c640ba92127c0e0c1616d90fa5cf171c249de5e2237e"}
Mar 20 15:46:39 crc kubenswrapper[4730]: I0320 15:46:39.024508    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bbtzz" podStartSLOduration=2.5453811330000002 podStartE2EDuration="5.02449332s" podCreationTimestamp="2026-03-20 15:46:34 +0000 UTC" firstStartedPulling="2026-03-20 15:46:35.944373724 +0000 UTC m=+455.157745093" lastFinishedPulling="2026-03-20 15:46:38.423485901 +0000 UTC m=+457.636857280" observedRunningTime="2026-03-20 15:46:39.021604935 +0000 UTC m=+458.234976314" watchObservedRunningTime="2026-03-20 15:46:39.02449332 +0000 UTC m=+458.237864689"
Mar 20 15:46:39 crc kubenswrapper[4730]: I0320 15:46:39.035968    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rkhd6" podStartSLOduration=2.51083002 podStartE2EDuration="5.035950445s" podCreationTimestamp="2026-03-20 15:46:34 +0000 UTC" firstStartedPulling="2026-03-20 15:46:35.942072915 +0000 UTC m=+455.155444284" lastFinishedPulling="2026-03-20 15:46:38.46719334 +0000 UTC m=+457.680564709" observedRunningTime="2026-03-20 15:46:39.006777811 +0000 UTC m=+458.220149180" watchObservedRunningTime="2026-03-20 15:46:39.035950445 +0000 UTC m=+458.249321814"
Mar 20 15:46:39 crc kubenswrapper[4730]: I0320 15:46:39.996461    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vhhm" event={"ID":"cae6da2c-50d0-460f-b29c-5b3e3df439c5","Type":"ContainerStarted","Data":"f3544d80412c2df30a39271c94de8de8e78751495e2eb74202b2534f5e34b80b"}
Mar 20 15:46:40 crc kubenswrapper[4730]: I0320 15:46:40.014977    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7vhhm" podStartSLOduration=2.551161587 podStartE2EDuration="4.014960547s" podCreationTimestamp="2026-03-20 15:46:36 +0000 UTC" firstStartedPulling="2026-03-20 15:46:37.96388724 +0000 UTC m=+457.177258609" lastFinishedPulling="2026-03-20 15:46:39.4276862 +0000 UTC m=+458.641057569" observedRunningTime="2026-03-20 15:46:40.013050132 +0000 UTC m=+459.226421501" watchObservedRunningTime="2026-03-20 15:46:40.014960547 +0000 UTC m=+459.228331916"
Mar 20 15:46:42 crc kubenswrapper[4730]: I0320 15:46:42.832479    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vk6rc"
Mar 20 15:46:42 crc kubenswrapper[4730]: I0320 15:46:42.833113    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vk6rc"
Mar 20 15:46:42 crc kubenswrapper[4730]: I0320 15:46:42.880546    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 15:46:42 crc kubenswrapper[4730]: I0320 15:46:42.880596    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 15:46:43 crc kubenswrapper[4730]: I0320 15:46:43.872946    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vk6rc" podUID="70d03566-9776-4dcc-84b5-17281f8ae66e" containerName="registry-server" probeResult="failure" output=<
Mar 20 15:46:43 crc kubenswrapper[4730]:         timeout: failed to connect service ":50051" within 1s
Mar 20 15:46:43 crc kubenswrapper[4730]:  >
Mar 20 15:46:44 crc kubenswrapper[4730]: I0320 15:46:44.642102    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rkhd6"
Mar 20 15:46:44 crc kubenswrapper[4730]: I0320 15:46:44.642266    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rkhd6"
Mar 20 15:46:44 crc kubenswrapper[4730]: I0320 15:46:44.677312    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rkhd6"
Mar 20 15:46:45 crc kubenswrapper[4730]: I0320 15:46:45.059100    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rkhd6"
Mar 20 15:46:45 crc kubenswrapper[4730]: I0320 15:46:45.237777    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bbtzz"
Mar 20 15:46:45 crc kubenswrapper[4730]: I0320 15:46:45.237821    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bbtzz"
Mar 20 15:46:45 crc kubenswrapper[4730]: I0320 15:46:45.278128    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bbtzz"
Mar 20 15:46:46 crc kubenswrapper[4730]: I0320 15:46:46.072023    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bbtzz"
Mar 20 15:46:47 crc kubenswrapper[4730]: I0320 15:46:47.037353    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7vhhm"
Mar 20 15:46:47 crc kubenswrapper[4730]: I0320 15:46:47.037698    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7vhhm"
Mar 20 15:46:47 crc kubenswrapper[4730]: I0320 15:46:47.082137    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7vhhm"
Mar 20 15:46:48 crc kubenswrapper[4730]: I0320 15:46:48.080820    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7vhhm"
Mar 20 15:46:52 crc kubenswrapper[4730]: I0320 15:46:52.894237    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vk6rc"
Mar 20 15:46:52 crc kubenswrapper[4730]: I0320 15:46:52.946498    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vk6rc"
Mar 20 15:47:00 crc kubenswrapper[4730]: I0320 15:47:00.905588    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6" podUID="54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e" containerName="registry" containerID="cri-o://deb22e7f4b1084e112c47fe5c83d1b16157bc080fc85d2ff74bb28b439a9502d" gracePeriod=30
Mar 20 15:47:01 crc kubenswrapper[4730]: I0320 15:47:01.124241    4730 generic.go:334] "Generic (PLEG): container finished" podID="54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e" containerID="deb22e7f4b1084e112c47fe5c83d1b16157bc080fc85d2ff74bb28b439a9502d" exitCode=0
Mar 20 15:47:01 crc kubenswrapper[4730]: I0320 15:47:01.124328    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6" event={"ID":"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e","Type":"ContainerDied","Data":"deb22e7f4b1084e112c47fe5c83d1b16157bc080fc85d2ff74bb28b439a9502d"}
Mar 20 15:47:01 crc kubenswrapper[4730]: I0320 15:47:01.347103    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:47:01 crc kubenswrapper[4730]: I0320 15:47:01.496935    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-ca-trust-extracted\") pod \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") "
Mar 20 15:47:01 crc kubenswrapper[4730]: I0320 15:47:01.496995    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-bound-sa-token\") pod \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") "
Mar 20 15:47:01 crc kubenswrapper[4730]: I0320 15:47:01.497039    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-registry-certificates\") pod \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") "
Mar 20 15:47:01 crc kubenswrapper[4730]: I0320 15:47:01.497088    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-trusted-ca\") pod \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") "
Mar 20 15:47:01 crc kubenswrapper[4730]: I0320 15:47:01.497134    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l78n7\" (UniqueName: \"kubernetes.io/projected/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-kube-api-access-l78n7\") pod \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") "
Mar 20 15:47:01 crc kubenswrapper[4730]: I0320 15:47:01.497161    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-registry-tls\") pod \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") "
Mar 20 15:47:01 crc kubenswrapper[4730]: I0320 15:47:01.497200    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-installation-pull-secrets\") pod \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") "
Mar 20 15:47:01 crc kubenswrapper[4730]: I0320 15:47:01.497498    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\" (UID: \"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e\") "
Mar 20 15:47:01 crc kubenswrapper[4730]: I0320 15:47:01.498757    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:47:01 crc kubenswrapper[4730]: I0320 15:47:01.499893    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:47:01 crc kubenswrapper[4730]: I0320 15:47:01.508897    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 20 15:47:01 crc kubenswrapper[4730]: I0320 15:47:01.516483    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:47:01 crc kubenswrapper[4730]: I0320 15:47:01.516659    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:47:01 crc kubenswrapper[4730]: I0320 15:47:01.516875    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-kube-api-access-l78n7" (OuterVolumeSpecName: "kube-api-access-l78n7") pod "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e"). InnerVolumeSpecName "kube-api-access-l78n7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:47:01 crc kubenswrapper[4730]: I0320 15:47:01.517158    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:47:01 crc kubenswrapper[4730]: I0320 15:47:01.517723    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e" (UID: "54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:47:01 crc kubenswrapper[4730]: I0320 15:47:01.599091    4730 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:47:01 crc kubenswrapper[4730]: I0320 15:47:01.599135    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l78n7\" (UniqueName: \"kubernetes.io/projected/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-kube-api-access-l78n7\") on node \"crc\" DevicePath \"\""
Mar 20 15:47:01 crc kubenswrapper[4730]: I0320 15:47:01.599148    4730 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 20 15:47:01 crc kubenswrapper[4730]: I0320 15:47:01.599160    4730 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 20 15:47:01 crc kubenswrapper[4730]: I0320 15:47:01.599171    4730 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 20 15:47:01 crc kubenswrapper[4730]: I0320 15:47:01.599183    4730 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 20 15:47:01 crc kubenswrapper[4730]: I0320 15:47:01.599194    4730 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 20 15:47:02 crc kubenswrapper[4730]: I0320 15:47:02.132027    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6" event={"ID":"54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e","Type":"ContainerDied","Data":"2585ec9b7501bfbd7ee8fe496b92574cacdd42248f1dc5fb5a3a27e92aa35714"}
Mar 20 15:47:02 crc kubenswrapper[4730]: I0320 15:47:02.132082    4730 scope.go:117] "RemoveContainer" containerID="deb22e7f4b1084e112c47fe5c83d1b16157bc080fc85d2ff74bb28b439a9502d"
Mar 20 15:47:02 crc kubenswrapper[4730]: I0320 15:47:02.132087    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bdpg6"
Mar 20 15:47:02 crc kubenswrapper[4730]: I0320 15:47:02.151481    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bdpg6"]
Mar 20 15:47:02 crc kubenswrapper[4730]: I0320 15:47:02.157162    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bdpg6"]
Mar 20 15:47:03 crc kubenswrapper[4730]: I0320 15:47:03.541664    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e" path="/var/lib/kubelet/pods/54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e/volumes"
Mar 20 15:47:05 crc kubenswrapper[4730]: I0320 15:47:05.837651    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 15:47:12 crc kubenswrapper[4730]: I0320 15:47:12.880097    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 15:47:12 crc kubenswrapper[4730]: I0320 15:47:12.880880    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 15:47:42 crc kubenswrapper[4730]: I0320 15:47:42.880228    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 15:47:42 crc kubenswrapper[4730]: I0320 15:47:42.880806    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 15:47:42 crc kubenswrapper[4730]: I0320 15:47:42.880853    4730 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf"
Mar 20 15:47:42 crc kubenswrapper[4730]: I0320 15:47:42.881387    4730 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"418b64bd31efa72e03b6036c281348bfc6e1d5be086f3887fe653df9e0316583"} pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 15:47:42 crc kubenswrapper[4730]: I0320 15:47:42.881440    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" containerID="cri-o://418b64bd31efa72e03b6036c281348bfc6e1d5be086f3887fe653df9e0316583" gracePeriod=600
Mar 20 15:47:43 crc kubenswrapper[4730]: I0320 15:47:43.351931    4730 generic.go:334] "Generic (PLEG): container finished" podID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerID="418b64bd31efa72e03b6036c281348bfc6e1d5be086f3887fe653df9e0316583" exitCode=0
Mar 20 15:47:43 crc kubenswrapper[4730]: I0320 15:47:43.351985    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerDied","Data":"418b64bd31efa72e03b6036c281348bfc6e1d5be086f3887fe653df9e0316583"}
Mar 20 15:47:43 crc kubenswrapper[4730]: I0320 15:47:43.352336    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerStarted","Data":"44f44ed17252feb14ca678b8fd7bddf96639b37f5ddb8303898a1167aa46bf9c"}
Mar 20 15:47:43 crc kubenswrapper[4730]: I0320 15:47:43.352361    4730 scope.go:117] "RemoveContainer" containerID="cd661cda796a2bd61d1446bee672c4471d60370245caf7cb54faf54dfa9c58a0"
Mar 20 15:48:00 crc kubenswrapper[4730]: I0320 15:48:00.129645    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567028-s6xcp"]
Mar 20 15:48:00 crc kubenswrapper[4730]: E0320 15:48:00.130380    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e" containerName="registry"
Mar 20 15:48:00 crc kubenswrapper[4730]: I0320 15:48:00.130394    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e" containerName="registry"
Mar 20 15:48:00 crc kubenswrapper[4730]: I0320 15:48:00.130497    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="54319e7c-f09f-4f3d-80cd-8d6dcd4ef88e" containerName="registry"
Mar 20 15:48:00 crc kubenswrapper[4730]: I0320 15:48:00.130911    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567028-s6xcp"
Mar 20 15:48:00 crc kubenswrapper[4730]: I0320 15:48:00.133422    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 15:48:00 crc kubenswrapper[4730]: I0320 15:48:00.133580    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 15:48:00 crc kubenswrapper[4730]: I0320 15:48:00.133607    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 15:48:00 crc kubenswrapper[4730]: I0320 15:48:00.142968    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567028-s6xcp"]
Mar 20 15:48:00 crc kubenswrapper[4730]: I0320 15:48:00.240172    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-578cc\" (UniqueName: \"kubernetes.io/projected/e56ca246-99ac-4397-a499-62738ac94a39-kube-api-access-578cc\") pod \"auto-csr-approver-29567028-s6xcp\" (UID: \"e56ca246-99ac-4397-a499-62738ac94a39\") " pod="openshift-infra/auto-csr-approver-29567028-s6xcp"
Mar 20 15:48:00 crc kubenswrapper[4730]: I0320 15:48:00.341791    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-578cc\" (UniqueName: \"kubernetes.io/projected/e56ca246-99ac-4397-a499-62738ac94a39-kube-api-access-578cc\") pod \"auto-csr-approver-29567028-s6xcp\" (UID: \"e56ca246-99ac-4397-a499-62738ac94a39\") " pod="openshift-infra/auto-csr-approver-29567028-s6xcp"
Mar 20 15:48:00 crc kubenswrapper[4730]: I0320 15:48:00.360197    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-578cc\" (UniqueName: \"kubernetes.io/projected/e56ca246-99ac-4397-a499-62738ac94a39-kube-api-access-578cc\") pod \"auto-csr-approver-29567028-s6xcp\" (UID: \"e56ca246-99ac-4397-a499-62738ac94a39\") " pod="openshift-infra/auto-csr-approver-29567028-s6xcp"
Mar 20 15:48:00 crc kubenswrapper[4730]: I0320 15:48:00.448155    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567028-s6xcp"
Mar 20 15:48:01 crc kubenswrapper[4730]: I0320 15:48:01.488165    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567028-s6xcp"]
Mar 20 15:48:01 crc kubenswrapper[4730]: I0320 15:48:01.495936    4730 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 15:48:02 crc kubenswrapper[4730]: I0320 15:48:02.454178    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567028-s6xcp" event={"ID":"e56ca246-99ac-4397-a499-62738ac94a39","Type":"ContainerStarted","Data":"5141a5c6a1cf5183972f40437d284ec8b2b455a48c23c168fde813932fc800eb"}
Mar 20 15:48:03 crc kubenswrapper[4730]: I0320 15:48:03.462554    4730 generic.go:334] "Generic (PLEG): container finished" podID="e56ca246-99ac-4397-a499-62738ac94a39" containerID="44d76ae85c164cacb0e0982473fd32dc59c0d37d2af3868ef4b22b1a51c8b024" exitCode=0
Mar 20 15:48:03 crc kubenswrapper[4730]: I0320 15:48:03.462666    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567028-s6xcp" event={"ID":"e56ca246-99ac-4397-a499-62738ac94a39","Type":"ContainerDied","Data":"44d76ae85c164cacb0e0982473fd32dc59c0d37d2af3868ef4b22b1a51c8b024"}
Mar 20 15:48:04 crc kubenswrapper[4730]: I0320 15:48:04.718281    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567028-s6xcp"
Mar 20 15:48:04 crc kubenswrapper[4730]: I0320 15:48:04.798801    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-578cc\" (UniqueName: \"kubernetes.io/projected/e56ca246-99ac-4397-a499-62738ac94a39-kube-api-access-578cc\") pod \"e56ca246-99ac-4397-a499-62738ac94a39\" (UID: \"e56ca246-99ac-4397-a499-62738ac94a39\") "
Mar 20 15:48:04 crc kubenswrapper[4730]: I0320 15:48:04.804437    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e56ca246-99ac-4397-a499-62738ac94a39-kube-api-access-578cc" (OuterVolumeSpecName: "kube-api-access-578cc") pod "e56ca246-99ac-4397-a499-62738ac94a39" (UID: "e56ca246-99ac-4397-a499-62738ac94a39"). InnerVolumeSpecName "kube-api-access-578cc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:48:04 crc kubenswrapper[4730]: I0320 15:48:04.900658    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-578cc\" (UniqueName: \"kubernetes.io/projected/e56ca246-99ac-4397-a499-62738ac94a39-kube-api-access-578cc\") on node \"crc\" DevicePath \"\""
Mar 20 15:48:05 crc kubenswrapper[4730]: I0320 15:48:05.475380    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567028-s6xcp" event={"ID":"e56ca246-99ac-4397-a499-62738ac94a39","Type":"ContainerDied","Data":"5141a5c6a1cf5183972f40437d284ec8b2b455a48c23c168fde813932fc800eb"}
Mar 20 15:48:05 crc kubenswrapper[4730]: I0320 15:48:05.475418    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5141a5c6a1cf5183972f40437d284ec8b2b455a48c23c168fde813932fc800eb"
Mar 20 15:48:05 crc kubenswrapper[4730]: I0320 15:48:05.475466    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567028-s6xcp"
Mar 20 15:48:05 crc kubenswrapper[4730]: I0320 15:48:05.777648    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567022-wf5nv"]
Mar 20 15:48:05 crc kubenswrapper[4730]: I0320 15:48:05.781340    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567022-wf5nv"]
Mar 20 15:48:07 crc kubenswrapper[4730]: I0320 15:48:07.541266    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d87adfe-3206-4175-8d8f-5a00015cc61e" path="/var/lib/kubelet/pods/7d87adfe-3206-4175-8d8f-5a00015cc61e/volumes"
Mar 20 15:50:00 crc kubenswrapper[4730]: I0320 15:50:00.142107    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567030-pwdln"]
Mar 20 15:50:00 crc kubenswrapper[4730]: E0320 15:50:00.142821    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e56ca246-99ac-4397-a499-62738ac94a39" containerName="oc"
Mar 20 15:50:00 crc kubenswrapper[4730]: I0320 15:50:00.142832    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="e56ca246-99ac-4397-a499-62738ac94a39" containerName="oc"
Mar 20 15:50:00 crc kubenswrapper[4730]: I0320 15:50:00.142932    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="e56ca246-99ac-4397-a499-62738ac94a39" containerName="oc"
Mar 20 15:50:00 crc kubenswrapper[4730]: I0320 15:50:00.143341    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567030-pwdln"
Mar 20 15:50:00 crc kubenswrapper[4730]: I0320 15:50:00.144873    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 15:50:00 crc kubenswrapper[4730]: I0320 15:50:00.145300    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 15:50:00 crc kubenswrapper[4730]: I0320 15:50:00.145531    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8cck\" (UniqueName: \"kubernetes.io/projected/b7dcd73b-be94-4b96-b001-593d2fd56aa3-kube-api-access-d8cck\") pod \"auto-csr-approver-29567030-pwdln\" (UID: \"b7dcd73b-be94-4b96-b001-593d2fd56aa3\") " pod="openshift-infra/auto-csr-approver-29567030-pwdln"
Mar 20 15:50:00 crc kubenswrapper[4730]: I0320 15:50:00.145710    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 15:50:00 crc kubenswrapper[4730]: I0320 15:50:00.151327    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567030-pwdln"]
Mar 20 15:50:00 crc kubenswrapper[4730]: I0320 15:50:00.246522    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8cck\" (UniqueName: \"kubernetes.io/projected/b7dcd73b-be94-4b96-b001-593d2fd56aa3-kube-api-access-d8cck\") pod \"auto-csr-approver-29567030-pwdln\" (UID: \"b7dcd73b-be94-4b96-b001-593d2fd56aa3\") " pod="openshift-infra/auto-csr-approver-29567030-pwdln"
Mar 20 15:50:00 crc kubenswrapper[4730]: I0320 15:50:00.276437    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8cck\" (UniqueName: \"kubernetes.io/projected/b7dcd73b-be94-4b96-b001-593d2fd56aa3-kube-api-access-d8cck\") pod \"auto-csr-approver-29567030-pwdln\" (UID: \"b7dcd73b-be94-4b96-b001-593d2fd56aa3\") " pod="openshift-infra/auto-csr-approver-29567030-pwdln"
Mar 20 15:50:00 crc kubenswrapper[4730]: I0320 15:50:00.459556    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567030-pwdln"
Mar 20 15:50:00 crc kubenswrapper[4730]: I0320 15:50:00.637866    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567030-pwdln"]
Mar 20 15:50:01 crc kubenswrapper[4730]: I0320 15:50:01.184466    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567030-pwdln" event={"ID":"b7dcd73b-be94-4b96-b001-593d2fd56aa3","Type":"ContainerStarted","Data":"1e4947204f122f48f44729439abc35d708180684d597470f1ef2ebb11d1aef36"}
Mar 20 15:50:03 crc kubenswrapper[4730]: I0320 15:50:03.200049    4730 generic.go:334] "Generic (PLEG): container finished" podID="b7dcd73b-be94-4b96-b001-593d2fd56aa3" containerID="db53fcef559ab1b37329ca537473be13177cc4e3055c12b3c5b8536921ff4616" exitCode=0
Mar 20 15:50:03 crc kubenswrapper[4730]: I0320 15:50:03.200140    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567030-pwdln" event={"ID":"b7dcd73b-be94-4b96-b001-593d2fd56aa3","Type":"ContainerDied","Data":"db53fcef559ab1b37329ca537473be13177cc4e3055c12b3c5b8536921ff4616"}
Mar 20 15:50:04 crc kubenswrapper[4730]: I0320 15:50:04.393211    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567030-pwdln"
Mar 20 15:50:04 crc kubenswrapper[4730]: I0320 15:50:04.593854    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8cck\" (UniqueName: \"kubernetes.io/projected/b7dcd73b-be94-4b96-b001-593d2fd56aa3-kube-api-access-d8cck\") pod \"b7dcd73b-be94-4b96-b001-593d2fd56aa3\" (UID: \"b7dcd73b-be94-4b96-b001-593d2fd56aa3\") "
Mar 20 15:50:04 crc kubenswrapper[4730]: I0320 15:50:04.600453    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7dcd73b-be94-4b96-b001-593d2fd56aa3-kube-api-access-d8cck" (OuterVolumeSpecName: "kube-api-access-d8cck") pod "b7dcd73b-be94-4b96-b001-593d2fd56aa3" (UID: "b7dcd73b-be94-4b96-b001-593d2fd56aa3"). InnerVolumeSpecName "kube-api-access-d8cck". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:50:04 crc kubenswrapper[4730]: I0320 15:50:04.694750    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8cck\" (UniqueName: \"kubernetes.io/projected/b7dcd73b-be94-4b96-b001-593d2fd56aa3-kube-api-access-d8cck\") on node \"crc\" DevicePath \"\""
Mar 20 15:50:05 crc kubenswrapper[4730]: I0320 15:50:05.212054    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567030-pwdln" event={"ID":"b7dcd73b-be94-4b96-b001-593d2fd56aa3","Type":"ContainerDied","Data":"1e4947204f122f48f44729439abc35d708180684d597470f1ef2ebb11d1aef36"}
Mar 20 15:50:05 crc kubenswrapper[4730]: I0320 15:50:05.212086    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567030-pwdln"
Mar 20 15:50:05 crc kubenswrapper[4730]: I0320 15:50:05.212091    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e4947204f122f48f44729439abc35d708180684d597470f1ef2ebb11d1aef36"
Mar 20 15:50:05 crc kubenswrapper[4730]: I0320 15:50:05.441685    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567024-s2r9c"]
Mar 20 15:50:05 crc kubenswrapper[4730]: I0320 15:50:05.444706    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567024-s2r9c"]
Mar 20 15:50:05 crc kubenswrapper[4730]: I0320 15:50:05.541024    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f093381-3bf4-49ff-beb4-f44aa012c521" path="/var/lib/kubelet/pods/3f093381-3bf4-49ff-beb4-f44aa012c521/volumes"
Mar 20 15:50:12 crc kubenswrapper[4730]: I0320 15:50:12.880504    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 15:50:12 crc kubenswrapper[4730]: I0320 15:50:12.881076    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 15:50:28 crc kubenswrapper[4730]: I0320 15:50:28.212733    4730 scope.go:117] "RemoveContainer" containerID="bb5b04ddf5d3880ba3c77fa4e7069bd85e272160b1890e28a7de00d43e3a9f9e"
Mar 20 15:50:28 crc kubenswrapper[4730]: I0320 15:50:28.251729    4730 scope.go:117] "RemoveContainer" containerID="6ba1acd4b6440038c4d2f11f36de1734bab2b24cdd1e2d4018cd0e97b421d598"
Mar 20 15:50:42 crc kubenswrapper[4730]: I0320 15:50:42.880717    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 15:50:42 crc kubenswrapper[4730]: I0320 15:50:42.881277    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 15:51:12 crc kubenswrapper[4730]: I0320 15:51:12.880275    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 15:51:12 crc kubenswrapper[4730]: I0320 15:51:12.882082    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 15:51:12 crc kubenswrapper[4730]: I0320 15:51:12.882137    4730 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf"
Mar 20 15:51:12 crc kubenswrapper[4730]: I0320 15:51:12.882863    4730 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"44f44ed17252feb14ca678b8fd7bddf96639b37f5ddb8303898a1167aa46bf9c"} pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 15:51:12 crc kubenswrapper[4730]: I0320 15:51:12.882915    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" containerID="cri-o://44f44ed17252feb14ca678b8fd7bddf96639b37f5ddb8303898a1167aa46bf9c" gracePeriod=600
Mar 20 15:51:13 crc kubenswrapper[4730]: I0320 15:51:13.615043    4730 generic.go:334] "Generic (PLEG): container finished" podID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerID="44f44ed17252feb14ca678b8fd7bddf96639b37f5ddb8303898a1167aa46bf9c" exitCode=0
Mar 20 15:51:13 crc kubenswrapper[4730]: I0320 15:51:13.615155    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerDied","Data":"44f44ed17252feb14ca678b8fd7bddf96639b37f5ddb8303898a1167aa46bf9c"}
Mar 20 15:51:13 crc kubenswrapper[4730]: I0320 15:51:13.615608    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerStarted","Data":"4969adb306e949f48cbf48ac9e1452830c3458afd1750aa781060e2cc0952393"}
Mar 20 15:51:13 crc kubenswrapper[4730]: I0320 15:51:13.615633    4730 scope.go:117] "RemoveContainer" containerID="418b64bd31efa72e03b6036c281348bfc6e1d5be086f3887fe653df9e0316583"
Mar 20 15:51:45 crc kubenswrapper[4730]: I0320 15:51:45.816764    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-dwg9x"]
Mar 20 15:51:45 crc kubenswrapper[4730]: E0320 15:51:45.817542    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7dcd73b-be94-4b96-b001-593d2fd56aa3" containerName="oc"
Mar 20 15:51:45 crc kubenswrapper[4730]: I0320 15:51:45.817554    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7dcd73b-be94-4b96-b001-593d2fd56aa3" containerName="oc"
Mar 20 15:51:45 crc kubenswrapper[4730]: I0320 15:51:45.818097    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7dcd73b-be94-4b96-b001-593d2fd56aa3" containerName="oc"
Mar 20 15:51:45 crc kubenswrapper[4730]: I0320 15:51:45.820741    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-dwg9x"
Mar 20 15:51:45 crc kubenswrapper[4730]: I0320 15:51:45.823720    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-89r9d"]
Mar 20 15:51:45 crc kubenswrapper[4730]: I0320 15:51:45.824602    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-89r9d"
Mar 20 15:51:45 crc kubenswrapper[4730]: I0320 15:51:45.841118    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Mar 20 15:51:45 crc kubenswrapper[4730]: I0320 15:51:45.841348    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Mar 20 15:51:45 crc kubenswrapper[4730]: I0320 15:51:45.842960    4730 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-q7m7b"
Mar 20 15:51:45 crc kubenswrapper[4730]: I0320 15:51:45.843398    4730 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-m9shs"
Mar 20 15:51:45 crc kubenswrapper[4730]: I0320 15:51:45.849634    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-dwg9x"]
Mar 20 15:51:45 crc kubenswrapper[4730]: I0320 15:51:45.853196    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-89r9d"]
Mar 20 15:51:45 crc kubenswrapper[4730]: I0320 15:51:45.869574    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-qcz52"]
Mar 20 15:51:45 crc kubenswrapper[4730]: I0320 15:51:45.870335    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-qcz52"
Mar 20 15:51:45 crc kubenswrapper[4730]: I0320 15:51:45.872310    4730 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-wg4bg"
Mar 20 15:51:45 crc kubenswrapper[4730]: I0320 15:51:45.874344    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-qcz52"]
Mar 20 15:51:45 crc kubenswrapper[4730]: I0320 15:51:45.904859    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9t96\" (UniqueName: \"kubernetes.io/projected/096957e4-5a35-42f7-adf0-cac7672589a4-kube-api-access-s9t96\") pod \"cert-manager-858654f9db-dwg9x\" (UID: \"096957e4-5a35-42f7-adf0-cac7672589a4\") " pod="cert-manager/cert-manager-858654f9db-dwg9x"
Mar 20 15:51:45 crc kubenswrapper[4730]: I0320 15:51:45.904980    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7rfm\" (UniqueName: \"kubernetes.io/projected/b59581d5-071c-4764-9ef6-50ea4724e0a6-kube-api-access-d7rfm\") pod \"cert-manager-cainjector-cf98fcc89-89r9d\" (UID: \"b59581d5-071c-4764-9ef6-50ea4724e0a6\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-89r9d"
Mar 20 15:51:45 crc kubenswrapper[4730]: I0320 15:51:45.905019    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms66r\" (UniqueName: \"kubernetes.io/projected/e7c6b209-7bad-4eb0-b8d0-61a602be9b89-kube-api-access-ms66r\") pod \"cert-manager-webhook-687f57d79b-qcz52\" (UID: \"e7c6b209-7bad-4eb0-b8d0-61a602be9b89\") " pod="cert-manager/cert-manager-webhook-687f57d79b-qcz52"
Mar 20 15:51:46 crc kubenswrapper[4730]: I0320 15:51:46.005782    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9t96\" (UniqueName: \"kubernetes.io/projected/096957e4-5a35-42f7-adf0-cac7672589a4-kube-api-access-s9t96\") pod \"cert-manager-858654f9db-dwg9x\" (UID: \"096957e4-5a35-42f7-adf0-cac7672589a4\") " pod="cert-manager/cert-manager-858654f9db-dwg9x"
Mar 20 15:51:46 crc kubenswrapper[4730]: I0320 15:51:46.005847    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7rfm\" (UniqueName: \"kubernetes.io/projected/b59581d5-071c-4764-9ef6-50ea4724e0a6-kube-api-access-d7rfm\") pod \"cert-manager-cainjector-cf98fcc89-89r9d\" (UID: \"b59581d5-071c-4764-9ef6-50ea4724e0a6\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-89r9d"
Mar 20 15:51:46 crc kubenswrapper[4730]: I0320 15:51:46.005876    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms66r\" (UniqueName: \"kubernetes.io/projected/e7c6b209-7bad-4eb0-b8d0-61a602be9b89-kube-api-access-ms66r\") pod \"cert-manager-webhook-687f57d79b-qcz52\" (UID: \"e7c6b209-7bad-4eb0-b8d0-61a602be9b89\") " pod="cert-manager/cert-manager-webhook-687f57d79b-qcz52"
Mar 20 15:51:46 crc kubenswrapper[4730]: I0320 15:51:46.025166    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms66r\" (UniqueName: \"kubernetes.io/projected/e7c6b209-7bad-4eb0-b8d0-61a602be9b89-kube-api-access-ms66r\") pod \"cert-manager-webhook-687f57d79b-qcz52\" (UID: \"e7c6b209-7bad-4eb0-b8d0-61a602be9b89\") " pod="cert-manager/cert-manager-webhook-687f57d79b-qcz52"
Mar 20 15:51:46 crc kubenswrapper[4730]: I0320 15:51:46.026773    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7rfm\" (UniqueName: \"kubernetes.io/projected/b59581d5-071c-4764-9ef6-50ea4724e0a6-kube-api-access-d7rfm\") pod \"cert-manager-cainjector-cf98fcc89-89r9d\" (UID: \"b59581d5-071c-4764-9ef6-50ea4724e0a6\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-89r9d"
Mar 20 15:51:46 crc kubenswrapper[4730]: I0320 15:51:46.028555    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9t96\" (UniqueName: \"kubernetes.io/projected/096957e4-5a35-42f7-adf0-cac7672589a4-kube-api-access-s9t96\") pod \"cert-manager-858654f9db-dwg9x\" (UID: \"096957e4-5a35-42f7-adf0-cac7672589a4\") " pod="cert-manager/cert-manager-858654f9db-dwg9x"
Mar 20 15:51:46 crc kubenswrapper[4730]: I0320 15:51:46.155611    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-dwg9x"
Mar 20 15:51:46 crc kubenswrapper[4730]: I0320 15:51:46.169985    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-89r9d"
Mar 20 15:51:46 crc kubenswrapper[4730]: I0320 15:51:46.184839    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-qcz52"
Mar 20 15:51:46 crc kubenswrapper[4730]: I0320 15:51:46.414081    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-dwg9x"]
Mar 20 15:51:46 crc kubenswrapper[4730]: I0320 15:51:46.445938    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-89r9d"]
Mar 20 15:51:46 crc kubenswrapper[4730]: W0320 15:51:46.451156    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb59581d5_071c_4764_9ef6_50ea4724e0a6.slice/crio-d11570b1056c2b825ceaefd7e2f85c8dafee33ffe1d2fdee038e11a1e280a3c8 WatchSource:0}: Error finding container d11570b1056c2b825ceaefd7e2f85c8dafee33ffe1d2fdee038e11a1e280a3c8: Status 404 returned error can't find the container with id d11570b1056c2b825ceaefd7e2f85c8dafee33ffe1d2fdee038e11a1e280a3c8
Mar 20 15:51:46 crc kubenswrapper[4730]: I0320 15:51:46.692270    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-qcz52"]
Mar 20 15:51:46 crc kubenswrapper[4730]: W0320 15:51:46.695196    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7c6b209_7bad_4eb0_b8d0_61a602be9b89.slice/crio-9e4dfe5d7fed48bf606dc19a5dea601ea0a47571018fefdbe959bc2fafaab550 WatchSource:0}: Error finding container 9e4dfe5d7fed48bf606dc19a5dea601ea0a47571018fefdbe959bc2fafaab550: Status 404 returned error can't find the container with id 9e4dfe5d7fed48bf606dc19a5dea601ea0a47571018fefdbe959bc2fafaab550
Mar 20 15:51:46 crc kubenswrapper[4730]: I0320 15:51:46.805998    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-dwg9x" event={"ID":"096957e4-5a35-42f7-adf0-cac7672589a4","Type":"ContainerStarted","Data":"c59a935482eb9ff1774c6387b00890ba9f481b157706bdf808afd32ad2202efc"}
Mar 20 15:51:46 crc kubenswrapper[4730]: I0320 15:51:46.807564    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-qcz52" event={"ID":"e7c6b209-7bad-4eb0-b8d0-61a602be9b89","Type":"ContainerStarted","Data":"9e4dfe5d7fed48bf606dc19a5dea601ea0a47571018fefdbe959bc2fafaab550"}
Mar 20 15:51:46 crc kubenswrapper[4730]: I0320 15:51:46.808816    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-89r9d" event={"ID":"b59581d5-071c-4764-9ef6-50ea4724e0a6","Type":"ContainerStarted","Data":"d11570b1056c2b825ceaefd7e2f85c8dafee33ffe1d2fdee038e11a1e280a3c8"}
Mar 20 15:51:49 crc kubenswrapper[4730]: I0320 15:51:49.824680    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-qcz52" event={"ID":"e7c6b209-7bad-4eb0-b8d0-61a602be9b89","Type":"ContainerStarted","Data":"bea517557beebe1c6e1691c7e382b6d8fcb55218b38c9f69540e207508db5d57"}
Mar 20 15:51:49 crc kubenswrapper[4730]: I0320 15:51:49.825290    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-qcz52"
Mar 20 15:51:49 crc kubenswrapper[4730]: I0320 15:51:49.847274    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-qcz52" podStartSLOduration=2.733959531 podStartE2EDuration="4.847239206s" podCreationTimestamp="2026-03-20 15:51:45 +0000 UTC" firstStartedPulling="2026-03-20 15:51:46.697675666 +0000 UTC m=+765.911047045" lastFinishedPulling="2026-03-20 15:51:48.810955341 +0000 UTC m=+768.024326720" observedRunningTime="2026-03-20 15:51:49.840534934 +0000 UTC m=+769.053906303" watchObservedRunningTime="2026-03-20 15:51:49.847239206 +0000 UTC m=+769.060610575"
Mar 20 15:51:50 crc kubenswrapper[4730]: I0320 15:51:50.835001    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-89r9d" event={"ID":"b59581d5-071c-4764-9ef6-50ea4724e0a6","Type":"ContainerStarted","Data":"599e9bd6946d3e853b45c14376407263951b971834c819015e7d19ed7c52ba7c"}
Mar 20 15:51:50 crc kubenswrapper[4730]: I0320 15:51:50.838702    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-dwg9x" event={"ID":"096957e4-5a35-42f7-adf0-cac7672589a4","Type":"ContainerStarted","Data":"aadceb6e1a5ca740041d23022aa81e2db5c6c81b12d873b540355ae4258b69e2"}
Mar 20 15:51:50 crc kubenswrapper[4730]: I0320 15:51:50.859166    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-89r9d" podStartSLOduration=2.235173188 podStartE2EDuration="5.859149963s" podCreationTimestamp="2026-03-20 15:51:45 +0000 UTC" firstStartedPulling="2026-03-20 15:51:46.452955629 +0000 UTC m=+765.666326998" lastFinishedPulling="2026-03-20 15:51:50.076932404 +0000 UTC m=+769.290303773" observedRunningTime="2026-03-20 15:51:50.857303491 +0000 UTC m=+770.070674870" watchObservedRunningTime="2026-03-20 15:51:50.859149963 +0000 UTC m=+770.072521332"
Mar 20 15:51:50 crc kubenswrapper[4730]: I0320 15:51:50.881628    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-dwg9x" podStartSLOduration=2.170115998 podStartE2EDuration="5.881597135s" podCreationTimestamp="2026-03-20 15:51:45 +0000 UTC" firstStartedPulling="2026-03-20 15:51:46.413672896 +0000 UTC m=+765.627044265" lastFinishedPulling="2026-03-20 15:51:50.125154033 +0000 UTC m=+769.338525402" observedRunningTime="2026-03-20 15:51:50.880327239 +0000 UTC m=+770.093698648" watchObservedRunningTime="2026-03-20 15:51:50.881597135 +0000 UTC m=+770.094968564"
Mar 20 15:51:54 crc kubenswrapper[4730]: I0320 15:51:54.885548    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qj97f"]
Mar 20 15:51:54 crc kubenswrapper[4730]: I0320 15:51:54.887439    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="ovn-controller" containerID="cri-o://31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c" gracePeriod=30
Mar 20 15:51:54 crc kubenswrapper[4730]: I0320 15:51:54.887517    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="nbdb" containerID="cri-o://d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01" gracePeriod=30
Mar 20 15:51:54 crc kubenswrapper[4730]: I0320 15:51:54.887608    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="northd" containerID="cri-o://462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c" gracePeriod=30
Mar 20 15:51:54 crc kubenswrapper[4730]: I0320 15:51:54.887723    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="sbdb" containerID="cri-o://43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f" gracePeriod=30
Mar 20 15:51:54 crc kubenswrapper[4730]: I0320 15:51:54.887741    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="ovn-acl-logging" containerID="cri-o://b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db" gracePeriod=30
Mar 20 15:51:54 crc kubenswrapper[4730]: I0320 15:51:54.887810    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d" gracePeriod=30
Mar 20 15:51:54 crc kubenswrapper[4730]: I0320 15:51:54.887677    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="kube-rbac-proxy-node" containerID="cri-o://e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91" gracePeriod=30
Mar 20 15:51:54 crc kubenswrapper[4730]: I0320 15:51:54.930844    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="ovnkube-controller" containerID="cri-o://35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971" gracePeriod=30
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.228327    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj97f_c4b4e0e8-af33-491e-b1d1-31079d90c656/ovnkube-controller/3.log"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.230478    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj97f_c4b4e0e8-af33-491e-b1d1-31079d90c656/ovn-acl-logging/0.log"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.230957    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj97f_c4b4e0e8-af33-491e-b1d1-31079d90c656/ovn-controller/0.log"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.231411    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.248051    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4b4e0e8-af33-491e-b1d1-31079d90c656-env-overrides\") pod \"c4b4e0e8-af33-491e-b1d1-31079d90c656\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") "
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.248087    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-log-socket\") pod \"c4b4e0e8-af33-491e-b1d1-31079d90c656\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") "
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.248112    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-cni-bin\") pod \"c4b4e0e8-af33-491e-b1d1-31079d90c656\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") "
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.248155    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-cni-netd\") pod \"c4b4e0e8-af33-491e-b1d1-31079d90c656\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") "
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.248176    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mz64b\" (UniqueName: \"kubernetes.io/projected/c4b4e0e8-af33-491e-b1d1-31079d90c656-kube-api-access-mz64b\") pod \"c4b4e0e8-af33-491e-b1d1-31079d90c656\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") "
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.248191    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-node-log\") pod \"c4b4e0e8-af33-491e-b1d1-31079d90c656\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") "
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.248211    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-run-netns\") pod \"c4b4e0e8-af33-491e-b1d1-31079d90c656\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") "
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.248234    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-systemd-units\") pod \"c4b4e0e8-af33-491e-b1d1-31079d90c656\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") "
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.248275    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-run-systemd\") pod \"c4b4e0e8-af33-491e-b1d1-31079d90c656\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") "
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.248300    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c4b4e0e8-af33-491e-b1d1-31079d90c656-ovn-node-metrics-cert\") pod \"c4b4e0e8-af33-491e-b1d1-31079d90c656\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") "
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.248331    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-slash\") pod \"c4b4e0e8-af33-491e-b1d1-31079d90c656\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") "
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.248351    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-run-ovn-kubernetes\") pod \"c4b4e0e8-af33-491e-b1d1-31079d90c656\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") "
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.248364    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-run-openvswitch\") pod \"c4b4e0e8-af33-491e-b1d1-31079d90c656\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") "
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.248404    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-run-ovn\") pod \"c4b4e0e8-af33-491e-b1d1-31079d90c656\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") "
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.248418    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-var-lib-cni-networks-ovn-kubernetes\") pod \"c4b4e0e8-af33-491e-b1d1-31079d90c656\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") "
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.248436    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c4b4e0e8-af33-491e-b1d1-31079d90c656-ovnkube-script-lib\") pod \"c4b4e0e8-af33-491e-b1d1-31079d90c656\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") "
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.248449    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-etc-openvswitch\") pod \"c4b4e0e8-af33-491e-b1d1-31079d90c656\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") "
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.248465    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-kubelet\") pod \"c4b4e0e8-af33-491e-b1d1-31079d90c656\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") "
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.248481    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-var-lib-openvswitch\") pod \"c4b4e0e8-af33-491e-b1d1-31079d90c656\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") "
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.248512    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c4b4e0e8-af33-491e-b1d1-31079d90c656-ovnkube-config\") pod \"c4b4e0e8-af33-491e-b1d1-31079d90c656\" (UID: \"c4b4e0e8-af33-491e-b1d1-31079d90c656\") "
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.248679    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "c4b4e0e8-af33-491e-b1d1-31079d90c656" (UID: "c4b4e0e8-af33-491e-b1d1-31079d90c656"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.248802    4730 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.248831    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-slash" (OuterVolumeSpecName: "host-slash") pod "c4b4e0e8-af33-491e-b1d1-31079d90c656" (UID: "c4b4e0e8-af33-491e-b1d1-31079d90c656"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.248854    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "c4b4e0e8-af33-491e-b1d1-31079d90c656" (UID: "c4b4e0e8-af33-491e-b1d1-31079d90c656"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.248873    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "c4b4e0e8-af33-491e-b1d1-31079d90c656" (UID: "c4b4e0e8-af33-491e-b1d1-31079d90c656"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.248892    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "c4b4e0e8-af33-491e-b1d1-31079d90c656" (UID: "c4b4e0e8-af33-491e-b1d1-31079d90c656"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.249120    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4b4e0e8-af33-491e-b1d1-31079d90c656-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "c4b4e0e8-af33-491e-b1d1-31079d90c656" (UID: "c4b4e0e8-af33-491e-b1d1-31079d90c656"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.249159    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "c4b4e0e8-af33-491e-b1d1-31079d90c656" (UID: "c4b4e0e8-af33-491e-b1d1-31079d90c656"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.249179    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-log-socket" (OuterVolumeSpecName: "log-socket") pod "c4b4e0e8-af33-491e-b1d1-31079d90c656" (UID: "c4b4e0e8-af33-491e-b1d1-31079d90c656"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.249204    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "c4b4e0e8-af33-491e-b1d1-31079d90c656" (UID: "c4b4e0e8-af33-491e-b1d1-31079d90c656"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.249194    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "c4b4e0e8-af33-491e-b1d1-31079d90c656" (UID: "c4b4e0e8-af33-491e-b1d1-31079d90c656"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.249221    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-node-log" (OuterVolumeSpecName: "node-log") pod "c4b4e0e8-af33-491e-b1d1-31079d90c656" (UID: "c4b4e0e8-af33-491e-b1d1-31079d90c656"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.249241    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "c4b4e0e8-af33-491e-b1d1-31079d90c656" (UID: "c4b4e0e8-af33-491e-b1d1-31079d90c656"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.249454    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "c4b4e0e8-af33-491e-b1d1-31079d90c656" (UID: "c4b4e0e8-af33-491e-b1d1-31079d90c656"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.249456    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "c4b4e0e8-af33-491e-b1d1-31079d90c656" (UID: "c4b4e0e8-af33-491e-b1d1-31079d90c656"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.249524    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4b4e0e8-af33-491e-b1d1-31079d90c656-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "c4b4e0e8-af33-491e-b1d1-31079d90c656" (UID: "c4b4e0e8-af33-491e-b1d1-31079d90c656"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.249574    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "c4b4e0e8-af33-491e-b1d1-31079d90c656" (UID: "c4b4e0e8-af33-491e-b1d1-31079d90c656"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.249592    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4b4e0e8-af33-491e-b1d1-31079d90c656-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "c4b4e0e8-af33-491e-b1d1-31079d90c656" (UID: "c4b4e0e8-af33-491e-b1d1-31079d90c656"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.254398    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4b4e0e8-af33-491e-b1d1-31079d90c656-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "c4b4e0e8-af33-491e-b1d1-31079d90c656" (UID: "c4b4e0e8-af33-491e-b1d1-31079d90c656"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.254772    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4b4e0e8-af33-491e-b1d1-31079d90c656-kube-api-access-mz64b" (OuterVolumeSpecName: "kube-api-access-mz64b") pod "c4b4e0e8-af33-491e-b1d1-31079d90c656" (UID: "c4b4e0e8-af33-491e-b1d1-31079d90c656"). InnerVolumeSpecName "kube-api-access-mz64b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.267028    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "c4b4e0e8-af33-491e-b1d1-31079d90c656" (UID: "c4b4e0e8-af33-491e-b1d1-31079d90c656"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.282594    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-flvb5"]
Mar 20 15:51:55 crc kubenswrapper[4730]: E0320 15:51:55.282831    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="kube-rbac-proxy-node"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.282847    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="kube-rbac-proxy-node"
Mar 20 15:51:55 crc kubenswrapper[4730]: E0320 15:51:55.282859    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="nbdb"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.282865    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="nbdb"
Mar 20 15:51:55 crc kubenswrapper[4730]: E0320 15:51:55.282873    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="ovnkube-controller"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.282879    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="ovnkube-controller"
Mar 20 15:51:55 crc kubenswrapper[4730]: E0320 15:51:55.282886    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="northd"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.282892    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="northd"
Mar 20 15:51:55 crc kubenswrapper[4730]: E0320 15:51:55.282900    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="ovnkube-controller"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.282906    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="ovnkube-controller"
Mar 20 15:51:55 crc kubenswrapper[4730]: E0320 15:51:55.282916    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="ovnkube-controller"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.282921    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="ovnkube-controller"
Mar 20 15:51:55 crc kubenswrapper[4730]: E0320 15:51:55.282930    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="sbdb"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.282935    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="sbdb"
Mar 20 15:51:55 crc kubenswrapper[4730]: E0320 15:51:55.282943    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="ovnkube-controller"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.282948    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="ovnkube-controller"
Mar 20 15:51:55 crc kubenswrapper[4730]: E0320 15:51:55.282959    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="ovn-controller"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.282964    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="ovn-controller"
Mar 20 15:51:55 crc kubenswrapper[4730]: E0320 15:51:55.282975    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="ovn-acl-logging"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.282980    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="ovn-acl-logging"
Mar 20 15:51:55 crc kubenswrapper[4730]: E0320 15:51:55.282989    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="kubecfg-setup"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.282996    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="kubecfg-setup"
Mar 20 15:51:55 crc kubenswrapper[4730]: E0320 15:51:55.283003    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="kube-rbac-proxy-ovn-metrics"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.283008    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="kube-rbac-proxy-ovn-metrics"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.283110    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="kube-rbac-proxy-ovn-metrics"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.283121    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="ovn-controller"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.283128    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="nbdb"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.283136    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="sbdb"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.283145    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="ovn-acl-logging"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.283152    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="ovnkube-controller"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.283160    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="kube-rbac-proxy-node"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.283169    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="ovnkube-controller"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.283176    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="ovnkube-controller"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.283184    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="northd"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.283191    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="ovnkube-controller"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.283197    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="ovnkube-controller"
Mar 20 15:51:55 crc kubenswrapper[4730]: E0320 15:51:55.284559    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="ovnkube-controller"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.284582    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerName="ovnkube-controller"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.286353    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.349580    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-ovn-node-metrics-cert\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.349632    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-host-kubelet\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.349704    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcbxp\" (UniqueName: \"kubernetes.io/projected/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-kube-api-access-gcbxp\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.349813    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.349870    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-var-lib-openvswitch\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.349891    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-ovnkube-config\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.349910    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-host-run-ovn-kubernetes\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.349939    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-host-cni-bin\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.349981    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-node-log\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.349998    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-env-overrides\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.350012    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-ovnkube-script-lib\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.350028    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-log-socket\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.350075    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-run-systemd\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.350094    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-etc-openvswitch\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.350131    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-host-slash\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.350158    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-run-ovn\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.350177    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-host-cni-netd\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.350197    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-run-openvswitch\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.350213    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-systemd-units\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.350238    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-host-run-netns\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.350319    4730 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-cni-netd\") on node \"crc\" DevicePath \"\""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.350332    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mz64b\" (UniqueName: \"kubernetes.io/projected/c4b4e0e8-af33-491e-b1d1-31079d90c656-kube-api-access-mz64b\") on node \"crc\" DevicePath \"\""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.350344    4730 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-node-log\") on node \"crc\" DevicePath \"\""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.350355    4730 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-run-netns\") on node \"crc\" DevicePath \"\""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.350364    4730 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-systemd-units\") on node \"crc\" DevicePath \"\""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.350371    4730 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-run-systemd\") on node \"crc\" DevicePath \"\""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.350379    4730 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c4b4e0e8-af33-491e-b1d1-31079d90c656-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.350389    4730 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-slash\") on node \"crc\" DevicePath \"\""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.350398    4730 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.350406    4730 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-run-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.350413    4730 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.350421    4730 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c4b4e0e8-af33-491e-b1d1-31079d90c656-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.350429    4730 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.350436    4730 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-kubelet\") on node \"crc\" DevicePath \"\""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.350443    4730 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.350452    4730 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c4b4e0e8-af33-491e-b1d1-31079d90c656-ovnkube-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.350459    4730 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4b4e0e8-af33-491e-b1d1-31079d90c656-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.350466    4730 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-log-socket\") on node \"crc\" DevicePath \"\""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.350474    4730 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c4b4e0e8-af33-491e-b1d1-31079d90c656-host-cni-bin\") on node \"crc\" DevicePath \"\""
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.452128    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-node-log\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.452195    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-env-overrides\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.452234    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-ovnkube-script-lib\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.452341    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-node-log\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.452344    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-log-socket\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.452420    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-log-socket\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.452459    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-run-systemd\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.452433    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-run-systemd\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.452510    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-etc-openvswitch\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.452601    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-host-slash\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.452650    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-run-ovn\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.452687    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-host-cni-netd\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.452724    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-run-openvswitch\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.452753    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-systemd-units\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.452784    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-host-run-netns\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.452844    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-ovn-node-metrics-cert\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.452894    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-host-kubelet\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.452958    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcbxp\" (UniqueName: \"kubernetes.io/projected/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-kube-api-access-gcbxp\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.453005    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.453049    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-var-lib-openvswitch\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.453083    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-ovnkube-config\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.453114    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-host-run-ovn-kubernetes\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.453144    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-host-cni-bin\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.453201    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-env-overrides\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.453237    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-host-cni-bin\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.453297    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-etc-openvswitch\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.453324    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-host-slash\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.453347    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-run-ovn\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.453369    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-host-cni-netd\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.453390    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-run-openvswitch\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.453416    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-systemd-units\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.453432    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-var-lib-openvswitch\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.453487    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.453491    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-ovnkube-script-lib\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.453506    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-host-run-ovn-kubernetes\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.453516    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-host-kubelet\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.453279    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-host-run-netns\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.453924    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-ovnkube-config\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.461179    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-ovn-node-metrics-cert\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.470462    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcbxp\" (UniqueName: \"kubernetes.io/projected/393a72ce-ab43-44d6-a484-dcc1ebe1d48e-kube-api-access-gcbxp\") pod \"ovnkube-node-flvb5\" (UID: \"393a72ce-ab43-44d6-a484-dcc1ebe1d48e\") " pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.600813    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:51:55 crc kubenswrapper[4730]: W0320 15:51:55.627033    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod393a72ce_ab43_44d6_a484_dcc1ebe1d48e.slice/crio-1f466238e3c2440ae3bdf44e2467841d891dda10a0371c505f812e48d302c35f WatchSource:0}: Error finding container 1f466238e3c2440ae3bdf44e2467841d891dda10a0371c505f812e48d302c35f: Status 404 returned error can't find the container with id 1f466238e3c2440ae3bdf44e2467841d891dda10a0371c505f812e48d302c35f
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.880191    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj97f_c4b4e0e8-af33-491e-b1d1-31079d90c656/ovnkube-controller/3.log"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.885335    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj97f_c4b4e0e8-af33-491e-b1d1-31079d90c656/ovn-acl-logging/0.log"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.885885    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qj97f_c4b4e0e8-af33-491e-b1d1-31079d90c656/ovn-controller/0.log"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.886300    4730 generic.go:334] "Generic (PLEG): container finished" podID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerID="35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971" exitCode=0
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.887295    4730 generic.go:334] "Generic (PLEG): container finished" podID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerID="43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f" exitCode=0
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.887444    4730 generic.go:334] "Generic (PLEG): container finished" podID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerID="d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01" exitCode=0
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.887534    4730 generic.go:334] "Generic (PLEG): container finished" podID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerID="462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c" exitCode=0
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.887590    4730 generic.go:334] "Generic (PLEG): container finished" podID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerID="006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d" exitCode=0
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.887643    4730 generic.go:334] "Generic (PLEG): container finished" podID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerID="e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91" exitCode=0
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.887694    4730 generic.go:334] "Generic (PLEG): container finished" podID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerID="b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db" exitCode=143
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.887750    4730 generic.go:334] "Generic (PLEG): container finished" podID="c4b4e0e8-af33-491e-b1d1-31079d90c656" containerID="31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c" exitCode=143
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.886390    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.886346    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" event={"ID":"c4b4e0e8-af33-491e-b1d1-31079d90c656","Type":"ContainerDied","Data":"35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888053    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" event={"ID":"c4b4e0e8-af33-491e-b1d1-31079d90c656","Type":"ContainerDied","Data":"43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888068    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" event={"ID":"c4b4e0e8-af33-491e-b1d1-31079d90c656","Type":"ContainerDied","Data":"d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888072    4730 scope.go:117] "RemoveContainer" containerID="35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888079    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" event={"ID":"c4b4e0e8-af33-491e-b1d1-31079d90c656","Type":"ContainerDied","Data":"462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888156    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" event={"ID":"c4b4e0e8-af33-491e-b1d1-31079d90c656","Type":"ContainerDied","Data":"006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888167    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" event={"ID":"c4b4e0e8-af33-491e-b1d1-31079d90c656","Type":"ContainerDied","Data":"e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888176    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888185    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888192    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888199    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888204    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888209    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888214    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888219    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888224    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888231    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" event={"ID":"c4b4e0e8-af33-491e-b1d1-31079d90c656","Type":"ContainerDied","Data":"b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888239    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888259    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888264    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888269    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888275    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888280    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888285    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888290    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888294    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888299    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888306    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" event={"ID":"c4b4e0e8-af33-491e-b1d1-31079d90c656","Type":"ContainerDied","Data":"31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888314    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888319    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888325    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888330    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888335    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888380    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888387    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888393    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888398    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888404    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888412    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qj97f" event={"ID":"c4b4e0e8-af33-491e-b1d1-31079d90c656","Type":"ContainerDied","Data":"f0bb8a04718d250ff389e424bacc9dc0320526af93827c03eb732b797d1a25fb"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888420    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888426    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888432    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888437    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888442    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888447    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888452    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888457    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888463    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.888468    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.889409    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6r2kn_6f97b1f1-1fad-44ec-8253-17dd6a5eee54/kube-multus/2.log"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.890103    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6r2kn_6f97b1f1-1fad-44ec-8253-17dd6a5eee54/kube-multus/1.log"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.890133    4730 generic.go:334] "Generic (PLEG): container finished" podID="6f97b1f1-1fad-44ec-8253-17dd6a5eee54" containerID="b07ba8437e9756f6cb976900c9db574ebb08c12f74c7cd2c86009c95fccf5b7e" exitCode=2
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.890179    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6r2kn" event={"ID":"6f97b1f1-1fad-44ec-8253-17dd6a5eee54","Type":"ContainerDied","Data":"b07ba8437e9756f6cb976900c9db574ebb08c12f74c7cd2c86009c95fccf5b7e"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.890196    4730 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12ba423ea0fecce8b2416cc8f75f3323980aae80a20ff26bd2f9a6c4cd464812"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.890636    4730 scope.go:117] "RemoveContainer" containerID="b07ba8437e9756f6cb976900c9db574ebb08c12f74c7cd2c86009c95fccf5b7e"
Mar 20 15:51:55 crc kubenswrapper[4730]: E0320 15:51:55.890816    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-6r2kn_openshift-multus(6f97b1f1-1fad-44ec-8253-17dd6a5eee54)\"" pod="openshift-multus/multus-6r2kn" podUID="6f97b1f1-1fad-44ec-8253-17dd6a5eee54"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.892762    4730 generic.go:334] "Generic (PLEG): container finished" podID="393a72ce-ab43-44d6-a484-dcc1ebe1d48e" containerID="6e3523b44c23ee77011cede2dd0b960723c6800426b70b5ad879d54360b10210" exitCode=0
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.892796    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-flvb5" event={"ID":"393a72ce-ab43-44d6-a484-dcc1ebe1d48e","Type":"ContainerDied","Data":"6e3523b44c23ee77011cede2dd0b960723c6800426b70b5ad879d54360b10210"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.892818    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-flvb5" event={"ID":"393a72ce-ab43-44d6-a484-dcc1ebe1d48e","Type":"ContainerStarted","Data":"1f466238e3c2440ae3bdf44e2467841d891dda10a0371c505f812e48d302c35f"}
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.904121    4730 scope.go:117] "RemoveContainer" containerID="7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.924451    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qj97f"]
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.930745    4730 scope.go:117] "RemoveContainer" containerID="43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.934420    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qj97f"]
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.955279    4730 scope.go:117] "RemoveContainer" containerID="d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.969999    4730 scope.go:117] "RemoveContainer" containerID="462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.984492    4730 scope.go:117] "RemoveContainer" containerID="006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d"
Mar 20 15:51:55 crc kubenswrapper[4730]: I0320 15:51:55.995627    4730 scope.go:117] "RemoveContainer" containerID="e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.033220    4730 scope.go:117] "RemoveContainer" containerID="b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.046617    4730 scope.go:117] "RemoveContainer" containerID="31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.073161    4730 scope.go:117] "RemoveContainer" containerID="b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.088887    4730 scope.go:117] "RemoveContainer" containerID="35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971"
Mar 20 15:51:56 crc kubenswrapper[4730]: E0320 15:51:56.089397    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971\": container with ID starting with 35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971 not found: ID does not exist" containerID="35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.089427    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971"} err="failed to get container status \"35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971\": rpc error: code = NotFound desc = could not find container \"35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971\": container with ID starting with 35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971 not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.089449    4730 scope.go:117] "RemoveContainer" containerID="7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed"
Mar 20 15:51:56 crc kubenswrapper[4730]: E0320 15:51:56.089778    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed\": container with ID starting with 7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed not found: ID does not exist" containerID="7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.089800    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed"} err="failed to get container status \"7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed\": rpc error: code = NotFound desc = could not find container \"7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed\": container with ID starting with 7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.089813    4730 scope.go:117] "RemoveContainer" containerID="43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f"
Mar 20 15:51:56 crc kubenswrapper[4730]: E0320 15:51:56.090080    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f\": container with ID starting with 43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f not found: ID does not exist" containerID="43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.090101    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f"} err="failed to get container status \"43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f\": rpc error: code = NotFound desc = could not find container \"43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f\": container with ID starting with 43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.090115    4730 scope.go:117] "RemoveContainer" containerID="d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01"
Mar 20 15:51:56 crc kubenswrapper[4730]: E0320 15:51:56.090579    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01\": container with ID starting with d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01 not found: ID does not exist" containerID="d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.090636    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01"} err="failed to get container status \"d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01\": rpc error: code = NotFound desc = could not find container \"d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01\": container with ID starting with d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01 not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.090670    4730 scope.go:117] "RemoveContainer" containerID="462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c"
Mar 20 15:51:56 crc kubenswrapper[4730]: E0320 15:51:56.091032    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c\": container with ID starting with 462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c not found: ID does not exist" containerID="462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.091057    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c"} err="failed to get container status \"462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c\": rpc error: code = NotFound desc = could not find container \"462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c\": container with ID starting with 462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.091072    4730 scope.go:117] "RemoveContainer" containerID="006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d"
Mar 20 15:51:56 crc kubenswrapper[4730]: E0320 15:51:56.091422    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d\": container with ID starting with 006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d not found: ID does not exist" containerID="006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.091608    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d"} err="failed to get container status \"006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d\": rpc error: code = NotFound desc = could not find container \"006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d\": container with ID starting with 006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.091693    4730 scope.go:117] "RemoveContainer" containerID="e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91"
Mar 20 15:51:56 crc kubenswrapper[4730]: E0320 15:51:56.092088    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91\": container with ID starting with e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91 not found: ID does not exist" containerID="e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.092113    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91"} err="failed to get container status \"e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91\": rpc error: code = NotFound desc = could not find container \"e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91\": container with ID starting with e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91 not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.092132    4730 scope.go:117] "RemoveContainer" containerID="b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db"
Mar 20 15:51:56 crc kubenswrapper[4730]: E0320 15:51:56.092501    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db\": container with ID starting with b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db not found: ID does not exist" containerID="b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.092595    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db"} err="failed to get container status \"b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db\": rpc error: code = NotFound desc = could not find container \"b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db\": container with ID starting with b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.092657    4730 scope.go:117] "RemoveContainer" containerID="31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c"
Mar 20 15:51:56 crc kubenswrapper[4730]: E0320 15:51:56.092995    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c\": container with ID starting with 31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c not found: ID does not exist" containerID="31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.093021    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c"} err="failed to get container status \"31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c\": rpc error: code = NotFound desc = could not find container \"31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c\": container with ID starting with 31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.093036    4730 scope.go:117] "RemoveContainer" containerID="b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e"
Mar 20 15:51:56 crc kubenswrapper[4730]: E0320 15:51:56.093294    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\": container with ID starting with b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e not found: ID does not exist" containerID="b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.093323    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e"} err="failed to get container status \"b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\": rpc error: code = NotFound desc = could not find container \"b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\": container with ID starting with b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.093343    4730 scope.go:117] "RemoveContainer" containerID="35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.093600    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971"} err="failed to get container status \"35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971\": rpc error: code = NotFound desc = could not find container \"35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971\": container with ID starting with 35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971 not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.093630    4730 scope.go:117] "RemoveContainer" containerID="7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.093963    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed"} err="failed to get container status \"7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed\": rpc error: code = NotFound desc = could not find container \"7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed\": container with ID starting with 7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.094046    4730 scope.go:117] "RemoveContainer" containerID="43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.094519    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f"} err="failed to get container status \"43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f\": rpc error: code = NotFound desc = could not find container \"43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f\": container with ID starting with 43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.094542    4730 scope.go:117] "RemoveContainer" containerID="d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.094964    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01"} err="failed to get container status \"d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01\": rpc error: code = NotFound desc = could not find container \"d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01\": container with ID starting with d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01 not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.094986    4730 scope.go:117] "RemoveContainer" containerID="462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.095200    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c"} err="failed to get container status \"462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c\": rpc error: code = NotFound desc = could not find container \"462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c\": container with ID starting with 462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.095226    4730 scope.go:117] "RemoveContainer" containerID="006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.095520    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d"} err="failed to get container status \"006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d\": rpc error: code = NotFound desc = could not find container \"006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d\": container with ID starting with 006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.095540    4730 scope.go:117] "RemoveContainer" containerID="e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.095882    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91"} err="failed to get container status \"e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91\": rpc error: code = NotFound desc = could not find container \"e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91\": container with ID starting with e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91 not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.095926    4730 scope.go:117] "RemoveContainer" containerID="b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.096191    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db"} err="failed to get container status \"b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db\": rpc error: code = NotFound desc = could not find container \"b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db\": container with ID starting with b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.096216    4730 scope.go:117] "RemoveContainer" containerID="31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.096479    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c"} err="failed to get container status \"31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c\": rpc error: code = NotFound desc = could not find container \"31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c\": container with ID starting with 31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.096500    4730 scope.go:117] "RemoveContainer" containerID="b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.096731    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e"} err="failed to get container status \"b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\": rpc error: code = NotFound desc = could not find container \"b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\": container with ID starting with b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.096757    4730 scope.go:117] "RemoveContainer" containerID="35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.096980    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971"} err="failed to get container status \"35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971\": rpc error: code = NotFound desc = could not find container \"35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971\": container with ID starting with 35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971 not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.097000    4730 scope.go:117] "RemoveContainer" containerID="7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.097268    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed"} err="failed to get container status \"7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed\": rpc error: code = NotFound desc = could not find container \"7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed\": container with ID starting with 7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.097306    4730 scope.go:117] "RemoveContainer" containerID="43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.097641    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f"} err="failed to get container status \"43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f\": rpc error: code = NotFound desc = could not find container \"43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f\": container with ID starting with 43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.097661    4730 scope.go:117] "RemoveContainer" containerID="d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.097865    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01"} err="failed to get container status \"d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01\": rpc error: code = NotFound desc = could not find container \"d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01\": container with ID starting with d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01 not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.097890    4730 scope.go:117] "RemoveContainer" containerID="462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.098198    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c"} err="failed to get container status \"462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c\": rpc error: code = NotFound desc = could not find container \"462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c\": container with ID starting with 462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.098241    4730 scope.go:117] "RemoveContainer" containerID="006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.098612    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d"} err="failed to get container status \"006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d\": rpc error: code = NotFound desc = could not find container \"006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d\": container with ID starting with 006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.098713    4730 scope.go:117] "RemoveContainer" containerID="e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.099120    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91"} err="failed to get container status \"e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91\": rpc error: code = NotFound desc = could not find container \"e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91\": container with ID starting with e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91 not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.099151    4730 scope.go:117] "RemoveContainer" containerID="b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.099388    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db"} err="failed to get container status \"b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db\": rpc error: code = NotFound desc = could not find container \"b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db\": container with ID starting with b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.099458    4730 scope.go:117] "RemoveContainer" containerID="31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.099721    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c"} err="failed to get container status \"31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c\": rpc error: code = NotFound desc = could not find container \"31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c\": container with ID starting with 31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.099747    4730 scope.go:117] "RemoveContainer" containerID="b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.099958    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e"} err="failed to get container status \"b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\": rpc error: code = NotFound desc = could not find container \"b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\": container with ID starting with b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.100033    4730 scope.go:117] "RemoveContainer" containerID="35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.100636    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971"} err="failed to get container status \"35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971\": rpc error: code = NotFound desc = could not find container \"35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971\": container with ID starting with 35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971 not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.100663    4730 scope.go:117] "RemoveContainer" containerID="7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.100898    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed"} err="failed to get container status \"7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed\": rpc error: code = NotFound desc = could not find container \"7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed\": container with ID starting with 7263ba407f8e4372a1f43899fa38bbfefcf8ad7bf2f07d5e4fc6f145dc922aed not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.100967    4730 scope.go:117] "RemoveContainer" containerID="43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.101261    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f"} err="failed to get container status \"43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f\": rpc error: code = NotFound desc = could not find container \"43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f\": container with ID starting with 43c3986f6340071e63882af0ddb383761d6c4fda45c266b60036c3c9416c9b2f not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.101283    4730 scope.go:117] "RemoveContainer" containerID="d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.101570    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01"} err="failed to get container status \"d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01\": rpc error: code = NotFound desc = could not find container \"d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01\": container with ID starting with d938120ad60db79c12e709e5133b6705b2f730747fecb69de814b1b6aaf73c01 not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.101642    4730 scope.go:117] "RemoveContainer" containerID="462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.102034    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c"} err="failed to get container status \"462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c\": rpc error: code = NotFound desc = could not find container \"462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c\": container with ID starting with 462d6624fd06581ef72076b52be20160973f59b42767b150892bebc10ce3417c not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.102055    4730 scope.go:117] "RemoveContainer" containerID="006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.102325    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d"} err="failed to get container status \"006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d\": rpc error: code = NotFound desc = could not find container \"006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d\": container with ID starting with 006f9193e64005554c87bfbbc1a18a400222f98aaade64f97c6f5d73a168c19d not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.102368    4730 scope.go:117] "RemoveContainer" containerID="e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.102659    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91"} err="failed to get container status \"e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91\": rpc error: code = NotFound desc = could not find container \"e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91\": container with ID starting with e0cabf222a34e7bb1792d1523467402505bc7c98ce1d9d8dc436e30355438a91 not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.102696    4730 scope.go:117] "RemoveContainer" containerID="b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.102900    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db"} err="failed to get container status \"b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db\": rpc error: code = NotFound desc = could not find container \"b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db\": container with ID starting with b4c48a7668275e645faa69d28f9a51bf63ee0491445dc055e75b697d764de7db not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.102919    4730 scope.go:117] "RemoveContainer" containerID="31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.103100    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c"} err="failed to get container status \"31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c\": rpc error: code = NotFound desc = could not find container \"31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c\": container with ID starting with 31218fb8030abf13b865fafaf7dcfb8b6f589bbc6286dc181a90d1b106493f8c not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.103123    4730 scope.go:117] "RemoveContainer" containerID="b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.103332    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e"} err="failed to get container status \"b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\": rpc error: code = NotFound desc = could not find container \"b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e\": container with ID starting with b4f5c77e05a32be063074d70c18bc46f4c2029b85d34df11fb18c01efc962a4e not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.103352    4730 scope.go:117] "RemoveContainer" containerID="35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.103540    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971"} err="failed to get container status \"35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971\": rpc error: code = NotFound desc = could not find container \"35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971\": container with ID starting with 35f50ce915b43165e56de90b2b1949244c29bf63fd85bb0583a8a8a6988d8971 not found: ID does not exist"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.188180    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-qcz52"
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.904019    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-flvb5" event={"ID":"393a72ce-ab43-44d6-a484-dcc1ebe1d48e","Type":"ContainerStarted","Data":"4f347eecb49e07026cb50a573ae9c5d4c53f7a4603237c65d54a0fdfd4858a44"}
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.904069    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-flvb5" event={"ID":"393a72ce-ab43-44d6-a484-dcc1ebe1d48e","Type":"ContainerStarted","Data":"2ab08c4edbe9028a68f66af0a72993b98c675bde8ba4640f4a0b2b282b102670"}
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.904082    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-flvb5" event={"ID":"393a72ce-ab43-44d6-a484-dcc1ebe1d48e","Type":"ContainerStarted","Data":"97d950c2bd1fbbd5113ee9e71caf8d3c07baa224f6adba510779486093e3446e"}
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.904097    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-flvb5" event={"ID":"393a72ce-ab43-44d6-a484-dcc1ebe1d48e","Type":"ContainerStarted","Data":"768a32271ccd8aa0722cd68031532fb2b37937e35174a39bc4adae7f42ac2791"}
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.904112    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-flvb5" event={"ID":"393a72ce-ab43-44d6-a484-dcc1ebe1d48e","Type":"ContainerStarted","Data":"477093231ce6aa0e8da323f4ec33551f38f030880e24fb80c9ebe379d61ac84e"}
Mar 20 15:51:56 crc kubenswrapper[4730]: I0320 15:51:56.904126    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-flvb5" event={"ID":"393a72ce-ab43-44d6-a484-dcc1ebe1d48e","Type":"ContainerStarted","Data":"3ac79110b694027c697e932f2d307bb2538e95dacd292559442a53d8b8abbf9e"}
Mar 20 15:51:57 crc kubenswrapper[4730]: I0320 15:51:57.541354    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4b4e0e8-af33-491e-b1d1-31079d90c656" path="/var/lib/kubelet/pods/c4b4e0e8-af33-491e-b1d1-31079d90c656/volumes"
Mar 20 15:51:58 crc kubenswrapper[4730]: I0320 15:51:58.921869    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-flvb5" event={"ID":"393a72ce-ab43-44d6-a484-dcc1ebe1d48e","Type":"ContainerStarted","Data":"721c4967a7e85d772035ccd640b4f1e1162cf4dfc1a938e8efa29daa3dfb1191"}
Mar 20 15:52:00 crc kubenswrapper[4730]: I0320 15:52:00.131235    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567032-mfvhl"]
Mar 20 15:52:00 crc kubenswrapper[4730]: I0320 15:52:00.132967    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567032-mfvhl"
Mar 20 15:52:00 crc kubenswrapper[4730]: I0320 15:52:00.135721    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 15:52:00 crc kubenswrapper[4730]: I0320 15:52:00.135834    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 15:52:00 crc kubenswrapper[4730]: I0320 15:52:00.136113    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 15:52:00 crc kubenswrapper[4730]: I0320 15:52:00.223578    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7twt2\" (UniqueName: \"kubernetes.io/projected/76deb34d-7c3d-4510-9b0a-ac56dcca047a-kube-api-access-7twt2\") pod \"auto-csr-approver-29567032-mfvhl\" (UID: \"76deb34d-7c3d-4510-9b0a-ac56dcca047a\") " pod="openshift-infra/auto-csr-approver-29567032-mfvhl"
Mar 20 15:52:00 crc kubenswrapper[4730]: I0320 15:52:00.324798    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7twt2\" (UniqueName: \"kubernetes.io/projected/76deb34d-7c3d-4510-9b0a-ac56dcca047a-kube-api-access-7twt2\") pod \"auto-csr-approver-29567032-mfvhl\" (UID: \"76deb34d-7c3d-4510-9b0a-ac56dcca047a\") " pod="openshift-infra/auto-csr-approver-29567032-mfvhl"
Mar 20 15:52:00 crc kubenswrapper[4730]: I0320 15:52:00.346312    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7twt2\" (UniqueName: \"kubernetes.io/projected/76deb34d-7c3d-4510-9b0a-ac56dcca047a-kube-api-access-7twt2\") pod \"auto-csr-approver-29567032-mfvhl\" (UID: \"76deb34d-7c3d-4510-9b0a-ac56dcca047a\") " pod="openshift-infra/auto-csr-approver-29567032-mfvhl"
Mar 20 15:52:00 crc kubenswrapper[4730]: I0320 15:52:00.446439    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567032-mfvhl"
Mar 20 15:52:00 crc kubenswrapper[4730]: E0320 15:52:00.474328    4730 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29567032-mfvhl_openshift-infra_76deb34d-7c3d-4510-9b0a-ac56dcca047a_0(1e43ef10257d0b2bec80ba1eda076867fcaea9dd7cdf1804d016930de389ed6d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 15:52:00 crc kubenswrapper[4730]: E0320 15:52:00.474718    4730 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29567032-mfvhl_openshift-infra_76deb34d-7c3d-4510-9b0a-ac56dcca047a_0(1e43ef10257d0b2bec80ba1eda076867fcaea9dd7cdf1804d016930de389ed6d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29567032-mfvhl"
Mar 20 15:52:00 crc kubenswrapper[4730]: E0320 15:52:00.474751    4730 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29567032-mfvhl_openshift-infra_76deb34d-7c3d-4510-9b0a-ac56dcca047a_0(1e43ef10257d0b2bec80ba1eda076867fcaea9dd7cdf1804d016930de389ed6d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29567032-mfvhl"
Mar 20 15:52:00 crc kubenswrapper[4730]: E0320 15:52:00.474834    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29567032-mfvhl_openshift-infra(76deb34d-7c3d-4510-9b0a-ac56dcca047a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29567032-mfvhl_openshift-infra(76deb34d-7c3d-4510-9b0a-ac56dcca047a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29567032-mfvhl_openshift-infra_76deb34d-7c3d-4510-9b0a-ac56dcca047a_0(1e43ef10257d0b2bec80ba1eda076867fcaea9dd7cdf1804d016930de389ed6d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29567032-mfvhl" podUID="76deb34d-7c3d-4510-9b0a-ac56dcca047a"
Mar 20 15:52:01 crc kubenswrapper[4730]: I0320 15:52:01.828012    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567032-mfvhl"]
Mar 20 15:52:01 crc kubenswrapper[4730]: I0320 15:52:01.828126    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567032-mfvhl"
Mar 20 15:52:01 crc kubenswrapper[4730]: I0320 15:52:01.828496    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567032-mfvhl"
Mar 20 15:52:01 crc kubenswrapper[4730]: E0320 15:52:01.875863    4730 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29567032-mfvhl_openshift-infra_76deb34d-7c3d-4510-9b0a-ac56dcca047a_0(ff8e835d53c597d8a6f74ecb7d426d47a474bfb647a1d54cdda58e584ec73301): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 15:52:01 crc kubenswrapper[4730]: E0320 15:52:01.875925    4730 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29567032-mfvhl_openshift-infra_76deb34d-7c3d-4510-9b0a-ac56dcca047a_0(ff8e835d53c597d8a6f74ecb7d426d47a474bfb647a1d54cdda58e584ec73301): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29567032-mfvhl"
Mar 20 15:52:01 crc kubenswrapper[4730]: E0320 15:52:01.875946    4730 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29567032-mfvhl_openshift-infra_76deb34d-7c3d-4510-9b0a-ac56dcca047a_0(ff8e835d53c597d8a6f74ecb7d426d47a474bfb647a1d54cdda58e584ec73301): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29567032-mfvhl"
Mar 20 15:52:01 crc kubenswrapper[4730]: E0320 15:52:01.875989    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29567032-mfvhl_openshift-infra(76deb34d-7c3d-4510-9b0a-ac56dcca047a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29567032-mfvhl_openshift-infra(76deb34d-7c3d-4510-9b0a-ac56dcca047a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29567032-mfvhl_openshift-infra_76deb34d-7c3d-4510-9b0a-ac56dcca047a_0(ff8e835d53c597d8a6f74ecb7d426d47a474bfb647a1d54cdda58e584ec73301): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29567032-mfvhl" podUID="76deb34d-7c3d-4510-9b0a-ac56dcca047a"
Mar 20 15:52:01 crc kubenswrapper[4730]: I0320 15:52:01.940875    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-flvb5" event={"ID":"393a72ce-ab43-44d6-a484-dcc1ebe1d48e","Type":"ContainerStarted","Data":"4e4212c062016e6ed95d31e2a1a1ed5789ecd5f6e3b2eaf969b29780913865ae"}
Mar 20 15:52:01 crc kubenswrapper[4730]: I0320 15:52:01.941101    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:52:01 crc kubenswrapper[4730]: I0320 15:52:01.941273    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:52:01 crc kubenswrapper[4730]: I0320 15:52:01.941318    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:52:01 crc kubenswrapper[4730]: I0320 15:52:01.966574    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:52:01 crc kubenswrapper[4730]: I0320 15:52:01.968783    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:52:01 crc kubenswrapper[4730]: I0320 15:52:01.976818    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-flvb5" podStartSLOduration=6.976800493 podStartE2EDuration="6.976800493s" podCreationTimestamp="2026-03-20 15:51:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:52:01.971713519 +0000 UTC m=+781.185084898" watchObservedRunningTime="2026-03-20 15:52:01.976800493 +0000 UTC m=+781.190171872"
Mar 20 15:52:10 crc kubenswrapper[4730]: I0320 15:52:10.533580    4730 scope.go:117] "RemoveContainer" containerID="b07ba8437e9756f6cb976900c9db574ebb08c12f74c7cd2c86009c95fccf5b7e"
Mar 20 15:52:10 crc kubenswrapper[4730]: E0320 15:52:10.534773    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-6r2kn_openshift-multus(6f97b1f1-1fad-44ec-8253-17dd6a5eee54)\"" pod="openshift-multus/multus-6r2kn" podUID="6f97b1f1-1fad-44ec-8253-17dd6a5eee54"
Mar 20 15:52:14 crc kubenswrapper[4730]: I0320 15:52:14.533218    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567032-mfvhl"
Mar 20 15:52:14 crc kubenswrapper[4730]: I0320 15:52:14.534339    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567032-mfvhl"
Mar 20 15:52:14 crc kubenswrapper[4730]: E0320 15:52:14.569918    4730 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29567032-mfvhl_openshift-infra_76deb34d-7c3d-4510-9b0a-ac56dcca047a_0(82a14b4a8cd3a44a3643adb049ebc35ea750447c7fd00fbcf7de17d6022499dc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 15:52:14 crc kubenswrapper[4730]: E0320 15:52:14.570010    4730 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29567032-mfvhl_openshift-infra_76deb34d-7c3d-4510-9b0a-ac56dcca047a_0(82a14b4a8cd3a44a3643adb049ebc35ea750447c7fd00fbcf7de17d6022499dc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29567032-mfvhl"
Mar 20 15:52:14 crc kubenswrapper[4730]: E0320 15:52:14.570047    4730 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29567032-mfvhl_openshift-infra_76deb34d-7c3d-4510-9b0a-ac56dcca047a_0(82a14b4a8cd3a44a3643adb049ebc35ea750447c7fd00fbcf7de17d6022499dc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29567032-mfvhl"
Mar 20 15:52:14 crc kubenswrapper[4730]: E0320 15:52:14.570122    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29567032-mfvhl_openshift-infra(76deb34d-7c3d-4510-9b0a-ac56dcca047a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29567032-mfvhl_openshift-infra(76deb34d-7c3d-4510-9b0a-ac56dcca047a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29567032-mfvhl_openshift-infra_76deb34d-7c3d-4510-9b0a-ac56dcca047a_0(82a14b4a8cd3a44a3643adb049ebc35ea750447c7fd00fbcf7de17d6022499dc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29567032-mfvhl" podUID="76deb34d-7c3d-4510-9b0a-ac56dcca047a"
Mar 20 15:52:22 crc kubenswrapper[4730]: I0320 15:52:22.005937    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv"]
Mar 20 15:52:22 crc kubenswrapper[4730]: I0320 15:52:22.009336    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv"
Mar 20 15:52:22 crc kubenswrapper[4730]: I0320 15:52:22.012394    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 20 15:52:22 crc kubenswrapper[4730]: I0320 15:52:22.019600    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv"]
Mar 20 15:52:22 crc kubenswrapper[4730]: I0320 15:52:22.039323    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25d50abe-8eeb-4761-83b7-d9e7fbb78a76-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv\" (UID: \"25d50abe-8eeb-4761-83b7-d9e7fbb78a76\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv"
Mar 20 15:52:22 crc kubenswrapper[4730]: I0320 15:52:22.039388    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25d50abe-8eeb-4761-83b7-d9e7fbb78a76-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv\" (UID: \"25d50abe-8eeb-4761-83b7-d9e7fbb78a76\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv"
Mar 20 15:52:22 crc kubenswrapper[4730]: I0320 15:52:22.039710    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zls7g\" (UniqueName: \"kubernetes.io/projected/25d50abe-8eeb-4761-83b7-d9e7fbb78a76-kube-api-access-zls7g\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv\" (UID: \"25d50abe-8eeb-4761-83b7-d9e7fbb78a76\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv"
Mar 20 15:52:22 crc kubenswrapper[4730]: I0320 15:52:22.140180    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25d50abe-8eeb-4761-83b7-d9e7fbb78a76-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv\" (UID: \"25d50abe-8eeb-4761-83b7-d9e7fbb78a76\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv"
Mar 20 15:52:22 crc kubenswrapper[4730]: I0320 15:52:22.140301    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zls7g\" (UniqueName: \"kubernetes.io/projected/25d50abe-8eeb-4761-83b7-d9e7fbb78a76-kube-api-access-zls7g\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv\" (UID: \"25d50abe-8eeb-4761-83b7-d9e7fbb78a76\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv"
Mar 20 15:52:22 crc kubenswrapper[4730]: I0320 15:52:22.140345    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25d50abe-8eeb-4761-83b7-d9e7fbb78a76-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv\" (UID: \"25d50abe-8eeb-4761-83b7-d9e7fbb78a76\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv"
Mar 20 15:52:22 crc kubenswrapper[4730]: I0320 15:52:22.140794    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25d50abe-8eeb-4761-83b7-d9e7fbb78a76-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv\" (UID: \"25d50abe-8eeb-4761-83b7-d9e7fbb78a76\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv"
Mar 20 15:52:22 crc kubenswrapper[4730]: I0320 15:52:22.140808    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25d50abe-8eeb-4761-83b7-d9e7fbb78a76-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv\" (UID: \"25d50abe-8eeb-4761-83b7-d9e7fbb78a76\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv"
Mar 20 15:52:22 crc kubenswrapper[4730]: I0320 15:52:22.157746    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zls7g\" (UniqueName: \"kubernetes.io/projected/25d50abe-8eeb-4761-83b7-d9e7fbb78a76-kube-api-access-zls7g\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv\" (UID: \"25d50abe-8eeb-4761-83b7-d9e7fbb78a76\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv"
Mar 20 15:52:22 crc kubenswrapper[4730]: I0320 15:52:22.333429    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv"
Mar 20 15:52:22 crc kubenswrapper[4730]: E0320 15:52:22.366721    4730 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv_openshift-marketplace_25d50abe-8eeb-4761-83b7-d9e7fbb78a76_0(d912d4bf6d2f446fec5eb826fae2313378f1d34dfd35d7dacff120a1355be493): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 15:52:22 crc kubenswrapper[4730]: E0320 15:52:22.366799    4730 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv_openshift-marketplace_25d50abe-8eeb-4761-83b7-d9e7fbb78a76_0(d912d4bf6d2f446fec5eb826fae2313378f1d34dfd35d7dacff120a1355be493): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv"
Mar 20 15:52:22 crc kubenswrapper[4730]: E0320 15:52:22.366828    4730 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv_openshift-marketplace_25d50abe-8eeb-4761-83b7-d9e7fbb78a76_0(d912d4bf6d2f446fec5eb826fae2313378f1d34dfd35d7dacff120a1355be493): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv"
Mar 20 15:52:22 crc kubenswrapper[4730]: E0320 15:52:22.366898    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv_openshift-marketplace(25d50abe-8eeb-4761-83b7-d9e7fbb78a76)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv_openshift-marketplace(25d50abe-8eeb-4761-83b7-d9e7fbb78a76)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv_openshift-marketplace_25d50abe-8eeb-4761-83b7-d9e7fbb78a76_0(d912d4bf6d2f446fec5eb826fae2313378f1d34dfd35d7dacff120a1355be493): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv" podUID="25d50abe-8eeb-4761-83b7-d9e7fbb78a76"
Mar 20 15:52:22 crc kubenswrapper[4730]: I0320 15:52:22.533725    4730 scope.go:117] "RemoveContainer" containerID="b07ba8437e9756f6cb976900c9db574ebb08c12f74c7cd2c86009c95fccf5b7e"
Mar 20 15:52:23 crc kubenswrapper[4730]: I0320 15:52:23.090154    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6r2kn_6f97b1f1-1fad-44ec-8253-17dd6a5eee54/kube-multus/2.log"
Mar 20 15:52:23 crc kubenswrapper[4730]: I0320 15:52:23.091066    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6r2kn_6f97b1f1-1fad-44ec-8253-17dd6a5eee54/kube-multus/1.log"
Mar 20 15:52:23 crc kubenswrapper[4730]: I0320 15:52:23.091184    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv"
Mar 20 15:52:23 crc kubenswrapper[4730]: I0320 15:52:23.091190    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6r2kn" event={"ID":"6f97b1f1-1fad-44ec-8253-17dd6a5eee54","Type":"ContainerStarted","Data":"242e721c0e4a2c44f93a6e9eb81955d21f775c8c5592f6a79b8fff79bb41b348"}
Mar 20 15:52:23 crc kubenswrapper[4730]: I0320 15:52:23.091708    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv"
Mar 20 15:52:23 crc kubenswrapper[4730]: E0320 15:52:23.117509    4730 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv_openshift-marketplace_25d50abe-8eeb-4761-83b7-d9e7fbb78a76_0(7e370178d2cdf7fe20f426bf26f3d266dfb432fd522d96c37432b06f044fd2e5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 15:52:23 crc kubenswrapper[4730]: E0320 15:52:23.117627    4730 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv_openshift-marketplace_25d50abe-8eeb-4761-83b7-d9e7fbb78a76_0(7e370178d2cdf7fe20f426bf26f3d266dfb432fd522d96c37432b06f044fd2e5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv"
Mar 20 15:52:23 crc kubenswrapper[4730]: E0320 15:52:23.117666    4730 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv_openshift-marketplace_25d50abe-8eeb-4761-83b7-d9e7fbb78a76_0(7e370178d2cdf7fe20f426bf26f3d266dfb432fd522d96c37432b06f044fd2e5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv"
Mar 20 15:52:23 crc kubenswrapper[4730]: E0320 15:52:23.117749    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv_openshift-marketplace(25d50abe-8eeb-4761-83b7-d9e7fbb78a76)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv_openshift-marketplace(25d50abe-8eeb-4761-83b7-d9e7fbb78a76)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv_openshift-marketplace_25d50abe-8eeb-4761-83b7-d9e7fbb78a76_0(7e370178d2cdf7fe20f426bf26f3d266dfb432fd522d96c37432b06f044fd2e5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv" podUID="25d50abe-8eeb-4761-83b7-d9e7fbb78a76"
Mar 20 15:52:25 crc kubenswrapper[4730]: I0320 15:52:25.621776    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-flvb5"
Mar 20 15:52:27 crc kubenswrapper[4730]: I0320 15:52:27.532232    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567032-mfvhl"
Mar 20 15:52:27 crc kubenswrapper[4730]: I0320 15:52:27.532664    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567032-mfvhl"
Mar 20 15:52:27 crc kubenswrapper[4730]: I0320 15:52:27.735657    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567032-mfvhl"]
Mar 20 15:52:28 crc kubenswrapper[4730]: I0320 15:52:28.120282    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567032-mfvhl" event={"ID":"76deb34d-7c3d-4510-9b0a-ac56dcca047a","Type":"ContainerStarted","Data":"ee01fb471a06970161acecb5413bd9bdf6dd1361a6f3d424489465099a9f3c85"}
Mar 20 15:52:28 crc kubenswrapper[4730]: I0320 15:52:28.304544    4730 scope.go:117] "RemoveContainer" containerID="12ba423ea0fecce8b2416cc8f75f3323980aae80a20ff26bd2f9a6c4cd464812"
Mar 20 15:52:29 crc kubenswrapper[4730]: I0320 15:52:29.130074    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6r2kn_6f97b1f1-1fad-44ec-8253-17dd6a5eee54/kube-multus/2.log"
Mar 20 15:52:34 crc kubenswrapper[4730]: I0320 15:52:34.163031    4730 generic.go:334] "Generic (PLEG): container finished" podID="76deb34d-7c3d-4510-9b0a-ac56dcca047a" containerID="3950e99a8167c1c32630e01067078d701c75fdf49d8f8666a31a81f7a02ba1d9" exitCode=0
Mar 20 15:52:34 crc kubenswrapper[4730]: I0320 15:52:34.163147    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567032-mfvhl" event={"ID":"76deb34d-7c3d-4510-9b0a-ac56dcca047a","Type":"ContainerDied","Data":"3950e99a8167c1c32630e01067078d701c75fdf49d8f8666a31a81f7a02ba1d9"}
Mar 20 15:52:35 crc kubenswrapper[4730]: I0320 15:52:35.416030    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567032-mfvhl"
Mar 20 15:52:35 crc kubenswrapper[4730]: I0320 15:52:35.514621    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7twt2\" (UniqueName: \"kubernetes.io/projected/76deb34d-7c3d-4510-9b0a-ac56dcca047a-kube-api-access-7twt2\") pod \"76deb34d-7c3d-4510-9b0a-ac56dcca047a\" (UID: \"76deb34d-7c3d-4510-9b0a-ac56dcca047a\") "
Mar 20 15:52:35 crc kubenswrapper[4730]: I0320 15:52:35.521325    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76deb34d-7c3d-4510-9b0a-ac56dcca047a-kube-api-access-7twt2" (OuterVolumeSpecName: "kube-api-access-7twt2") pod "76deb34d-7c3d-4510-9b0a-ac56dcca047a" (UID: "76deb34d-7c3d-4510-9b0a-ac56dcca047a"). InnerVolumeSpecName "kube-api-access-7twt2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:52:35 crc kubenswrapper[4730]: I0320 15:52:35.532601    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv"
Mar 20 15:52:35 crc kubenswrapper[4730]: I0320 15:52:35.533266    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv"
Mar 20 15:52:35 crc kubenswrapper[4730]: I0320 15:52:35.616332    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7twt2\" (UniqueName: \"kubernetes.io/projected/76deb34d-7c3d-4510-9b0a-ac56dcca047a-kube-api-access-7twt2\") on node \"crc\" DevicePath \"\""
Mar 20 15:52:35 crc kubenswrapper[4730]: I0320 15:52:35.956483    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv"]
Mar 20 15:52:35 crc kubenswrapper[4730]: W0320 15:52:35.958061    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25d50abe_8eeb_4761_83b7_d9e7fbb78a76.slice/crio-fdd701b9554689bd678971a8747cd36abc6d9ba91fdad5521ca11546c733c133 WatchSource:0}: Error finding container fdd701b9554689bd678971a8747cd36abc6d9ba91fdad5521ca11546c733c133: Status 404 returned error can't find the container with id fdd701b9554689bd678971a8747cd36abc6d9ba91fdad5521ca11546c733c133
Mar 20 15:52:36 crc kubenswrapper[4730]: I0320 15:52:36.177314    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv" event={"ID":"25d50abe-8eeb-4761-83b7-d9e7fbb78a76","Type":"ContainerStarted","Data":"5da831e15da1bf9d6a82541d2c31ca02cc9597c72f920b0b4ecf41d73364d6da"}
Mar 20 15:52:36 crc kubenswrapper[4730]: I0320 15:52:36.177366    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv" event={"ID":"25d50abe-8eeb-4761-83b7-d9e7fbb78a76","Type":"ContainerStarted","Data":"fdd701b9554689bd678971a8747cd36abc6d9ba91fdad5521ca11546c733c133"}
Mar 20 15:52:36 crc kubenswrapper[4730]: I0320 15:52:36.179103    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567032-mfvhl" event={"ID":"76deb34d-7c3d-4510-9b0a-ac56dcca047a","Type":"ContainerDied","Data":"ee01fb471a06970161acecb5413bd9bdf6dd1361a6f3d424489465099a9f3c85"}
Mar 20 15:52:36 crc kubenswrapper[4730]: I0320 15:52:36.179125    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee01fb471a06970161acecb5413bd9bdf6dd1361a6f3d424489465099a9f3c85"
Mar 20 15:52:36 crc kubenswrapper[4730]: I0320 15:52:36.179177    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567032-mfvhl"
Mar 20 15:52:36 crc kubenswrapper[4730]: I0320 15:52:36.475218    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567026-x7pgl"]
Mar 20 15:52:36 crc kubenswrapper[4730]: I0320 15:52:36.479957    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567026-x7pgl"]
Mar 20 15:52:37 crc kubenswrapper[4730]: I0320 15:52:37.188628    4730 generic.go:334] "Generic (PLEG): container finished" podID="25d50abe-8eeb-4761-83b7-d9e7fbb78a76" containerID="5da831e15da1bf9d6a82541d2c31ca02cc9597c72f920b0b4ecf41d73364d6da" exitCode=0
Mar 20 15:52:37 crc kubenswrapper[4730]: I0320 15:52:37.188693    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv" event={"ID":"25d50abe-8eeb-4761-83b7-d9e7fbb78a76","Type":"ContainerDied","Data":"5da831e15da1bf9d6a82541d2c31ca02cc9597c72f920b0b4ecf41d73364d6da"}
Mar 20 15:52:37 crc kubenswrapper[4730]: I0320 15:52:37.546364    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2acd72a0-988c-4c58-a7b4-c139ee0f6ef1" path="/var/lib/kubelet/pods/2acd72a0-988c-4c58-a7b4-c139ee0f6ef1/volumes"
Mar 20 15:52:39 crc kubenswrapper[4730]: I0320 15:52:39.204576    4730 generic.go:334] "Generic (PLEG): container finished" podID="25d50abe-8eeb-4761-83b7-d9e7fbb78a76" containerID="e2b1cec905a13e9cee0c6111466f9589c7922fe5557b9c7783d948bda532c401" exitCode=0
Mar 20 15:52:39 crc kubenswrapper[4730]: I0320 15:52:39.204628    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv" event={"ID":"25d50abe-8eeb-4761-83b7-d9e7fbb78a76","Type":"ContainerDied","Data":"e2b1cec905a13e9cee0c6111466f9589c7922fe5557b9c7783d948bda532c401"}
Mar 20 15:52:40 crc kubenswrapper[4730]: I0320 15:52:40.214097    4730 generic.go:334] "Generic (PLEG): container finished" podID="25d50abe-8eeb-4761-83b7-d9e7fbb78a76" containerID="ee7c77ecaf8a998b304c004face78076e67a8e4d2d4e920833aeb1d421d33584" exitCode=0
Mar 20 15:52:40 crc kubenswrapper[4730]: I0320 15:52:40.214469    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv" event={"ID":"25d50abe-8eeb-4761-83b7-d9e7fbb78a76","Type":"ContainerDied","Data":"ee7c77ecaf8a998b304c004face78076e67a8e4d2d4e920833aeb1d421d33584"}
Mar 20 15:52:41 crc kubenswrapper[4730]: I0320 15:52:41.500017    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv"
Mar 20 15:52:41 crc kubenswrapper[4730]: I0320 15:52:41.595006    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25d50abe-8eeb-4761-83b7-d9e7fbb78a76-bundle\") pod \"25d50abe-8eeb-4761-83b7-d9e7fbb78a76\" (UID: \"25d50abe-8eeb-4761-83b7-d9e7fbb78a76\") "
Mar 20 15:52:41 crc kubenswrapper[4730]: I0320 15:52:41.595127    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25d50abe-8eeb-4761-83b7-d9e7fbb78a76-util\") pod \"25d50abe-8eeb-4761-83b7-d9e7fbb78a76\" (UID: \"25d50abe-8eeb-4761-83b7-d9e7fbb78a76\") "
Mar 20 15:52:41 crc kubenswrapper[4730]: I0320 15:52:41.595364    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zls7g\" (UniqueName: \"kubernetes.io/projected/25d50abe-8eeb-4761-83b7-d9e7fbb78a76-kube-api-access-zls7g\") pod \"25d50abe-8eeb-4761-83b7-d9e7fbb78a76\" (UID: \"25d50abe-8eeb-4761-83b7-d9e7fbb78a76\") "
Mar 20 15:52:41 crc kubenswrapper[4730]: I0320 15:52:41.598623    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25d50abe-8eeb-4761-83b7-d9e7fbb78a76-bundle" (OuterVolumeSpecName: "bundle") pod "25d50abe-8eeb-4761-83b7-d9e7fbb78a76" (UID: "25d50abe-8eeb-4761-83b7-d9e7fbb78a76"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:52:41 crc kubenswrapper[4730]: I0320 15:52:41.602057    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25d50abe-8eeb-4761-83b7-d9e7fbb78a76-kube-api-access-zls7g" (OuterVolumeSpecName: "kube-api-access-zls7g") pod "25d50abe-8eeb-4761-83b7-d9e7fbb78a76" (UID: "25d50abe-8eeb-4761-83b7-d9e7fbb78a76"). InnerVolumeSpecName "kube-api-access-zls7g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:52:41 crc kubenswrapper[4730]: I0320 15:52:41.697436    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zls7g\" (UniqueName: \"kubernetes.io/projected/25d50abe-8eeb-4761-83b7-d9e7fbb78a76-kube-api-access-zls7g\") on node \"crc\" DevicePath \"\""
Mar 20 15:52:41 crc kubenswrapper[4730]: I0320 15:52:41.697477    4730 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25d50abe-8eeb-4761-83b7-d9e7fbb78a76-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:52:41 crc kubenswrapper[4730]: I0320 15:52:41.951449    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25d50abe-8eeb-4761-83b7-d9e7fbb78a76-util" (OuterVolumeSpecName: "util") pod "25d50abe-8eeb-4761-83b7-d9e7fbb78a76" (UID: "25d50abe-8eeb-4761-83b7-d9e7fbb78a76"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:52:42 crc kubenswrapper[4730]: I0320 15:52:42.001549    4730 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25d50abe-8eeb-4761-83b7-d9e7fbb78a76-util\") on node \"crc\" DevicePath \"\""
Mar 20 15:52:42 crc kubenswrapper[4730]: I0320 15:52:42.230574    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv" event={"ID":"25d50abe-8eeb-4761-83b7-d9e7fbb78a76","Type":"ContainerDied","Data":"fdd701b9554689bd678971a8747cd36abc6d9ba91fdad5521ca11546c733c133"}
Mar 20 15:52:42 crc kubenswrapper[4730]: I0320 15:52:42.230631    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdd701b9554689bd678971a8747cd36abc6d9ba91fdad5521ca11546c733c133"
Mar 20 15:52:42 crc kubenswrapper[4730]: I0320 15:52:42.230756    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv"
Mar 20 15:52:50 crc kubenswrapper[4730]: I0320 15:52:50.819029    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-w67vt"]
Mar 20 15:52:50 crc kubenswrapper[4730]: E0320 15:52:50.819868    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d50abe-8eeb-4761-83b7-d9e7fbb78a76" containerName="pull"
Mar 20 15:52:50 crc kubenswrapper[4730]: I0320 15:52:50.819884    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d50abe-8eeb-4761-83b7-d9e7fbb78a76" containerName="pull"
Mar 20 15:52:50 crc kubenswrapper[4730]: E0320 15:52:50.819897    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d50abe-8eeb-4761-83b7-d9e7fbb78a76" containerName="util"
Mar 20 15:52:50 crc kubenswrapper[4730]: I0320 15:52:50.819905    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d50abe-8eeb-4761-83b7-d9e7fbb78a76" containerName="util"
Mar 20 15:52:50 crc kubenswrapper[4730]: E0320 15:52:50.819929    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76deb34d-7c3d-4510-9b0a-ac56dcca047a" containerName="oc"
Mar 20 15:52:50 crc kubenswrapper[4730]: I0320 15:52:50.819939    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="76deb34d-7c3d-4510-9b0a-ac56dcca047a" containerName="oc"
Mar 20 15:52:50 crc kubenswrapper[4730]: E0320 15:52:50.819949    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d50abe-8eeb-4761-83b7-d9e7fbb78a76" containerName="extract"
Mar 20 15:52:50 crc kubenswrapper[4730]: I0320 15:52:50.819957    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d50abe-8eeb-4761-83b7-d9e7fbb78a76" containerName="extract"
Mar 20 15:52:50 crc kubenswrapper[4730]: I0320 15:52:50.820079    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="25d50abe-8eeb-4761-83b7-d9e7fbb78a76" containerName="extract"
Mar 20 15:52:50 crc kubenswrapper[4730]: I0320 15:52:50.820094    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="76deb34d-7c3d-4510-9b0a-ac56dcca047a" containerName="oc"
Mar 20 15:52:50 crc kubenswrapper[4730]: I0320 15:52:50.820562    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-w67vt"
Mar 20 15:52:50 crc kubenswrapper[4730]: I0320 15:52:50.848021    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-94qns"
Mar 20 15:52:50 crc kubenswrapper[4730]: I0320 15:52:50.848232    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt"
Mar 20 15:52:50 crc kubenswrapper[4730]: I0320 15:52:50.849050    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt"
Mar 20 15:52:50 crc kubenswrapper[4730]: I0320 15:52:50.855752    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-w67vt"]
Mar 20 15:52:50 crc kubenswrapper[4730]: I0320 15:52:50.914214    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fvnp\" (UniqueName: \"kubernetes.io/projected/5db89423-34f0-46c3-9dcf-2179c6c6f42a-kube-api-access-9fvnp\") pod \"obo-prometheus-operator-8ff7d675-w67vt\" (UID: \"5db89423-34f0-46c3-9dcf-2179c6c6f42a\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-w67vt"
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.014962    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fvnp\" (UniqueName: \"kubernetes.io/projected/5db89423-34f0-46c3-9dcf-2179c6c6f42a-kube-api-access-9fvnp\") pod \"obo-prometheus-operator-8ff7d675-w67vt\" (UID: \"5db89423-34f0-46c3-9dcf-2179c6c6f42a\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-w67vt"
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.046136    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fvnp\" (UniqueName: \"kubernetes.io/projected/5db89423-34f0-46c3-9dcf-2179c6c6f42a-kube-api-access-9fvnp\") pod \"obo-prometheus-operator-8ff7d675-w67vt\" (UID: \"5db89423-34f0-46c3-9dcf-2179c6c6f42a\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-w67vt"
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.137894    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-w67vt"
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.241776    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-85c477fd8f-d52bp"]
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.242749    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-85c477fd8f-d52bp"
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.245630    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert"
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.246141    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-msdtd"
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.268300    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-85c477fd8f-d52bp"]
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.274382    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-85c477fd8f-7ttsl"]
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.275236    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-85c477fd8f-7ttsl"
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.283074    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-85c477fd8f-7ttsl"]
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.317805    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c12d2a2b-f7db-41be-89e1-97869c8119c2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-85c477fd8f-d52bp\" (UID: \"c12d2a2b-f7db-41be-89e1-97869c8119c2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-85c477fd8f-d52bp"
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.317926    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c12d2a2b-f7db-41be-89e1-97869c8119c2-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-85c477fd8f-d52bp\" (UID: \"c12d2a2b-f7db-41be-89e1-97869c8119c2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-85c477fd8f-d52bp"
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.409971    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-w67vt"]
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.429849    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c12d2a2b-f7db-41be-89e1-97869c8119c2-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-85c477fd8f-d52bp\" (UID: \"c12d2a2b-f7db-41be-89e1-97869c8119c2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-85c477fd8f-d52bp"
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.429947    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7520ba92-5020-48d1-8d1c-fa20f0f407be-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-85c477fd8f-7ttsl\" (UID: \"7520ba92-5020-48d1-8d1c-fa20f0f407be\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-85c477fd8f-7ttsl"
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.429986    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7520ba92-5020-48d1-8d1c-fa20f0f407be-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-85c477fd8f-7ttsl\" (UID: \"7520ba92-5020-48d1-8d1c-fa20f0f407be\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-85c477fd8f-7ttsl"
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.430027    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c12d2a2b-f7db-41be-89e1-97869c8119c2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-85c477fd8f-d52bp\" (UID: \"c12d2a2b-f7db-41be-89e1-97869c8119c2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-85c477fd8f-d52bp"
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.437237    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c12d2a2b-f7db-41be-89e1-97869c8119c2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-85c477fd8f-d52bp\" (UID: \"c12d2a2b-f7db-41be-89e1-97869c8119c2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-85c477fd8f-d52bp"
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.438284    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c12d2a2b-f7db-41be-89e1-97869c8119c2-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-85c477fd8f-d52bp\" (UID: \"c12d2a2b-f7db-41be-89e1-97869c8119c2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-85c477fd8f-d52bp"
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.531074    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7520ba92-5020-48d1-8d1c-fa20f0f407be-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-85c477fd8f-7ttsl\" (UID: \"7520ba92-5020-48d1-8d1c-fa20f0f407be\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-85c477fd8f-7ttsl"
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.531438    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7520ba92-5020-48d1-8d1c-fa20f0f407be-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-85c477fd8f-7ttsl\" (UID: \"7520ba92-5020-48d1-8d1c-fa20f0f407be\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-85c477fd8f-7ttsl"
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.534464    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7520ba92-5020-48d1-8d1c-fa20f0f407be-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-85c477fd8f-7ttsl\" (UID: \"7520ba92-5020-48d1-8d1c-fa20f0f407be\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-85c477fd8f-7ttsl"
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.536837    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7520ba92-5020-48d1-8d1c-fa20f0f407be-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-85c477fd8f-7ttsl\" (UID: \"7520ba92-5020-48d1-8d1c-fa20f0f407be\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-85c477fd8f-7ttsl"
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.563559    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-nh5dg"]
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.564536    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-nh5dg"
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.568521    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls"
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.568891    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-x7fg9"
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.572503    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-85c477fd8f-d52bp"
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.582214    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-nh5dg"]
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.628505    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-85c477fd8f-7ttsl"
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.633092    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/28a4594d-a811-4533-8d77-40267a80c581-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-nh5dg\" (UID: \"28a4594d-a811-4533-8d77-40267a80c581\") " pod="openshift-operators/observability-operator-6dd7dd855f-nh5dg"
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.634052    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-422mh\" (UniqueName: \"kubernetes.io/projected/28a4594d-a811-4533-8d77-40267a80c581-kube-api-access-422mh\") pod \"observability-operator-6dd7dd855f-nh5dg\" (UID: \"28a4594d-a811-4533-8d77-40267a80c581\") " pod="openshift-operators/observability-operator-6dd7dd855f-nh5dg"
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.735964    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/28a4594d-a811-4533-8d77-40267a80c581-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-nh5dg\" (UID: \"28a4594d-a811-4533-8d77-40267a80c581\") " pod="openshift-operators/observability-operator-6dd7dd855f-nh5dg"
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.736385    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-422mh\" (UniqueName: \"kubernetes.io/projected/28a4594d-a811-4533-8d77-40267a80c581-kube-api-access-422mh\") pod \"observability-operator-6dd7dd855f-nh5dg\" (UID: \"28a4594d-a811-4533-8d77-40267a80c581\") " pod="openshift-operators/observability-operator-6dd7dd855f-nh5dg"
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.740074    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/28a4594d-a811-4533-8d77-40267a80c581-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-nh5dg\" (UID: \"28a4594d-a811-4533-8d77-40267a80c581\") " pod="openshift-operators/observability-operator-6dd7dd855f-nh5dg"
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.755715    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-422mh\" (UniqueName: \"kubernetes.io/projected/28a4594d-a811-4533-8d77-40267a80c581-kube-api-access-422mh\") pod \"observability-operator-6dd7dd855f-nh5dg\" (UID: \"28a4594d-a811-4533-8d77-40267a80c581\") " pod="openshift-operators/observability-operator-6dd7dd855f-nh5dg"
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.861667    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-85c477fd8f-d52bp"]
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.889308    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-nh5dg"
Mar 20 15:52:51 crc kubenswrapper[4730]: I0320 15:52:51.955449    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-85c477fd8f-7ttsl"]
Mar 20 15:52:51 crc kubenswrapper[4730]: W0320 15:52:51.975977    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7520ba92_5020_48d1_8d1c_fa20f0f407be.slice/crio-aae66df47fa8f48155e7be17a32c8324af3e7932ebdd68336912748a089e7057 WatchSource:0}: Error finding container aae66df47fa8f48155e7be17a32c8324af3e7932ebdd68336912748a089e7057: Status 404 returned error can't find the container with id aae66df47fa8f48155e7be17a32c8324af3e7932ebdd68336912748a089e7057
Mar 20 15:52:52 crc kubenswrapper[4730]: I0320 15:52:52.049783    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-6b8b4f7dbd-pmzhq"]
Mar 20 15:52:52 crc kubenswrapper[4730]: I0320 15:52:52.050509    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-6b8b4f7dbd-pmzhq"
Mar 20 15:52:52 crc kubenswrapper[4730]: I0320 15:52:52.055537    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-service-cert"
Mar 20 15:52:52 crc kubenswrapper[4730]: I0320 15:52:52.055676    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-cj57w"
Mar 20 15:52:52 crc kubenswrapper[4730]: I0320 15:52:52.096940    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-6b8b4f7dbd-pmzhq"]
Mar 20 15:52:52 crc kubenswrapper[4730]: I0320 15:52:52.146115    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/50aad4a2-a828-49d9-9bb3-115336081293-apiservice-cert\") pod \"perses-operator-6b8b4f7dbd-pmzhq\" (UID: \"50aad4a2-a828-49d9-9bb3-115336081293\") " pod="openshift-operators/perses-operator-6b8b4f7dbd-pmzhq"
Mar 20 15:52:52 crc kubenswrapper[4730]: I0320 15:52:52.146203    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/50aad4a2-a828-49d9-9bb3-115336081293-openshift-service-ca\") pod \"perses-operator-6b8b4f7dbd-pmzhq\" (UID: \"50aad4a2-a828-49d9-9bb3-115336081293\") " pod="openshift-operators/perses-operator-6b8b4f7dbd-pmzhq"
Mar 20 15:52:52 crc kubenswrapper[4730]: I0320 15:52:52.146280    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/50aad4a2-a828-49d9-9bb3-115336081293-webhook-cert\") pod \"perses-operator-6b8b4f7dbd-pmzhq\" (UID: \"50aad4a2-a828-49d9-9bb3-115336081293\") " pod="openshift-operators/perses-operator-6b8b4f7dbd-pmzhq"
Mar 20 15:52:52 crc kubenswrapper[4730]: I0320 15:52:52.146309    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfpb9\" (UniqueName: \"kubernetes.io/projected/50aad4a2-a828-49d9-9bb3-115336081293-kube-api-access-gfpb9\") pod \"perses-operator-6b8b4f7dbd-pmzhq\" (UID: \"50aad4a2-a828-49d9-9bb3-115336081293\") " pod="openshift-operators/perses-operator-6b8b4f7dbd-pmzhq"
Mar 20 15:52:52 crc kubenswrapper[4730]: I0320 15:52:52.153596    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-nh5dg"]
Mar 20 15:52:52 crc kubenswrapper[4730]: W0320 15:52:52.160654    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28a4594d_a811_4533_8d77_40267a80c581.slice/crio-d7cc550886367b76d3e6989f1295f791d379fae6bc9b1b9d5a180e3b7b8e1af3 WatchSource:0}: Error finding container d7cc550886367b76d3e6989f1295f791d379fae6bc9b1b9d5a180e3b7b8e1af3: Status 404 returned error can't find the container with id d7cc550886367b76d3e6989f1295f791d379fae6bc9b1b9d5a180e3b7b8e1af3
Mar 20 15:52:52 crc kubenswrapper[4730]: I0320 15:52:52.247561    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/50aad4a2-a828-49d9-9bb3-115336081293-apiservice-cert\") pod \"perses-operator-6b8b4f7dbd-pmzhq\" (UID: \"50aad4a2-a828-49d9-9bb3-115336081293\") " pod="openshift-operators/perses-operator-6b8b4f7dbd-pmzhq"
Mar 20 15:52:52 crc kubenswrapper[4730]: I0320 15:52:52.247605    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/50aad4a2-a828-49d9-9bb3-115336081293-openshift-service-ca\") pod \"perses-operator-6b8b4f7dbd-pmzhq\" (UID: \"50aad4a2-a828-49d9-9bb3-115336081293\") " pod="openshift-operators/perses-operator-6b8b4f7dbd-pmzhq"
Mar 20 15:52:52 crc kubenswrapper[4730]: I0320 15:52:52.247637    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/50aad4a2-a828-49d9-9bb3-115336081293-webhook-cert\") pod \"perses-operator-6b8b4f7dbd-pmzhq\" (UID: \"50aad4a2-a828-49d9-9bb3-115336081293\") " pod="openshift-operators/perses-operator-6b8b4f7dbd-pmzhq"
Mar 20 15:52:52 crc kubenswrapper[4730]: I0320 15:52:52.247659    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfpb9\" (UniqueName: \"kubernetes.io/projected/50aad4a2-a828-49d9-9bb3-115336081293-kube-api-access-gfpb9\") pod \"perses-operator-6b8b4f7dbd-pmzhq\" (UID: \"50aad4a2-a828-49d9-9bb3-115336081293\") " pod="openshift-operators/perses-operator-6b8b4f7dbd-pmzhq"
Mar 20 15:52:52 crc kubenswrapper[4730]: I0320 15:52:52.248539    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/50aad4a2-a828-49d9-9bb3-115336081293-openshift-service-ca\") pod \"perses-operator-6b8b4f7dbd-pmzhq\" (UID: \"50aad4a2-a828-49d9-9bb3-115336081293\") " pod="openshift-operators/perses-operator-6b8b4f7dbd-pmzhq"
Mar 20 15:52:52 crc kubenswrapper[4730]: I0320 15:52:52.253392    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/50aad4a2-a828-49d9-9bb3-115336081293-apiservice-cert\") pod \"perses-operator-6b8b4f7dbd-pmzhq\" (UID: \"50aad4a2-a828-49d9-9bb3-115336081293\") " pod="openshift-operators/perses-operator-6b8b4f7dbd-pmzhq"
Mar 20 15:52:52 crc kubenswrapper[4730]: I0320 15:52:52.264010    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/50aad4a2-a828-49d9-9bb3-115336081293-webhook-cert\") pod \"perses-operator-6b8b4f7dbd-pmzhq\" (UID: \"50aad4a2-a828-49d9-9bb3-115336081293\") " pod="openshift-operators/perses-operator-6b8b4f7dbd-pmzhq"
Mar 20 15:52:52 crc kubenswrapper[4730]: I0320 15:52:52.274937    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfpb9\" (UniqueName: \"kubernetes.io/projected/50aad4a2-a828-49d9-9bb3-115336081293-kube-api-access-gfpb9\") pod \"perses-operator-6b8b4f7dbd-pmzhq\" (UID: \"50aad4a2-a828-49d9-9bb3-115336081293\") " pod="openshift-operators/perses-operator-6b8b4f7dbd-pmzhq"
Mar 20 15:52:52 crc kubenswrapper[4730]: I0320 15:52:52.290465    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-nh5dg" event={"ID":"28a4594d-a811-4533-8d77-40267a80c581","Type":"ContainerStarted","Data":"d7cc550886367b76d3e6989f1295f791d379fae6bc9b1b9d5a180e3b7b8e1af3"}
Mar 20 15:52:52 crc kubenswrapper[4730]: I0320 15:52:52.291438    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-w67vt" event={"ID":"5db89423-34f0-46c3-9dcf-2179c6c6f42a","Type":"ContainerStarted","Data":"812c1ec4f28711c2f69e7a6eaa82c5c42f9ab5eefb0d889061a47292630d3101"}
Mar 20 15:52:52 crc kubenswrapper[4730]: I0320 15:52:52.292519    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-85c477fd8f-d52bp" event={"ID":"c12d2a2b-f7db-41be-89e1-97869c8119c2","Type":"ContainerStarted","Data":"2748a7a4f4fc3bb838f537be59e8e63e9fae98e41e8853f3b704c7a55ffc4554"}
Mar 20 15:52:52 crc kubenswrapper[4730]: I0320 15:52:52.293504    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-85c477fd8f-7ttsl" event={"ID":"7520ba92-5020-48d1-8d1c-fa20f0f407be","Type":"ContainerStarted","Data":"aae66df47fa8f48155e7be17a32c8324af3e7932ebdd68336912748a089e7057"}
Mar 20 15:52:52 crc kubenswrapper[4730]: I0320 15:52:52.395232    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-6b8b4f7dbd-pmzhq"
Mar 20 15:52:52 crc kubenswrapper[4730]: I0320 15:52:52.582108    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-6b8b4f7dbd-pmzhq"]
Mar 20 15:52:52 crc kubenswrapper[4730]: W0320 15:52:52.588893    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50aad4a2_a828_49d9_9bb3_115336081293.slice/crio-ae6611d5cf24af0aa2c26d567d53fdc5b2d3e9c0ed92191af4ac29cf066b840f WatchSource:0}: Error finding container ae6611d5cf24af0aa2c26d567d53fdc5b2d3e9c0ed92191af4ac29cf066b840f: Status 404 returned error can't find the container with id ae6611d5cf24af0aa2c26d567d53fdc5b2d3e9c0ed92191af4ac29cf066b840f
Mar 20 15:52:53 crc kubenswrapper[4730]: I0320 15:52:53.185970    4730 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 20 15:52:53 crc kubenswrapper[4730]: I0320 15:52:53.300121    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-6b8b4f7dbd-pmzhq" event={"ID":"50aad4a2-a828-49d9-9bb3-115336081293","Type":"ContainerStarted","Data":"ae6611d5cf24af0aa2c26d567d53fdc5b2d3e9c0ed92191af4ac29cf066b840f"}
Mar 20 15:53:03 crc kubenswrapper[4730]: I0320 15:53:03.414602    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-nh5dg" event={"ID":"28a4594d-a811-4533-8d77-40267a80c581","Type":"ContainerStarted","Data":"54eda06c2dd0c542ff66769d245e46a214a066c5fb29a8837847261e5595fced"}
Mar 20 15:53:03 crc kubenswrapper[4730]: I0320 15:53:03.415119    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-6dd7dd855f-nh5dg"
Mar 20 15:53:03 crc kubenswrapper[4730]: I0320 15:53:03.418290    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-6b8b4f7dbd-pmzhq" event={"ID":"50aad4a2-a828-49d9-9bb3-115336081293","Type":"ContainerStarted","Data":"756db39c91a41f4d6b97f77ab6b32f7be09b7bff7ffc8c60c7ca44b3aaf2e42d"}
Mar 20 15:53:03 crc kubenswrapper[4730]: I0320 15:53:03.418372    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-6b8b4f7dbd-pmzhq"
Mar 20 15:53:03 crc kubenswrapper[4730]: I0320 15:53:03.419836    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-w67vt" event={"ID":"5db89423-34f0-46c3-9dcf-2179c6c6f42a","Type":"ContainerStarted","Data":"003e343a94dd002b77923c0bf46adad79ef3ccdd07ba9e75c36ca001f3f83214"}
Mar 20 15:53:03 crc kubenswrapper[4730]: I0320 15:53:03.421163    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-85c477fd8f-d52bp" event={"ID":"c12d2a2b-f7db-41be-89e1-97869c8119c2","Type":"ContainerStarted","Data":"b77e7de2158f5672a943cd651cb96e011621e76175ed6cd68e1c500505b74a87"}
Mar 20 15:53:03 crc kubenswrapper[4730]: I0320 15:53:03.422550    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-85c477fd8f-7ttsl" event={"ID":"7520ba92-5020-48d1-8d1c-fa20f0f407be","Type":"ContainerStarted","Data":"93c5db9de95b72bb550f0e835cfd0d2bc9c44293ec8c3368581c658a985dae9a"}
Mar 20 15:53:03 crc kubenswrapper[4730]: I0320 15:53:03.432890    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-6dd7dd855f-nh5dg" podStartSLOduration=2.007650781 podStartE2EDuration="12.43287103s" podCreationTimestamp="2026-03-20 15:52:51 +0000 UTC" firstStartedPulling="2026-03-20 15:52:52.163471061 +0000 UTC m=+831.376842430" lastFinishedPulling="2026-03-20 15:53:02.58869131 +0000 UTC m=+841.802062679" observedRunningTime="2026-03-20 15:53:03.430465122 +0000 UTC m=+842.643836481" watchObservedRunningTime="2026-03-20 15:53:03.43287103 +0000 UTC m=+842.646242399"
Mar 20 15:53:03 crc kubenswrapper[4730]: I0320 15:53:03.446534    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-8ff7d675-w67vt" podStartSLOduration=2.357008322 podStartE2EDuration="13.446515297s" podCreationTimestamp="2026-03-20 15:52:50 +0000 UTC" firstStartedPulling="2026-03-20 15:52:51.44048397 +0000 UTC m=+830.653855349" lastFinishedPulling="2026-03-20 15:53:02.529990955 +0000 UTC m=+841.743362324" observedRunningTime="2026-03-20 15:53:03.44590969 +0000 UTC m=+842.659281059" watchObservedRunningTime="2026-03-20 15:53:03.446515297 +0000 UTC m=+842.659886656"
Mar 20 15:53:03 crc kubenswrapper[4730]: I0320 15:53:03.478544    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-85c477fd8f-d52bp" podStartSLOduration=1.835109015 podStartE2EDuration="12.478524825s" podCreationTimestamp="2026-03-20 15:52:51 +0000 UTC" firstStartedPulling="2026-03-20 15:52:51.888475619 +0000 UTC m=+831.101846998" lastFinishedPulling="2026-03-20 15:53:02.531891439 +0000 UTC m=+841.745262808" observedRunningTime="2026-03-20 15:53:03.473398889 +0000 UTC m=+842.686770278" watchObservedRunningTime="2026-03-20 15:53:03.478524825 +0000 UTC m=+842.691896194"
Mar 20 15:53:03 crc kubenswrapper[4730]: I0320 15:53:03.506645    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-6dd7dd855f-nh5dg"
Mar 20 15:53:03 crc kubenswrapper[4730]: I0320 15:53:03.513504    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-85c477fd8f-7ttsl" podStartSLOduration=1.9424532110000001 podStartE2EDuration="12.513482497s" podCreationTimestamp="2026-03-20 15:52:51 +0000 UTC" firstStartedPulling="2026-03-20 15:52:51.979371868 +0000 UTC m=+831.192743237" lastFinishedPulling="2026-03-20 15:53:02.550401154 +0000 UTC m=+841.763772523" observedRunningTime="2026-03-20 15:53:03.507981061 +0000 UTC m=+842.721352420" watchObservedRunningTime="2026-03-20 15:53:03.513482497 +0000 UTC m=+842.726853866"
Mar 20 15:53:03 crc kubenswrapper[4730]: I0320 15:53:03.526778    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-6b8b4f7dbd-pmzhq" podStartSLOduration=1.5637438160000001 podStartE2EDuration="11.526761203s" podCreationTimestamp="2026-03-20 15:52:52 +0000 UTC" firstStartedPulling="2026-03-20 15:52:52.591267667 +0000 UTC m=+831.804639036" lastFinishedPulling="2026-03-20 15:53:02.554285054 +0000 UTC m=+841.767656423" observedRunningTime="2026-03-20 15:53:03.526734013 +0000 UTC m=+842.740105382" watchObservedRunningTime="2026-03-20 15:53:03.526761203 +0000 UTC m=+842.740132562"
Mar 20 15:53:12 crc kubenswrapper[4730]: I0320 15:53:12.398238    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-6b8b4f7dbd-pmzhq"
Mar 20 15:53:28 crc kubenswrapper[4730]: I0320 15:53:28.350086    4730 scope.go:117] "RemoveContainer" containerID="b5ebe6b01434979e266e3872ff5405b028a732d1dd5830a3d6f3ad270518946a"
Mar 20 15:53:28 crc kubenswrapper[4730]: I0320 15:53:28.927445    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9"]
Mar 20 15:53:28 crc kubenswrapper[4730]: I0320 15:53:28.928494    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9"
Mar 20 15:53:28 crc kubenswrapper[4730]: I0320 15:53:28.930320    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 20 15:53:28 crc kubenswrapper[4730]: I0320 15:53:28.943515    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9"]
Mar 20 15:53:29 crc kubenswrapper[4730]: I0320 15:53:29.061322    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6caa320c-cdca-4f52-aac0-b5c3325396db-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9\" (UID: \"6caa320c-cdca-4f52-aac0-b5c3325396db\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9"
Mar 20 15:53:29 crc kubenswrapper[4730]: I0320 15:53:29.061405    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6caa320c-cdca-4f52-aac0-b5c3325396db-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9\" (UID: \"6caa320c-cdca-4f52-aac0-b5c3325396db\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9"
Mar 20 15:53:29 crc kubenswrapper[4730]: I0320 15:53:29.061560    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k8w4\" (UniqueName: \"kubernetes.io/projected/6caa320c-cdca-4f52-aac0-b5c3325396db-kube-api-access-4k8w4\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9\" (UID: \"6caa320c-cdca-4f52-aac0-b5c3325396db\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9"
Mar 20 15:53:29 crc kubenswrapper[4730]: I0320 15:53:29.162807    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6caa320c-cdca-4f52-aac0-b5c3325396db-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9\" (UID: \"6caa320c-cdca-4f52-aac0-b5c3325396db\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9"
Mar 20 15:53:29 crc kubenswrapper[4730]: I0320 15:53:29.162864    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4k8w4\" (UniqueName: \"kubernetes.io/projected/6caa320c-cdca-4f52-aac0-b5c3325396db-kube-api-access-4k8w4\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9\" (UID: \"6caa320c-cdca-4f52-aac0-b5c3325396db\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9"
Mar 20 15:53:29 crc kubenswrapper[4730]: I0320 15:53:29.162911    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6caa320c-cdca-4f52-aac0-b5c3325396db-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9\" (UID: \"6caa320c-cdca-4f52-aac0-b5c3325396db\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9"
Mar 20 15:53:29 crc kubenswrapper[4730]: I0320 15:53:29.163347    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6caa320c-cdca-4f52-aac0-b5c3325396db-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9\" (UID: \"6caa320c-cdca-4f52-aac0-b5c3325396db\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9"
Mar 20 15:53:29 crc kubenswrapper[4730]: I0320 15:53:29.163437    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6caa320c-cdca-4f52-aac0-b5c3325396db-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9\" (UID: \"6caa320c-cdca-4f52-aac0-b5c3325396db\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9"
Mar 20 15:53:29 crc kubenswrapper[4730]: I0320 15:53:29.181650    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k8w4\" (UniqueName: \"kubernetes.io/projected/6caa320c-cdca-4f52-aac0-b5c3325396db-kube-api-access-4k8w4\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9\" (UID: \"6caa320c-cdca-4f52-aac0-b5c3325396db\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9"
Mar 20 15:53:29 crc kubenswrapper[4730]: I0320 15:53:29.244417    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9"
Mar 20 15:53:29 crc kubenswrapper[4730]: I0320 15:53:29.676200    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9"]
Mar 20 15:53:30 crc kubenswrapper[4730]: I0320 15:53:30.576485    4730 generic.go:334] "Generic (PLEG): container finished" podID="6caa320c-cdca-4f52-aac0-b5c3325396db" containerID="f3e0533f6d3c39314c123d1b22008e9474bf3e93a3210aaa3ac390f338322834" exitCode=0
Mar 20 15:53:30 crc kubenswrapper[4730]: I0320 15:53:30.576537    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9" event={"ID":"6caa320c-cdca-4f52-aac0-b5c3325396db","Type":"ContainerDied","Data":"f3e0533f6d3c39314c123d1b22008e9474bf3e93a3210aaa3ac390f338322834"}
Mar 20 15:53:30 crc kubenswrapper[4730]: I0320 15:53:30.576589    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9" event={"ID":"6caa320c-cdca-4f52-aac0-b5c3325396db","Type":"ContainerStarted","Data":"1d9fe02310e187ee215b30ee661f2d9f0de87857b32d76f159748ea35fbc0a01"}
Mar 20 15:53:30 crc kubenswrapper[4730]: I0320 15:53:30.578029    4730 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 15:53:31 crc kubenswrapper[4730]: I0320 15:53:31.305006    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p4mms"]
Mar 20 15:53:31 crc kubenswrapper[4730]: I0320 15:53:31.310151    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p4mms"
Mar 20 15:53:31 crc kubenswrapper[4730]: I0320 15:53:31.335315    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p4mms"]
Mar 20 15:53:31 crc kubenswrapper[4730]: I0320 15:53:31.596214    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f51e780f-650b-45d8-a2c3-b6b73ce74c61-catalog-content\") pod \"redhat-operators-p4mms\" (UID: \"f51e780f-650b-45d8-a2c3-b6b73ce74c61\") " pod="openshift-marketplace/redhat-operators-p4mms"
Mar 20 15:53:31 crc kubenswrapper[4730]: I0320 15:53:31.596353    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f51e780f-650b-45d8-a2c3-b6b73ce74c61-utilities\") pod \"redhat-operators-p4mms\" (UID: \"f51e780f-650b-45d8-a2c3-b6b73ce74c61\") " pod="openshift-marketplace/redhat-operators-p4mms"
Mar 20 15:53:31 crc kubenswrapper[4730]: I0320 15:53:31.596382    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55wwc\" (UniqueName: \"kubernetes.io/projected/f51e780f-650b-45d8-a2c3-b6b73ce74c61-kube-api-access-55wwc\") pod \"redhat-operators-p4mms\" (UID: \"f51e780f-650b-45d8-a2c3-b6b73ce74c61\") " pod="openshift-marketplace/redhat-operators-p4mms"
Mar 20 15:53:31 crc kubenswrapper[4730]: I0320 15:53:31.697501    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f51e780f-650b-45d8-a2c3-b6b73ce74c61-catalog-content\") pod \"redhat-operators-p4mms\" (UID: \"f51e780f-650b-45d8-a2c3-b6b73ce74c61\") " pod="openshift-marketplace/redhat-operators-p4mms"
Mar 20 15:53:31 crc kubenswrapper[4730]: I0320 15:53:31.697622    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f51e780f-650b-45d8-a2c3-b6b73ce74c61-utilities\") pod \"redhat-operators-p4mms\" (UID: \"f51e780f-650b-45d8-a2c3-b6b73ce74c61\") " pod="openshift-marketplace/redhat-operators-p4mms"
Mar 20 15:53:31 crc kubenswrapper[4730]: I0320 15:53:31.697646    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55wwc\" (UniqueName: \"kubernetes.io/projected/f51e780f-650b-45d8-a2c3-b6b73ce74c61-kube-api-access-55wwc\") pod \"redhat-operators-p4mms\" (UID: \"f51e780f-650b-45d8-a2c3-b6b73ce74c61\") " pod="openshift-marketplace/redhat-operators-p4mms"
Mar 20 15:53:31 crc kubenswrapper[4730]: I0320 15:53:31.698492    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f51e780f-650b-45d8-a2c3-b6b73ce74c61-catalog-content\") pod \"redhat-operators-p4mms\" (UID: \"f51e780f-650b-45d8-a2c3-b6b73ce74c61\") " pod="openshift-marketplace/redhat-operators-p4mms"
Mar 20 15:53:31 crc kubenswrapper[4730]: I0320 15:53:31.698499    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f51e780f-650b-45d8-a2c3-b6b73ce74c61-utilities\") pod \"redhat-operators-p4mms\" (UID: \"f51e780f-650b-45d8-a2c3-b6b73ce74c61\") " pod="openshift-marketplace/redhat-operators-p4mms"
Mar 20 15:53:31 crc kubenswrapper[4730]: I0320 15:53:31.729895    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55wwc\" (UniqueName: \"kubernetes.io/projected/f51e780f-650b-45d8-a2c3-b6b73ce74c61-kube-api-access-55wwc\") pod \"redhat-operators-p4mms\" (UID: \"f51e780f-650b-45d8-a2c3-b6b73ce74c61\") " pod="openshift-marketplace/redhat-operators-p4mms"
Mar 20 15:53:31 crc kubenswrapper[4730]: I0320 15:53:31.938219    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p4mms"
Mar 20 15:53:32 crc kubenswrapper[4730]: I0320 15:53:32.162818    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p4mms"]
Mar 20 15:53:32 crc kubenswrapper[4730]: W0320 15:53:32.174416    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf51e780f_650b_45d8_a2c3_b6b73ce74c61.slice/crio-c07e01328d7c5f71cfb88d67fc57544826757077ff694bb1f7eb716cf5cf5bd7 WatchSource:0}: Error finding container c07e01328d7c5f71cfb88d67fc57544826757077ff694bb1f7eb716cf5cf5bd7: Status 404 returned error can't find the container with id c07e01328d7c5f71cfb88d67fc57544826757077ff694bb1f7eb716cf5cf5bd7
Mar 20 15:53:32 crc kubenswrapper[4730]: I0320 15:53:32.587408    4730 generic.go:334] "Generic (PLEG): container finished" podID="f51e780f-650b-45d8-a2c3-b6b73ce74c61" containerID="a9b15d45bb34224d6766d66172ba74c36da9471e707ae2a90cea12d0f5f9aca2" exitCode=0
Mar 20 15:53:32 crc kubenswrapper[4730]: I0320 15:53:32.587487    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4mms" event={"ID":"f51e780f-650b-45d8-a2c3-b6b73ce74c61","Type":"ContainerDied","Data":"a9b15d45bb34224d6766d66172ba74c36da9471e707ae2a90cea12d0f5f9aca2"}
Mar 20 15:53:32 crc kubenswrapper[4730]: I0320 15:53:32.587523    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4mms" event={"ID":"f51e780f-650b-45d8-a2c3-b6b73ce74c61","Type":"ContainerStarted","Data":"c07e01328d7c5f71cfb88d67fc57544826757077ff694bb1f7eb716cf5cf5bd7"}
Mar 20 15:53:32 crc kubenswrapper[4730]: I0320 15:53:32.590139    4730 generic.go:334] "Generic (PLEG): container finished" podID="6caa320c-cdca-4f52-aac0-b5c3325396db" containerID="55a20d9cb2b67f689bac1999ea91b76b3f6b8df389a04a628d2ce58c841ccc10" exitCode=0
Mar 20 15:53:32 crc kubenswrapper[4730]: I0320 15:53:32.590186    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9" event={"ID":"6caa320c-cdca-4f52-aac0-b5c3325396db","Type":"ContainerDied","Data":"55a20d9cb2b67f689bac1999ea91b76b3f6b8df389a04a628d2ce58c841ccc10"}
Mar 20 15:53:33 crc kubenswrapper[4730]: I0320 15:53:33.598711    4730 generic.go:334] "Generic (PLEG): container finished" podID="6caa320c-cdca-4f52-aac0-b5c3325396db" containerID="f50153e25970fd5b818891231f77a23e0800ff162b8e53679d315a70b67ad26d" exitCode=0
Mar 20 15:53:33 crc kubenswrapper[4730]: I0320 15:53:33.598813    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9" event={"ID":"6caa320c-cdca-4f52-aac0-b5c3325396db","Type":"ContainerDied","Data":"f50153e25970fd5b818891231f77a23e0800ff162b8e53679d315a70b67ad26d"}
Mar 20 15:53:33 crc kubenswrapper[4730]: I0320 15:53:33.601047    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4mms" event={"ID":"f51e780f-650b-45d8-a2c3-b6b73ce74c61","Type":"ContainerStarted","Data":"f6fd4d176812b359067e665f23250f950ab5e29084994d678328733032dac222"}
Mar 20 15:53:34 crc kubenswrapper[4730]: I0320 15:53:34.608448    4730 generic.go:334] "Generic (PLEG): container finished" podID="f51e780f-650b-45d8-a2c3-b6b73ce74c61" containerID="f6fd4d176812b359067e665f23250f950ab5e29084994d678328733032dac222" exitCode=0
Mar 20 15:53:34 crc kubenswrapper[4730]: I0320 15:53:34.609419    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4mms" event={"ID":"f51e780f-650b-45d8-a2c3-b6b73ce74c61","Type":"ContainerDied","Data":"f6fd4d176812b359067e665f23250f950ab5e29084994d678328733032dac222"}
Mar 20 15:53:34 crc kubenswrapper[4730]: I0320 15:53:34.846117    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9"
Mar 20 15:53:35 crc kubenswrapper[4730]: I0320 15:53:35.036878    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6caa320c-cdca-4f52-aac0-b5c3325396db-bundle\") pod \"6caa320c-cdca-4f52-aac0-b5c3325396db\" (UID: \"6caa320c-cdca-4f52-aac0-b5c3325396db\") "
Mar 20 15:53:35 crc kubenswrapper[4730]: I0320 15:53:35.036918    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6caa320c-cdca-4f52-aac0-b5c3325396db-util\") pod \"6caa320c-cdca-4f52-aac0-b5c3325396db\" (UID: \"6caa320c-cdca-4f52-aac0-b5c3325396db\") "
Mar 20 15:53:35 crc kubenswrapper[4730]: I0320 15:53:35.036982    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4k8w4\" (UniqueName: \"kubernetes.io/projected/6caa320c-cdca-4f52-aac0-b5c3325396db-kube-api-access-4k8w4\") pod \"6caa320c-cdca-4f52-aac0-b5c3325396db\" (UID: \"6caa320c-cdca-4f52-aac0-b5c3325396db\") "
Mar 20 15:53:35 crc kubenswrapper[4730]: I0320 15:53:35.037594    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6caa320c-cdca-4f52-aac0-b5c3325396db-bundle" (OuterVolumeSpecName: "bundle") pod "6caa320c-cdca-4f52-aac0-b5c3325396db" (UID: "6caa320c-cdca-4f52-aac0-b5c3325396db"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:53:35 crc kubenswrapper[4730]: I0320 15:53:35.043422    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6caa320c-cdca-4f52-aac0-b5c3325396db-kube-api-access-4k8w4" (OuterVolumeSpecName: "kube-api-access-4k8w4") pod "6caa320c-cdca-4f52-aac0-b5c3325396db" (UID: "6caa320c-cdca-4f52-aac0-b5c3325396db"). InnerVolumeSpecName "kube-api-access-4k8w4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:53:35 crc kubenswrapper[4730]: I0320 15:53:35.050546    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6caa320c-cdca-4f52-aac0-b5c3325396db-util" (OuterVolumeSpecName: "util") pod "6caa320c-cdca-4f52-aac0-b5c3325396db" (UID: "6caa320c-cdca-4f52-aac0-b5c3325396db"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:53:35 crc kubenswrapper[4730]: I0320 15:53:35.138746    4730 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6caa320c-cdca-4f52-aac0-b5c3325396db-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:53:35 crc kubenswrapper[4730]: I0320 15:53:35.138807    4730 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6caa320c-cdca-4f52-aac0-b5c3325396db-util\") on node \"crc\" DevicePath \"\""
Mar 20 15:53:35 crc kubenswrapper[4730]: I0320 15:53:35.138833    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4k8w4\" (UniqueName: \"kubernetes.io/projected/6caa320c-cdca-4f52-aac0-b5c3325396db-kube-api-access-4k8w4\") on node \"crc\" DevicePath \"\""
Mar 20 15:53:35 crc kubenswrapper[4730]: I0320 15:53:35.630461    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4mms" event={"ID":"f51e780f-650b-45d8-a2c3-b6b73ce74c61","Type":"ContainerStarted","Data":"dab2914bb52dccdd630edde42af76c8a7122f9d5ae1d800cd9af1744c1e01631"}
Mar 20 15:53:35 crc kubenswrapper[4730]: I0320 15:53:35.637502    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9" event={"ID":"6caa320c-cdca-4f52-aac0-b5c3325396db","Type":"ContainerDied","Data":"1d9fe02310e187ee215b30ee661f2d9f0de87857b32d76f159748ea35fbc0a01"}
Mar 20 15:53:35 crc kubenswrapper[4730]: I0320 15:53:35.637719    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d9fe02310e187ee215b30ee661f2d9f0de87857b32d76f159748ea35fbc0a01"
Mar 20 15:53:35 crc kubenswrapper[4730]: I0320 15:53:35.637913    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9"
Mar 20 15:53:35 crc kubenswrapper[4730]: I0320 15:53:35.657833    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p4mms" podStartSLOduration=2.110061269 podStartE2EDuration="4.657817117s" podCreationTimestamp="2026-03-20 15:53:31 +0000 UTC" firstStartedPulling="2026-03-20 15:53:32.589079423 +0000 UTC m=+871.802450792" lastFinishedPulling="2026-03-20 15:53:35.136835231 +0000 UTC m=+874.350206640" observedRunningTime="2026-03-20 15:53:35.65445351 +0000 UTC m=+874.867824879" watchObservedRunningTime="2026-03-20 15:53:35.657817117 +0000 UTC m=+874.871188486"
Mar 20 15:53:37 crc kubenswrapper[4730]: I0320 15:53:37.881400    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-2qcpg"]
Mar 20 15:53:37 crc kubenswrapper[4730]: E0320 15:53:37.882028    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6caa320c-cdca-4f52-aac0-b5c3325396db" containerName="util"
Mar 20 15:53:37 crc kubenswrapper[4730]: I0320 15:53:37.882047    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="6caa320c-cdca-4f52-aac0-b5c3325396db" containerName="util"
Mar 20 15:53:37 crc kubenswrapper[4730]: E0320 15:53:37.882077    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6caa320c-cdca-4f52-aac0-b5c3325396db" containerName="pull"
Mar 20 15:53:37 crc kubenswrapper[4730]: I0320 15:53:37.882088    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="6caa320c-cdca-4f52-aac0-b5c3325396db" containerName="pull"
Mar 20 15:53:37 crc kubenswrapper[4730]: E0320 15:53:37.882114    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6caa320c-cdca-4f52-aac0-b5c3325396db" containerName="extract"
Mar 20 15:53:37 crc kubenswrapper[4730]: I0320 15:53:37.882125    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="6caa320c-cdca-4f52-aac0-b5c3325396db" containerName="extract"
Mar 20 15:53:37 crc kubenswrapper[4730]: I0320 15:53:37.882302    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="6caa320c-cdca-4f52-aac0-b5c3325396db" containerName="extract"
Mar 20 15:53:37 crc kubenswrapper[4730]: I0320 15:53:37.882906    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-2qcpg"
Mar 20 15:53:37 crc kubenswrapper[4730]: I0320 15:53:37.886038    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Mar 20 15:53:37 crc kubenswrapper[4730]: I0320 15:53:37.886058    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-cwsbf"
Mar 20 15:53:37 crc kubenswrapper[4730]: I0320 15:53:37.886422    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Mar 20 15:53:37 crc kubenswrapper[4730]: I0320 15:53:37.908108    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-2qcpg"]
Mar 20 15:53:37 crc kubenswrapper[4730]: I0320 15:53:37.974559    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ssqv\" (UniqueName: \"kubernetes.io/projected/e3bdfb07-3f68-4262-8116-44b5ea591644-kube-api-access-8ssqv\") pod \"nmstate-operator-796d4cfff4-2qcpg\" (UID: \"e3bdfb07-3f68-4262-8116-44b5ea591644\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-2qcpg"
Mar 20 15:53:38 crc kubenswrapper[4730]: I0320 15:53:38.075845    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ssqv\" (UniqueName: \"kubernetes.io/projected/e3bdfb07-3f68-4262-8116-44b5ea591644-kube-api-access-8ssqv\") pod \"nmstate-operator-796d4cfff4-2qcpg\" (UID: \"e3bdfb07-3f68-4262-8116-44b5ea591644\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-2qcpg"
Mar 20 15:53:38 crc kubenswrapper[4730]: I0320 15:53:38.102043    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ssqv\" (UniqueName: \"kubernetes.io/projected/e3bdfb07-3f68-4262-8116-44b5ea591644-kube-api-access-8ssqv\") pod \"nmstate-operator-796d4cfff4-2qcpg\" (UID: \"e3bdfb07-3f68-4262-8116-44b5ea591644\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-2qcpg"
Mar 20 15:53:38 crc kubenswrapper[4730]: I0320 15:53:38.203915    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-2qcpg"
Mar 20 15:53:38 crc kubenswrapper[4730]: I0320 15:53:38.688048    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-2qcpg"]
Mar 20 15:53:39 crc kubenswrapper[4730]: I0320 15:53:39.657445    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-2qcpg" event={"ID":"e3bdfb07-3f68-4262-8116-44b5ea591644","Type":"ContainerStarted","Data":"a4a50bfe819fc07c9c4b0f05d1cb2bbbb4acbed49d72f5c9bb9d4579267c7e17"}
Mar 20 15:53:41 crc kubenswrapper[4730]: I0320 15:53:41.939191    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p4mms"
Mar 20 15:53:41 crc kubenswrapper[4730]: I0320 15:53:41.939559    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p4mms"
Mar 20 15:53:41 crc kubenswrapper[4730]: I0320 15:53:41.977830    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p4mms"
Mar 20 15:53:42 crc kubenswrapper[4730]: I0320 15:53:42.678487    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-2qcpg" event={"ID":"e3bdfb07-3f68-4262-8116-44b5ea591644","Type":"ContainerStarted","Data":"3a876768ab3e96f444d6f3a493c8af504a7a4838ec7ea5e6f6224f5c1a0e4d1f"}
Mar 20 15:53:42 crc kubenswrapper[4730]: I0320 15:53:42.746641    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p4mms"
Mar 20 15:53:42 crc kubenswrapper[4730]: I0320 15:53:42.769233    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-2qcpg" podStartSLOduration=2.473302517 podStartE2EDuration="5.769209832s" podCreationTimestamp="2026-03-20 15:53:37 +0000 UTC" firstStartedPulling="2026-03-20 15:53:38.703121198 +0000 UTC m=+877.916492567" lastFinishedPulling="2026-03-20 15:53:41.999028513 +0000 UTC m=+881.212399882" observedRunningTime="2026-03-20 15:53:42.709393885 +0000 UTC m=+881.922765264" watchObservedRunningTime="2026-03-20 15:53:42.769209832 +0000 UTC m=+881.982581211"
Mar 20 15:53:42 crc kubenswrapper[4730]: I0320 15:53:42.880037    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 15:53:42 crc kubenswrapper[4730]: I0320 15:53:42.880087    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.723337    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-nfr9k"]
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.729813    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-nfr9k"
Mar 20 15:53:43 crc kubenswrapper[4730]: W0320 15:53:43.731189    4730 reflector.go:561] object-"openshift-nmstate"/"nmstate-handler-dockercfg-vc8sk": failed to list *v1.Secret: secrets "nmstate-handler-dockercfg-vc8sk" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-nmstate": no relationship found between node 'crc' and this object
Mar 20 15:53:43 crc kubenswrapper[4730]: E0320 15:53:43.731277    4730 reflector.go:158] "Unhandled Error" err="object-\"openshift-nmstate\"/\"nmstate-handler-dockercfg-vc8sk\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"nmstate-handler-dockercfg-vc8sk\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-nmstate\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.731331    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-nq6dd"]
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.732468    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-nq6dd"
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.739196    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-nfr9k"]
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.740489    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.744986    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-6tdt2"]
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.761615    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-nq6dd"]
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.778727    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-6tdt2"
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.841169    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-nnrp6"]
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.842089    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nnrp6"
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.846537    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.846719    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.846831    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-s7htn"
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.851455    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-nnrp6"]
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.862150    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0a4f6fcf-7c76-49cf-8f3c-d83879a650f1-dbus-socket\") pod \"nmstate-handler-6tdt2\" (UID: \"0a4f6fcf-7c76-49cf-8f3c-d83879a650f1\") " pod="openshift-nmstate/nmstate-handler-6tdt2"
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.862202    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0f827638-33ac-4f99-920b-6e9b72db7955-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-nq6dd\" (UID: \"0f827638-33ac-4f99-920b-6e9b72db7955\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-nq6dd"
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.862240    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94hgn\" (UniqueName: \"kubernetes.io/projected/0f827638-33ac-4f99-920b-6e9b72db7955-kube-api-access-94hgn\") pod \"nmstate-webhook-5f558f5558-nq6dd\" (UID: \"0f827638-33ac-4f99-920b-6e9b72db7955\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-nq6dd"
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.862293    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgv4s\" (UniqueName: \"kubernetes.io/projected/0a4f6fcf-7c76-49cf-8f3c-d83879a650f1-kube-api-access-lgv4s\") pod \"nmstate-handler-6tdt2\" (UID: \"0a4f6fcf-7c76-49cf-8f3c-d83879a650f1\") " pod="openshift-nmstate/nmstate-handler-6tdt2"
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.862338    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0a4f6fcf-7c76-49cf-8f3c-d83879a650f1-nmstate-lock\") pod \"nmstate-handler-6tdt2\" (UID: \"0a4f6fcf-7c76-49cf-8f3c-d83879a650f1\") " pod="openshift-nmstate/nmstate-handler-6tdt2"
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.862362    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p9rd\" (UniqueName: \"kubernetes.io/projected/3f50a695-6f8b-42e6-aa4f-3dfd888b6afa-kube-api-access-2p9rd\") pod \"nmstate-metrics-9b8c8685d-nfr9k\" (UID: \"3f50a695-6f8b-42e6-aa4f-3dfd888b6afa\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-nfr9k"
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.862387    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0a4f6fcf-7c76-49cf-8f3c-d83879a650f1-ovs-socket\") pod \"nmstate-handler-6tdt2\" (UID: \"0a4f6fcf-7c76-49cf-8f3c-d83879a650f1\") " pod="openshift-nmstate/nmstate-handler-6tdt2"
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.963112    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94hgn\" (UniqueName: \"kubernetes.io/projected/0f827638-33ac-4f99-920b-6e9b72db7955-kube-api-access-94hgn\") pod \"nmstate-webhook-5f558f5558-nq6dd\" (UID: \"0f827638-33ac-4f99-920b-6e9b72db7955\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-nq6dd"
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.963167    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgv4s\" (UniqueName: \"kubernetes.io/projected/0a4f6fcf-7c76-49cf-8f3c-d83879a650f1-kube-api-access-lgv4s\") pod \"nmstate-handler-6tdt2\" (UID: \"0a4f6fcf-7c76-49cf-8f3c-d83879a650f1\") " pod="openshift-nmstate/nmstate-handler-6tdt2"
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.963196    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g5nq\" (UniqueName: \"kubernetes.io/projected/663e9228-322c-4d6a-8988-0033d5dd587a-kube-api-access-7g5nq\") pod \"nmstate-console-plugin-86f58fcf4-nnrp6\" (UID: \"663e9228-322c-4d6a-8988-0033d5dd587a\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nnrp6"
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.963217    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0a4f6fcf-7c76-49cf-8f3c-d83879a650f1-nmstate-lock\") pod \"nmstate-handler-6tdt2\" (UID: \"0a4f6fcf-7c76-49cf-8f3c-d83879a650f1\") " pod="openshift-nmstate/nmstate-handler-6tdt2"
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.963235    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p9rd\" (UniqueName: \"kubernetes.io/projected/3f50a695-6f8b-42e6-aa4f-3dfd888b6afa-kube-api-access-2p9rd\") pod \"nmstate-metrics-9b8c8685d-nfr9k\" (UID: \"3f50a695-6f8b-42e6-aa4f-3dfd888b6afa\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-nfr9k"
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.963294    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0a4f6fcf-7c76-49cf-8f3c-d83879a650f1-ovs-socket\") pod \"nmstate-handler-6tdt2\" (UID: \"0a4f6fcf-7c76-49cf-8f3c-d83879a650f1\") " pod="openshift-nmstate/nmstate-handler-6tdt2"
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.963321    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/663e9228-322c-4d6a-8988-0033d5dd587a-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-nnrp6\" (UID: \"663e9228-322c-4d6a-8988-0033d5dd587a\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nnrp6"
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.963368    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/663e9228-322c-4d6a-8988-0033d5dd587a-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-nnrp6\" (UID: \"663e9228-322c-4d6a-8988-0033d5dd587a\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nnrp6"
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.963397    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0a4f6fcf-7c76-49cf-8f3c-d83879a650f1-dbus-socket\") pod \"nmstate-handler-6tdt2\" (UID: \"0a4f6fcf-7c76-49cf-8f3c-d83879a650f1\") " pod="openshift-nmstate/nmstate-handler-6tdt2"
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.963427    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0f827638-33ac-4f99-920b-6e9b72db7955-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-nq6dd\" (UID: \"0f827638-33ac-4f99-920b-6e9b72db7955\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-nq6dd"
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.964270    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0a4f6fcf-7c76-49cf-8f3c-d83879a650f1-ovs-socket\") pod \"nmstate-handler-6tdt2\" (UID: \"0a4f6fcf-7c76-49cf-8f3c-d83879a650f1\") " pod="openshift-nmstate/nmstate-handler-6tdt2"
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.964432    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0a4f6fcf-7c76-49cf-8f3c-d83879a650f1-nmstate-lock\") pod \"nmstate-handler-6tdt2\" (UID: \"0a4f6fcf-7c76-49cf-8f3c-d83879a650f1\") " pod="openshift-nmstate/nmstate-handler-6tdt2"
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.964634    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0a4f6fcf-7c76-49cf-8f3c-d83879a650f1-dbus-socket\") pod \"nmstate-handler-6tdt2\" (UID: \"0a4f6fcf-7c76-49cf-8f3c-d83879a650f1\") " pod="openshift-nmstate/nmstate-handler-6tdt2"
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.979576    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0f827638-33ac-4f99-920b-6e9b72db7955-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-nq6dd\" (UID: \"0f827638-33ac-4f99-920b-6e9b72db7955\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-nq6dd"
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.987175    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94hgn\" (UniqueName: \"kubernetes.io/projected/0f827638-33ac-4f99-920b-6e9b72db7955-kube-api-access-94hgn\") pod \"nmstate-webhook-5f558f5558-nq6dd\" (UID: \"0f827638-33ac-4f99-920b-6e9b72db7955\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-nq6dd"
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.987744    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgv4s\" (UniqueName: \"kubernetes.io/projected/0a4f6fcf-7c76-49cf-8f3c-d83879a650f1-kube-api-access-lgv4s\") pod \"nmstate-handler-6tdt2\" (UID: \"0a4f6fcf-7c76-49cf-8f3c-d83879a650f1\") " pod="openshift-nmstate/nmstate-handler-6tdt2"
Mar 20 15:53:43 crc kubenswrapper[4730]: I0320 15:53:43.993070    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p9rd\" (UniqueName: \"kubernetes.io/projected/3f50a695-6f8b-42e6-aa4f-3dfd888b6afa-kube-api-access-2p9rd\") pod \"nmstate-metrics-9b8c8685d-nfr9k\" (UID: \"3f50a695-6f8b-42e6-aa4f-3dfd888b6afa\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-nfr9k"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.029108    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-854c989bdc-94fm2"]
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.029814    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-854c989bdc-94fm2"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.048528    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-854c989bdc-94fm2"]
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.064081    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/663e9228-322c-4d6a-8988-0033d5dd587a-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-nnrp6\" (UID: \"663e9228-322c-4d6a-8988-0033d5dd587a\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nnrp6"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.064158    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g5nq\" (UniqueName: \"kubernetes.io/projected/663e9228-322c-4d6a-8988-0033d5dd587a-kube-api-access-7g5nq\") pod \"nmstate-console-plugin-86f58fcf4-nnrp6\" (UID: \"663e9228-322c-4d6a-8988-0033d5dd587a\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nnrp6"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.064195    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/663e9228-322c-4d6a-8988-0033d5dd587a-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-nnrp6\" (UID: \"663e9228-322c-4d6a-8988-0033d5dd587a\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nnrp6"
Mar 20 15:53:44 crc kubenswrapper[4730]: E0320 15:53:44.064513    4730 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found
Mar 20 15:53:44 crc kubenswrapper[4730]: E0320 15:53:44.064656    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/663e9228-322c-4d6a-8988-0033d5dd587a-plugin-serving-cert podName:663e9228-322c-4d6a-8988-0033d5dd587a nodeName:}" failed. No retries permitted until 2026-03-20 15:53:44.564637861 +0000 UTC m=+883.778009230 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/663e9228-322c-4d6a-8988-0033d5dd587a-plugin-serving-cert") pod "nmstate-console-plugin-86f58fcf4-nnrp6" (UID: "663e9228-322c-4d6a-8988-0033d5dd587a") : secret "plugin-serving-cert" not found
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.065066    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/663e9228-322c-4d6a-8988-0033d5dd587a-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-nnrp6\" (UID: \"663e9228-322c-4d6a-8988-0033d5dd587a\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nnrp6"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.079905    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g5nq\" (UniqueName: \"kubernetes.io/projected/663e9228-322c-4d6a-8988-0033d5dd587a-kube-api-access-7g5nq\") pod \"nmstate-console-plugin-86f58fcf4-nnrp6\" (UID: \"663e9228-322c-4d6a-8988-0033d5dd587a\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nnrp6"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.165427    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f423bde-b761-4c2a-8519-805e6e44d099-trusted-ca-bundle\") pod \"console-854c989bdc-94fm2\" (UID: \"1f423bde-b761-4c2a-8519-805e6e44d099\") " pod="openshift-console/console-854c989bdc-94fm2"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.165523    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1f423bde-b761-4c2a-8519-805e6e44d099-console-oauth-config\") pod \"console-854c989bdc-94fm2\" (UID: \"1f423bde-b761-4c2a-8519-805e6e44d099\") " pod="openshift-console/console-854c989bdc-94fm2"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.165569    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f423bde-b761-4c2a-8519-805e6e44d099-console-serving-cert\") pod \"console-854c989bdc-94fm2\" (UID: \"1f423bde-b761-4c2a-8519-805e6e44d099\") " pod="openshift-console/console-854c989bdc-94fm2"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.165601    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1f423bde-b761-4c2a-8519-805e6e44d099-service-ca\") pod \"console-854c989bdc-94fm2\" (UID: \"1f423bde-b761-4c2a-8519-805e6e44d099\") " pod="openshift-console/console-854c989bdc-94fm2"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.165702    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1f423bde-b761-4c2a-8519-805e6e44d099-oauth-serving-cert\") pod \"console-854c989bdc-94fm2\" (UID: \"1f423bde-b761-4c2a-8519-805e6e44d099\") " pod="openshift-console/console-854c989bdc-94fm2"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.165844    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngtgw\" (UniqueName: \"kubernetes.io/projected/1f423bde-b761-4c2a-8519-805e6e44d099-kube-api-access-ngtgw\") pod \"console-854c989bdc-94fm2\" (UID: \"1f423bde-b761-4c2a-8519-805e6e44d099\") " pod="openshift-console/console-854c989bdc-94fm2"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.165926    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1f423bde-b761-4c2a-8519-805e6e44d099-console-config\") pod \"console-854c989bdc-94fm2\" (UID: \"1f423bde-b761-4c2a-8519-805e6e44d099\") " pod="openshift-console/console-854c989bdc-94fm2"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.266796    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f423bde-b761-4c2a-8519-805e6e44d099-trusted-ca-bundle\") pod \"console-854c989bdc-94fm2\" (UID: \"1f423bde-b761-4c2a-8519-805e6e44d099\") " pod="openshift-console/console-854c989bdc-94fm2"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.266853    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1f423bde-b761-4c2a-8519-805e6e44d099-console-oauth-config\") pod \"console-854c989bdc-94fm2\" (UID: \"1f423bde-b761-4c2a-8519-805e6e44d099\") " pod="openshift-console/console-854c989bdc-94fm2"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.266882    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f423bde-b761-4c2a-8519-805e6e44d099-console-serving-cert\") pod \"console-854c989bdc-94fm2\" (UID: \"1f423bde-b761-4c2a-8519-805e6e44d099\") " pod="openshift-console/console-854c989bdc-94fm2"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.266907    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1f423bde-b761-4c2a-8519-805e6e44d099-service-ca\") pod \"console-854c989bdc-94fm2\" (UID: \"1f423bde-b761-4c2a-8519-805e6e44d099\") " pod="openshift-console/console-854c989bdc-94fm2"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.266922    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1f423bde-b761-4c2a-8519-805e6e44d099-oauth-serving-cert\") pod \"console-854c989bdc-94fm2\" (UID: \"1f423bde-b761-4c2a-8519-805e6e44d099\") " pod="openshift-console/console-854c989bdc-94fm2"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.266946    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngtgw\" (UniqueName: \"kubernetes.io/projected/1f423bde-b761-4c2a-8519-805e6e44d099-kube-api-access-ngtgw\") pod \"console-854c989bdc-94fm2\" (UID: \"1f423bde-b761-4c2a-8519-805e6e44d099\") " pod="openshift-console/console-854c989bdc-94fm2"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.266969    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1f423bde-b761-4c2a-8519-805e6e44d099-console-config\") pod \"console-854c989bdc-94fm2\" (UID: \"1f423bde-b761-4c2a-8519-805e6e44d099\") " pod="openshift-console/console-854c989bdc-94fm2"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.268794    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1f423bde-b761-4c2a-8519-805e6e44d099-oauth-serving-cert\") pod \"console-854c989bdc-94fm2\" (UID: \"1f423bde-b761-4c2a-8519-805e6e44d099\") " pod="openshift-console/console-854c989bdc-94fm2"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.268797    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1f423bde-b761-4c2a-8519-805e6e44d099-console-config\") pod \"console-854c989bdc-94fm2\" (UID: \"1f423bde-b761-4c2a-8519-805e6e44d099\") " pod="openshift-console/console-854c989bdc-94fm2"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.268951    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f423bde-b761-4c2a-8519-805e6e44d099-trusted-ca-bundle\") pod \"console-854c989bdc-94fm2\" (UID: \"1f423bde-b761-4c2a-8519-805e6e44d099\") " pod="openshift-console/console-854c989bdc-94fm2"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.269264    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1f423bde-b761-4c2a-8519-805e6e44d099-service-ca\") pod \"console-854c989bdc-94fm2\" (UID: \"1f423bde-b761-4c2a-8519-805e6e44d099\") " pod="openshift-console/console-854c989bdc-94fm2"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.270162    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f423bde-b761-4c2a-8519-805e6e44d099-console-serving-cert\") pod \"console-854c989bdc-94fm2\" (UID: \"1f423bde-b761-4c2a-8519-805e6e44d099\") " pod="openshift-console/console-854c989bdc-94fm2"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.271824    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1f423bde-b761-4c2a-8519-805e6e44d099-console-oauth-config\") pod \"console-854c989bdc-94fm2\" (UID: \"1f423bde-b761-4c2a-8519-805e6e44d099\") " pod="openshift-console/console-854c989bdc-94fm2"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.283900    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p4mms"]
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.288820    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngtgw\" (UniqueName: \"kubernetes.io/projected/1f423bde-b761-4c2a-8519-805e6e44d099-kube-api-access-ngtgw\") pod \"console-854c989bdc-94fm2\" (UID: \"1f423bde-b761-4c2a-8519-805e6e44d099\") " pod="openshift-console/console-854c989bdc-94fm2"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.347657    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-854c989bdc-94fm2"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.540002    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-vc8sk"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.541455    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-nfr9k"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.543815    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-6tdt2"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.549135    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-nq6dd"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.554421    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-854c989bdc-94fm2"]
Mar 20 15:53:44 crc kubenswrapper[4730]: W0320 15:53:44.568879    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f423bde_b761_4c2a_8519_805e6e44d099.slice/crio-d7a11f205b9b4e991c5697a502c331607ca7da7d297ef34f6d29dd58e050d5c0 WatchSource:0}: Error finding container d7a11f205b9b4e991c5697a502c331607ca7da7d297ef34f6d29dd58e050d5c0: Status 404 returned error can't find the container with id d7a11f205b9b4e991c5697a502c331607ca7da7d297ef34f6d29dd58e050d5c0
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.580105    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/663e9228-322c-4d6a-8988-0033d5dd587a-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-nnrp6\" (UID: \"663e9228-322c-4d6a-8988-0033d5dd587a\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nnrp6"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.585232    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/663e9228-322c-4d6a-8988-0033d5dd587a-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-nnrp6\" (UID: \"663e9228-322c-4d6a-8988-0033d5dd587a\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nnrp6"
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.691228    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-854c989bdc-94fm2" event={"ID":"1f423bde-b761-4c2a-8519-805e6e44d099","Type":"ContainerStarted","Data":"d7a11f205b9b4e991c5697a502c331607ca7da7d297ef34f6d29dd58e050d5c0"}
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.693532    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p4mms" podUID="f51e780f-650b-45d8-a2c3-b6b73ce74c61" containerName="registry-server" containerID="cri-o://dab2914bb52dccdd630edde42af76c8a7122f9d5ae1d800cd9af1744c1e01631" gracePeriod=2
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.693823    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-6tdt2" event={"ID":"0a4f6fcf-7c76-49cf-8f3c-d83879a650f1","Type":"ContainerStarted","Data":"12aa4f4ac579ae5563b0af0d1dfafd2ecfade139943f37fabf1ec3558b1e5c94"}
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.759131    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-nfr9k"]
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.761486    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nnrp6"
Mar 20 15:53:44 crc kubenswrapper[4730]: W0320 15:53:44.772352    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f50a695_6f8b_42e6_aa4f_3dfd888b6afa.slice/crio-f6577e316c5eb7230e82d9a69af5ad70a8d8731e367a973b3c69091fc1bfbe60 WatchSource:0}: Error finding container f6577e316c5eb7230e82d9a69af5ad70a8d8731e367a973b3c69091fc1bfbe60: Status 404 returned error can't find the container with id f6577e316c5eb7230e82d9a69af5ad70a8d8731e367a973b3c69091fc1bfbe60
Mar 20 15:53:44 crc kubenswrapper[4730]: I0320 15:53:44.827859    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-nq6dd"]
Mar 20 15:53:45 crc kubenswrapper[4730]: I0320 15:53:45.171425    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-nnrp6"]
Mar 20 15:53:45 crc kubenswrapper[4730]: W0320 15:53:45.175856    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod663e9228_322c_4d6a_8988_0033d5dd587a.slice/crio-d78d19844368244adb5bc78f42a72d85d41fc682757e6334e2b57ecc31edda9b WatchSource:0}: Error finding container d78d19844368244adb5bc78f42a72d85d41fc682757e6334e2b57ecc31edda9b: Status 404 returned error can't find the container with id d78d19844368244adb5bc78f42a72d85d41fc682757e6334e2b57ecc31edda9b
Mar 20 15:53:45 crc kubenswrapper[4730]: I0320 15:53:45.701113    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-nfr9k" event={"ID":"3f50a695-6f8b-42e6-aa4f-3dfd888b6afa","Type":"ContainerStarted","Data":"f6577e316c5eb7230e82d9a69af5ad70a8d8731e367a973b3c69091fc1bfbe60"}
Mar 20 15:53:45 crc kubenswrapper[4730]: I0320 15:53:45.704007    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-nq6dd" event={"ID":"0f827638-33ac-4f99-920b-6e9b72db7955","Type":"ContainerStarted","Data":"70e0ceee949e00f078fbb5706e2c9ceb5e1c711423fba4d4253b5870c75b584a"}
Mar 20 15:53:45 crc kubenswrapper[4730]: I0320 15:53:45.705362    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nnrp6" event={"ID":"663e9228-322c-4d6a-8988-0033d5dd587a","Type":"ContainerStarted","Data":"d78d19844368244adb5bc78f42a72d85d41fc682757e6334e2b57ecc31edda9b"}
Mar 20 15:53:45 crc kubenswrapper[4730]: I0320 15:53:45.706746    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-854c989bdc-94fm2" event={"ID":"1f423bde-b761-4c2a-8519-805e6e44d099","Type":"ContainerStarted","Data":"e41e26443fc6de31cd89d338531be24c3f63d731dacf5ce0ae960ef3f12588c3"}
Mar 20 15:53:45 crc kubenswrapper[4730]: I0320 15:53:45.728315    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-854c989bdc-94fm2" podStartSLOduration=1.728295148 podStartE2EDuration="1.728295148s" podCreationTimestamp="2026-03-20 15:53:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:53:45.726241639 +0000 UTC m=+884.939613018" watchObservedRunningTime="2026-03-20 15:53:45.728295148 +0000 UTC m=+884.941666537"
Mar 20 15:53:46 crc kubenswrapper[4730]: I0320 15:53:46.724656    4730 generic.go:334] "Generic (PLEG): container finished" podID="f51e780f-650b-45d8-a2c3-b6b73ce74c61" containerID="dab2914bb52dccdd630edde42af76c8a7122f9d5ae1d800cd9af1744c1e01631" exitCode=0
Mar 20 15:53:46 crc kubenswrapper[4730]: I0320 15:53:46.724733    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4mms" event={"ID":"f51e780f-650b-45d8-a2c3-b6b73ce74c61","Type":"ContainerDied","Data":"dab2914bb52dccdd630edde42af76c8a7122f9d5ae1d800cd9af1744c1e01631"}
Mar 20 15:53:46 crc kubenswrapper[4730]: I0320 15:53:46.868217    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p4mms"
Mar 20 15:53:47 crc kubenswrapper[4730]: I0320 15:53:47.014594    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55wwc\" (UniqueName: \"kubernetes.io/projected/f51e780f-650b-45d8-a2c3-b6b73ce74c61-kube-api-access-55wwc\") pod \"f51e780f-650b-45d8-a2c3-b6b73ce74c61\" (UID: \"f51e780f-650b-45d8-a2c3-b6b73ce74c61\") "
Mar 20 15:53:47 crc kubenswrapper[4730]: I0320 15:53:47.015811    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f51e780f-650b-45d8-a2c3-b6b73ce74c61-utilities\") pod \"f51e780f-650b-45d8-a2c3-b6b73ce74c61\" (UID: \"f51e780f-650b-45d8-a2c3-b6b73ce74c61\") "
Mar 20 15:53:47 crc kubenswrapper[4730]: I0320 15:53:47.015884    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f51e780f-650b-45d8-a2c3-b6b73ce74c61-catalog-content\") pod \"f51e780f-650b-45d8-a2c3-b6b73ce74c61\" (UID: \"f51e780f-650b-45d8-a2c3-b6b73ce74c61\") "
Mar 20 15:53:47 crc kubenswrapper[4730]: I0320 15:53:47.016792    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f51e780f-650b-45d8-a2c3-b6b73ce74c61-utilities" (OuterVolumeSpecName: "utilities") pod "f51e780f-650b-45d8-a2c3-b6b73ce74c61" (UID: "f51e780f-650b-45d8-a2c3-b6b73ce74c61"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:53:47 crc kubenswrapper[4730]: I0320 15:53:47.020878    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f51e780f-650b-45d8-a2c3-b6b73ce74c61-kube-api-access-55wwc" (OuterVolumeSpecName: "kube-api-access-55wwc") pod "f51e780f-650b-45d8-a2c3-b6b73ce74c61" (UID: "f51e780f-650b-45d8-a2c3-b6b73ce74c61"). InnerVolumeSpecName "kube-api-access-55wwc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:53:47 crc kubenswrapper[4730]: I0320 15:53:47.118047    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55wwc\" (UniqueName: \"kubernetes.io/projected/f51e780f-650b-45d8-a2c3-b6b73ce74c61-kube-api-access-55wwc\") on node \"crc\" DevicePath \"\""
Mar 20 15:53:47 crc kubenswrapper[4730]: I0320 15:53:47.118087    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f51e780f-650b-45d8-a2c3-b6b73ce74c61-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 15:53:47 crc kubenswrapper[4730]: I0320 15:53:47.151583    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f51e780f-650b-45d8-a2c3-b6b73ce74c61-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f51e780f-650b-45d8-a2c3-b6b73ce74c61" (UID: "f51e780f-650b-45d8-a2c3-b6b73ce74c61"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:53:47 crc kubenswrapper[4730]: I0320 15:53:47.219083    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f51e780f-650b-45d8-a2c3-b6b73ce74c61-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 15:53:47 crc kubenswrapper[4730]: I0320 15:53:47.732536    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4mms" event={"ID":"f51e780f-650b-45d8-a2c3-b6b73ce74c61","Type":"ContainerDied","Data":"c07e01328d7c5f71cfb88d67fc57544826757077ff694bb1f7eb716cf5cf5bd7"}
Mar 20 15:53:47 crc kubenswrapper[4730]: I0320 15:53:47.732581    4730 scope.go:117] "RemoveContainer" containerID="dab2914bb52dccdd630edde42af76c8a7122f9d5ae1d800cd9af1744c1e01631"
Mar 20 15:53:47 crc kubenswrapper[4730]: I0320 15:53:47.732691    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p4mms"
Mar 20 15:53:47 crc kubenswrapper[4730]: I0320 15:53:47.751900    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p4mms"]
Mar 20 15:53:47 crc kubenswrapper[4730]: I0320 15:53:47.756265    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p4mms"]
Mar 20 15:53:47 crc kubenswrapper[4730]: I0320 15:53:47.935577    4730 scope.go:117] "RemoveContainer" containerID="f6fd4d176812b359067e665f23250f950ab5e29084994d678328733032dac222"
Mar 20 15:53:47 crc kubenswrapper[4730]: I0320 15:53:47.957589    4730 scope.go:117] "RemoveContainer" containerID="a9b15d45bb34224d6766d66172ba74c36da9471e707ae2a90cea12d0f5f9aca2"
Mar 20 15:53:48 crc kubenswrapper[4730]: I0320 15:53:48.745193    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-nfr9k" event={"ID":"3f50a695-6f8b-42e6-aa4f-3dfd888b6afa","Type":"ContainerStarted","Data":"d3f2447926f0361cdc1c87f47a16412046749eea2217a9b2ad72c3059016c665"}
Mar 20 15:53:48 crc kubenswrapper[4730]: I0320 15:53:48.747602    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-nq6dd"
Mar 20 15:53:48 crc kubenswrapper[4730]: I0320 15:53:48.755201    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-6tdt2"
Mar 20 15:53:48 crc kubenswrapper[4730]: I0320 15:53:48.759666    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nnrp6" event={"ID":"663e9228-322c-4d6a-8988-0033d5dd587a","Type":"ContainerStarted","Data":"69338df9a9bd7b8eac24e28e0b4c0e0630a7148968a80bdb15f0fa2a86e816d0"}
Mar 20 15:53:48 crc kubenswrapper[4730]: I0320 15:53:48.774942    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-nq6dd" podStartSLOduration=2.63502137 podStartE2EDuration="5.774918157s" podCreationTimestamp="2026-03-20 15:53:43 +0000 UTC" firstStartedPulling="2026-03-20 15:53:44.837167147 +0000 UTC m=+884.050538506" lastFinishedPulling="2026-03-20 15:53:47.977063874 +0000 UTC m=+887.190435293" observedRunningTime="2026-03-20 15:53:48.771960252 +0000 UTC m=+887.985331701" watchObservedRunningTime="2026-03-20 15:53:48.774918157 +0000 UTC m=+887.988289536"
Mar 20 15:53:48 crc kubenswrapper[4730]: I0320 15:53:48.838268    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-nnrp6" podStartSLOduration=3.038412779 podStartE2EDuration="5.838230424s" podCreationTimestamp="2026-03-20 15:53:43 +0000 UTC" firstStartedPulling="2026-03-20 15:53:45.178016511 +0000 UTC m=+884.391387880" lastFinishedPulling="2026-03-20 15:53:47.977834116 +0000 UTC m=+887.191205525" observedRunningTime="2026-03-20 15:53:48.832769448 +0000 UTC m=+888.046140817" watchObservedRunningTime="2026-03-20 15:53:48.838230424 +0000 UTC m=+888.051601803"
Mar 20 15:53:48 crc kubenswrapper[4730]: I0320 15:53:48.861862    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-6tdt2" podStartSLOduration=2.494707832 podStartE2EDuration="5.861838232s" podCreationTimestamp="2026-03-20 15:53:43 +0000 UTC" firstStartedPulling="2026-03-20 15:53:44.608995057 +0000 UTC m=+883.822366416" lastFinishedPulling="2026-03-20 15:53:47.976125447 +0000 UTC m=+887.189496816" observedRunningTime="2026-03-20 15:53:48.855723236 +0000 UTC m=+888.069094605" watchObservedRunningTime="2026-03-20 15:53:48.861838232 +0000 UTC m=+888.075209611"
Mar 20 15:53:49 crc kubenswrapper[4730]: I0320 15:53:49.543802    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f51e780f-650b-45d8-a2c3-b6b73ce74c61" path="/var/lib/kubelet/pods/f51e780f-650b-45d8-a2c3-b6b73ce74c61/volumes"
Mar 20 15:53:49 crc kubenswrapper[4730]: I0320 15:53:49.766261    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-6tdt2" event={"ID":"0a4f6fcf-7c76-49cf-8f3c-d83879a650f1","Type":"ContainerStarted","Data":"057549402a23539c57ae68646300d5508432df8b711504be91aba508301a510a"}
Mar 20 15:53:49 crc kubenswrapper[4730]: I0320 15:53:49.769631    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-nq6dd" event={"ID":"0f827638-33ac-4f99-920b-6e9b72db7955","Type":"ContainerStarted","Data":"36319714d9d076b45a4becfc78e0b05938bf5b292eb742313e25593d679d31f0"}
Mar 20 15:53:51 crc kubenswrapper[4730]: I0320 15:53:51.781998    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-nfr9k" event={"ID":"3f50a695-6f8b-42e6-aa4f-3dfd888b6afa","Type":"ContainerStarted","Data":"859e2cbbf09914effc9ea4141cfae3cc5109235e9a1358e6294af703226839ed"}
Mar 20 15:53:51 crc kubenswrapper[4730]: I0320 15:53:51.806535    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-nfr9k" podStartSLOduration=2.369955501 podStartE2EDuration="8.806518104s" podCreationTimestamp="2026-03-20 15:53:43 +0000 UTC" firstStartedPulling="2026-03-20 15:53:44.774774076 +0000 UTC m=+883.988145445" lastFinishedPulling="2026-03-20 15:53:51.211336679 +0000 UTC m=+890.424708048" observedRunningTime="2026-03-20 15:53:51.804928499 +0000 UTC m=+891.018299918" watchObservedRunningTime="2026-03-20 15:53:51.806518104 +0000 UTC m=+891.019889473"
Mar 20 15:53:54 crc kubenswrapper[4730]: I0320 15:53:54.348217    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-854c989bdc-94fm2"
Mar 20 15:53:54 crc kubenswrapper[4730]: I0320 15:53:54.348348    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-854c989bdc-94fm2"
Mar 20 15:53:54 crc kubenswrapper[4730]: I0320 15:53:54.356038    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-854c989bdc-94fm2"
Mar 20 15:53:54 crc kubenswrapper[4730]: I0320 15:53:54.585322    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-6tdt2"
Mar 20 15:53:54 crc kubenswrapper[4730]: I0320 15:53:54.815963    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-854c989bdc-94fm2"
Mar 20 15:53:54 crc kubenswrapper[4730]: I0320 15:53:54.885617    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-9kgl8"]
Mar 20 15:54:00 crc kubenswrapper[4730]: I0320 15:54:00.146597    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567034-sdvfb"]
Mar 20 15:54:00 crc kubenswrapper[4730]: E0320 15:54:00.147523    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f51e780f-650b-45d8-a2c3-b6b73ce74c61" containerName="extract-utilities"
Mar 20 15:54:00 crc kubenswrapper[4730]: I0320 15:54:00.147545    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="f51e780f-650b-45d8-a2c3-b6b73ce74c61" containerName="extract-utilities"
Mar 20 15:54:00 crc kubenswrapper[4730]: E0320 15:54:00.147563    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f51e780f-650b-45d8-a2c3-b6b73ce74c61" containerName="registry-server"
Mar 20 15:54:00 crc kubenswrapper[4730]: I0320 15:54:00.147573    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="f51e780f-650b-45d8-a2c3-b6b73ce74c61" containerName="registry-server"
Mar 20 15:54:00 crc kubenswrapper[4730]: E0320 15:54:00.147596    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f51e780f-650b-45d8-a2c3-b6b73ce74c61" containerName="extract-content"
Mar 20 15:54:00 crc kubenswrapper[4730]: I0320 15:54:00.147607    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="f51e780f-650b-45d8-a2c3-b6b73ce74c61" containerName="extract-content"
Mar 20 15:54:00 crc kubenswrapper[4730]: I0320 15:54:00.147787    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="f51e780f-650b-45d8-a2c3-b6b73ce74c61" containerName="registry-server"
Mar 20 15:54:00 crc kubenswrapper[4730]: I0320 15:54:00.148446    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567034-sdvfb"
Mar 20 15:54:00 crc kubenswrapper[4730]: I0320 15:54:00.150698    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 15:54:00 crc kubenswrapper[4730]: I0320 15:54:00.150834    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 15:54:00 crc kubenswrapper[4730]: I0320 15:54:00.151069    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 15:54:00 crc kubenswrapper[4730]: I0320 15:54:00.152173    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567034-sdvfb"]
Mar 20 15:54:00 crc kubenswrapper[4730]: I0320 15:54:00.317620    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j6dw\" (UniqueName: \"kubernetes.io/projected/c84a0097-0ea0-4397-b72b-07e391268b84-kube-api-access-5j6dw\") pod \"auto-csr-approver-29567034-sdvfb\" (UID: \"c84a0097-0ea0-4397-b72b-07e391268b84\") " pod="openshift-infra/auto-csr-approver-29567034-sdvfb"
Mar 20 15:54:00 crc kubenswrapper[4730]: I0320 15:54:00.419098    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j6dw\" (UniqueName: \"kubernetes.io/projected/c84a0097-0ea0-4397-b72b-07e391268b84-kube-api-access-5j6dw\") pod \"auto-csr-approver-29567034-sdvfb\" (UID: \"c84a0097-0ea0-4397-b72b-07e391268b84\") " pod="openshift-infra/auto-csr-approver-29567034-sdvfb"
Mar 20 15:54:00 crc kubenswrapper[4730]: I0320 15:54:00.448656    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j6dw\" (UniqueName: \"kubernetes.io/projected/c84a0097-0ea0-4397-b72b-07e391268b84-kube-api-access-5j6dw\") pod \"auto-csr-approver-29567034-sdvfb\" (UID: \"c84a0097-0ea0-4397-b72b-07e391268b84\") " pod="openshift-infra/auto-csr-approver-29567034-sdvfb"
Mar 20 15:54:00 crc kubenswrapper[4730]: I0320 15:54:00.477277    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567034-sdvfb"
Mar 20 15:54:00 crc kubenswrapper[4730]: I0320 15:54:00.716340    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567034-sdvfb"]
Mar 20 15:54:00 crc kubenswrapper[4730]: W0320 15:54:00.721197    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc84a0097_0ea0_4397_b72b_07e391268b84.slice/crio-8f459ea850de069c76b8e2871471c56ffe66a9caee006badc42c96a8294bc6de WatchSource:0}: Error finding container 8f459ea850de069c76b8e2871471c56ffe66a9caee006badc42c96a8294bc6de: Status 404 returned error can't find the container with id 8f459ea850de069c76b8e2871471c56ffe66a9caee006badc42c96a8294bc6de
Mar 20 15:54:00 crc kubenswrapper[4730]: I0320 15:54:00.857035    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567034-sdvfb" event={"ID":"c84a0097-0ea0-4397-b72b-07e391268b84","Type":"ContainerStarted","Data":"8f459ea850de069c76b8e2871471c56ffe66a9caee006badc42c96a8294bc6de"}
Mar 20 15:54:01 crc kubenswrapper[4730]: I0320 15:54:01.864179    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567034-sdvfb" event={"ID":"c84a0097-0ea0-4397-b72b-07e391268b84","Type":"ContainerStarted","Data":"cd338cd8acc0dfd58bb17dd35d4fa074369101fa940bc1e78ceafdde3c9aa8ec"}
Mar 20 15:54:01 crc kubenswrapper[4730]: I0320 15:54:01.878297    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567034-sdvfb" podStartSLOduration=1.029161386 podStartE2EDuration="1.878278472s" podCreationTimestamp="2026-03-20 15:54:00 +0000 UTC" firstStartedPulling="2026-03-20 15:54:00.723690357 +0000 UTC m=+899.937061716" lastFinishedPulling="2026-03-20 15:54:01.572807423 +0000 UTC m=+900.786178802" observedRunningTime="2026-03-20 15:54:01.876419798 +0000 UTC m=+901.089791207" watchObservedRunningTime="2026-03-20 15:54:01.878278472 +0000 UTC m=+901.091649841"
Mar 20 15:54:02 crc kubenswrapper[4730]: I0320 15:54:02.876279    4730 generic.go:334] "Generic (PLEG): container finished" podID="c84a0097-0ea0-4397-b72b-07e391268b84" containerID="cd338cd8acc0dfd58bb17dd35d4fa074369101fa940bc1e78ceafdde3c9aa8ec" exitCode=0
Mar 20 15:54:02 crc kubenswrapper[4730]: I0320 15:54:02.876413    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567034-sdvfb" event={"ID":"c84a0097-0ea0-4397-b72b-07e391268b84","Type":"ContainerDied","Data":"cd338cd8acc0dfd58bb17dd35d4fa074369101fa940bc1e78ceafdde3c9aa8ec"}
Mar 20 15:54:04 crc kubenswrapper[4730]: I0320 15:54:04.134069    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567034-sdvfb"
Mar 20 15:54:04 crc kubenswrapper[4730]: I0320 15:54:04.267111    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j6dw\" (UniqueName: \"kubernetes.io/projected/c84a0097-0ea0-4397-b72b-07e391268b84-kube-api-access-5j6dw\") pod \"c84a0097-0ea0-4397-b72b-07e391268b84\" (UID: \"c84a0097-0ea0-4397-b72b-07e391268b84\") "
Mar 20 15:54:04 crc kubenswrapper[4730]: I0320 15:54:04.273492    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c84a0097-0ea0-4397-b72b-07e391268b84-kube-api-access-5j6dw" (OuterVolumeSpecName: "kube-api-access-5j6dw") pod "c84a0097-0ea0-4397-b72b-07e391268b84" (UID: "c84a0097-0ea0-4397-b72b-07e391268b84"). InnerVolumeSpecName "kube-api-access-5j6dw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:54:04 crc kubenswrapper[4730]: I0320 15:54:04.369270    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j6dw\" (UniqueName: \"kubernetes.io/projected/c84a0097-0ea0-4397-b72b-07e391268b84-kube-api-access-5j6dw\") on node \"crc\" DevicePath \"\""
Mar 20 15:54:04 crc kubenswrapper[4730]: I0320 15:54:04.559506    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-nq6dd"
Mar 20 15:54:04 crc kubenswrapper[4730]: I0320 15:54:04.616978    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567028-s6xcp"]
Mar 20 15:54:04 crc kubenswrapper[4730]: I0320 15:54:04.623215    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567028-s6xcp"]
Mar 20 15:54:04 crc kubenswrapper[4730]: I0320 15:54:04.889125    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567034-sdvfb" event={"ID":"c84a0097-0ea0-4397-b72b-07e391268b84","Type":"ContainerDied","Data":"8f459ea850de069c76b8e2871471c56ffe66a9caee006badc42c96a8294bc6de"}
Mar 20 15:54:04 crc kubenswrapper[4730]: I0320 15:54:04.889169    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f459ea850de069c76b8e2871471c56ffe66a9caee006badc42c96a8294bc6de"
Mar 20 15:54:04 crc kubenswrapper[4730]: I0320 15:54:04.889304    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567034-sdvfb"
Mar 20 15:54:05 crc kubenswrapper[4730]: I0320 15:54:05.543957    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e56ca246-99ac-4397-a499-62738ac94a39" path="/var/lib/kubelet/pods/e56ca246-99ac-4397-a499-62738ac94a39/volumes"
Mar 20 15:54:12 crc kubenswrapper[4730]: I0320 15:54:12.880338    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 15:54:12 crc kubenswrapper[4730]: I0320 15:54:12.880993    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 15:54:16 crc kubenswrapper[4730]: I0320 15:54:16.796832    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck"]
Mar 20 15:54:16 crc kubenswrapper[4730]: E0320 15:54:16.797560    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c84a0097-0ea0-4397-b72b-07e391268b84" containerName="oc"
Mar 20 15:54:16 crc kubenswrapper[4730]: I0320 15:54:16.797572    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="c84a0097-0ea0-4397-b72b-07e391268b84" containerName="oc"
Mar 20 15:54:16 crc kubenswrapper[4730]: I0320 15:54:16.797685    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="c84a0097-0ea0-4397-b72b-07e391268b84" containerName="oc"
Mar 20 15:54:16 crc kubenswrapper[4730]: I0320 15:54:16.798462    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck"
Mar 20 15:54:16 crc kubenswrapper[4730]: I0320 15:54:16.800650    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 20 15:54:16 crc kubenswrapper[4730]: I0320 15:54:16.850161    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck"]
Mar 20 15:54:16 crc kubenswrapper[4730]: I0320 15:54:16.953990    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aac6cea5-e666-44b1-9507-f57de2361c40-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck\" (UID: \"aac6cea5-e666-44b1-9507-f57de2361c40\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck"
Mar 20 15:54:16 crc kubenswrapper[4730]: I0320 15:54:16.954070    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aac6cea5-e666-44b1-9507-f57de2361c40-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck\" (UID: \"aac6cea5-e666-44b1-9507-f57de2361c40\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck"
Mar 20 15:54:16 crc kubenswrapper[4730]: I0320 15:54:16.954100    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ntq6\" (UniqueName: \"kubernetes.io/projected/aac6cea5-e666-44b1-9507-f57de2361c40-kube-api-access-8ntq6\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck\" (UID: \"aac6cea5-e666-44b1-9507-f57de2361c40\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck"
Mar 20 15:54:17 crc kubenswrapper[4730]: I0320 15:54:17.055849    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aac6cea5-e666-44b1-9507-f57de2361c40-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck\" (UID: \"aac6cea5-e666-44b1-9507-f57de2361c40\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck"
Mar 20 15:54:17 crc kubenswrapper[4730]: I0320 15:54:17.056180    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ntq6\" (UniqueName: \"kubernetes.io/projected/aac6cea5-e666-44b1-9507-f57de2361c40-kube-api-access-8ntq6\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck\" (UID: \"aac6cea5-e666-44b1-9507-f57de2361c40\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck"
Mar 20 15:54:17 crc kubenswrapper[4730]: I0320 15:54:17.056353    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aac6cea5-e666-44b1-9507-f57de2361c40-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck\" (UID: \"aac6cea5-e666-44b1-9507-f57de2361c40\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck"
Mar 20 15:54:17 crc kubenswrapper[4730]: I0320 15:54:17.056443    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aac6cea5-e666-44b1-9507-f57de2361c40-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck\" (UID: \"aac6cea5-e666-44b1-9507-f57de2361c40\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck"
Mar 20 15:54:17 crc kubenswrapper[4730]: I0320 15:54:17.056777    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aac6cea5-e666-44b1-9507-f57de2361c40-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck\" (UID: \"aac6cea5-e666-44b1-9507-f57de2361c40\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck"
Mar 20 15:54:17 crc kubenswrapper[4730]: I0320 15:54:17.081276    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ntq6\" (UniqueName: \"kubernetes.io/projected/aac6cea5-e666-44b1-9507-f57de2361c40-kube-api-access-8ntq6\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck\" (UID: \"aac6cea5-e666-44b1-9507-f57de2361c40\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck"
Mar 20 15:54:17 crc kubenswrapper[4730]: I0320 15:54:17.115129    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck"
Mar 20 15:54:17 crc kubenswrapper[4730]: I0320 15:54:17.347571    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck"]
Mar 20 15:54:17 crc kubenswrapper[4730]: I0320 15:54:17.987164    4730 generic.go:334] "Generic (PLEG): container finished" podID="aac6cea5-e666-44b1-9507-f57de2361c40" containerID="14b29fd92f84091c54c3a87b6489617a775cf21b2dce3a2e2f91de16fefc572f" exitCode=0
Mar 20 15:54:17 crc kubenswrapper[4730]: I0320 15:54:17.987203    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck" event={"ID":"aac6cea5-e666-44b1-9507-f57de2361c40","Type":"ContainerDied","Data":"14b29fd92f84091c54c3a87b6489617a775cf21b2dce3a2e2f91de16fefc572f"}
Mar 20 15:54:17 crc kubenswrapper[4730]: I0320 15:54:17.987227    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck" event={"ID":"aac6cea5-e666-44b1-9507-f57de2361c40","Type":"ContainerStarted","Data":"02bc4b30e192f9e7bce0c5703d33941a824f5102e9a1fc90651580ea747816ca"}
Mar 20 15:54:19 crc kubenswrapper[4730]: I0320 15:54:19.942696    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-9kgl8" podUID="5edbd5a9-6c8b-4ef8-950f-58deaecf36ee" containerName="console" containerID="cri-o://dd7ad8497736491ff86002b539ccd73a88dfd723919819049a39b445ea55904f" gracePeriod=15
Mar 20 15:54:20 crc kubenswrapper[4730]: I0320 15:54:19.999953    4730 generic.go:334] "Generic (PLEG): container finished" podID="aac6cea5-e666-44b1-9507-f57de2361c40" containerID="d09021f97f624fe5bcbe4d6319ffb155927de148a56609e6902f09cbfa54760c" exitCode=0
Mar 20 15:54:20 crc kubenswrapper[4730]: I0320 15:54:20.000055    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck" event={"ID":"aac6cea5-e666-44b1-9507-f57de2361c40","Type":"ContainerDied","Data":"d09021f97f624fe5bcbe4d6319ffb155927de148a56609e6902f09cbfa54760c"}
Mar 20 15:54:20 crc kubenswrapper[4730]: I0320 15:54:20.362795    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-9kgl8_5edbd5a9-6c8b-4ef8-950f-58deaecf36ee/console/0.log"
Mar 20 15:54:20 crc kubenswrapper[4730]: I0320 15:54:20.363081    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-9kgl8"
Mar 20 15:54:20 crc kubenswrapper[4730]: I0320 15:54:20.500080    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-console-oauth-config\") pod \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\" (UID: \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\") "
Mar 20 15:54:20 crc kubenswrapper[4730]: I0320 15:54:20.500138    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-trusted-ca-bundle\") pod \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\" (UID: \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\") "
Mar 20 15:54:20 crc kubenswrapper[4730]: I0320 15:54:20.500155    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-console-config\") pod \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\" (UID: \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\") "
Mar 20 15:54:20 crc kubenswrapper[4730]: I0320 15:54:20.500190    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qvx5\" (UniqueName: \"kubernetes.io/projected/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-kube-api-access-4qvx5\") pod \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\" (UID: \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\") "
Mar 20 15:54:20 crc kubenswrapper[4730]: I0320 15:54:20.500237    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-console-serving-cert\") pod \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\" (UID: \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\") "
Mar 20 15:54:20 crc kubenswrapper[4730]: I0320 15:54:20.500294    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-oauth-serving-cert\") pod \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\" (UID: \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\") "
Mar 20 15:54:20 crc kubenswrapper[4730]: I0320 15:54:20.500331    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-service-ca\") pod \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\" (UID: \"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee\") "
Mar 20 15:54:20 crc kubenswrapper[4730]: I0320 15:54:20.501042    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-console-config" (OuterVolumeSpecName: "console-config") pod "5edbd5a9-6c8b-4ef8-950f-58deaecf36ee" (UID: "5edbd5a9-6c8b-4ef8-950f-58deaecf36ee"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:54:20 crc kubenswrapper[4730]: I0320 15:54:20.501098    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-service-ca" (OuterVolumeSpecName: "service-ca") pod "5edbd5a9-6c8b-4ef8-950f-58deaecf36ee" (UID: "5edbd5a9-6c8b-4ef8-950f-58deaecf36ee"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:54:20 crc kubenswrapper[4730]: I0320 15:54:20.501219    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5edbd5a9-6c8b-4ef8-950f-58deaecf36ee" (UID: "5edbd5a9-6c8b-4ef8-950f-58deaecf36ee"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:54:20 crc kubenswrapper[4730]: I0320 15:54:20.501283    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5edbd5a9-6c8b-4ef8-950f-58deaecf36ee" (UID: "5edbd5a9-6c8b-4ef8-950f-58deaecf36ee"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:54:20 crc kubenswrapper[4730]: I0320 15:54:20.505433    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5edbd5a9-6c8b-4ef8-950f-58deaecf36ee" (UID: "5edbd5a9-6c8b-4ef8-950f-58deaecf36ee"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:54:20 crc kubenswrapper[4730]: I0320 15:54:20.505679    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5edbd5a9-6c8b-4ef8-950f-58deaecf36ee" (UID: "5edbd5a9-6c8b-4ef8-950f-58deaecf36ee"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:54:20 crc kubenswrapper[4730]: I0320 15:54:20.508597    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-kube-api-access-4qvx5" (OuterVolumeSpecName: "kube-api-access-4qvx5") pod "5edbd5a9-6c8b-4ef8-950f-58deaecf36ee" (UID: "5edbd5a9-6c8b-4ef8-950f-58deaecf36ee"). InnerVolumeSpecName "kube-api-access-4qvx5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:54:20 crc kubenswrapper[4730]: I0320 15:54:20.601554    4730 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:54:20 crc kubenswrapper[4730]: I0320 15:54:20.601583    4730 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:54:20 crc kubenswrapper[4730]: I0320 15:54:20.601592    4730 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-service-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:54:20 crc kubenswrapper[4730]: I0320 15:54:20.601602    4730 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:54:20 crc kubenswrapper[4730]: I0320 15:54:20.601609    4730 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-console-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:54:20 crc kubenswrapper[4730]: I0320 15:54:20.601617    4730 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:54:20 crc kubenswrapper[4730]: I0320 15:54:20.601625    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qvx5\" (UniqueName: \"kubernetes.io/projected/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee-kube-api-access-4qvx5\") on node \"crc\" DevicePath \"\""
Mar 20 15:54:21 crc kubenswrapper[4730]: I0320 15:54:21.006510    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-9kgl8_5edbd5a9-6c8b-4ef8-950f-58deaecf36ee/console/0.log"
Mar 20 15:54:21 crc kubenswrapper[4730]: I0320 15:54:21.007286    4730 generic.go:334] "Generic (PLEG): container finished" podID="5edbd5a9-6c8b-4ef8-950f-58deaecf36ee" containerID="dd7ad8497736491ff86002b539ccd73a88dfd723919819049a39b445ea55904f" exitCode=2
Mar 20 15:54:21 crc kubenswrapper[4730]: I0320 15:54:21.007361    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9kgl8" event={"ID":"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee","Type":"ContainerDied","Data":"dd7ad8497736491ff86002b539ccd73a88dfd723919819049a39b445ea55904f"}
Mar 20 15:54:21 crc kubenswrapper[4730]: I0320 15:54:21.007518    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9kgl8" event={"ID":"5edbd5a9-6c8b-4ef8-950f-58deaecf36ee","Type":"ContainerDied","Data":"e011dbdf40941c9f2e1edba06bd23dad1736901c7815ace4b7b103d548c5c8d5"}
Mar 20 15:54:21 crc kubenswrapper[4730]: I0320 15:54:21.007551    4730 scope.go:117] "RemoveContainer" containerID="dd7ad8497736491ff86002b539ccd73a88dfd723919819049a39b445ea55904f"
Mar 20 15:54:21 crc kubenswrapper[4730]: I0320 15:54:21.007373    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-9kgl8"
Mar 20 15:54:21 crc kubenswrapper[4730]: I0320 15:54:21.011041    4730 generic.go:334] "Generic (PLEG): container finished" podID="aac6cea5-e666-44b1-9507-f57de2361c40" containerID="7774bdd106f067fcc2f20edd2778652f2a8c0bb9018791d243c17e09c7b40daa" exitCode=0
Mar 20 15:54:21 crc kubenswrapper[4730]: I0320 15:54:21.011090    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck" event={"ID":"aac6cea5-e666-44b1-9507-f57de2361c40","Type":"ContainerDied","Data":"7774bdd106f067fcc2f20edd2778652f2a8c0bb9018791d243c17e09c7b40daa"}
Mar 20 15:54:21 crc kubenswrapper[4730]: I0320 15:54:21.034337    4730 scope.go:117] "RemoveContainer" containerID="dd7ad8497736491ff86002b539ccd73a88dfd723919819049a39b445ea55904f"
Mar 20 15:54:21 crc kubenswrapper[4730]: E0320 15:54:21.034718    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd7ad8497736491ff86002b539ccd73a88dfd723919819049a39b445ea55904f\": container with ID starting with dd7ad8497736491ff86002b539ccd73a88dfd723919819049a39b445ea55904f not found: ID does not exist" containerID="dd7ad8497736491ff86002b539ccd73a88dfd723919819049a39b445ea55904f"
Mar 20 15:54:21 crc kubenswrapper[4730]: I0320 15:54:21.034750    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd7ad8497736491ff86002b539ccd73a88dfd723919819049a39b445ea55904f"} err="failed to get container status \"dd7ad8497736491ff86002b539ccd73a88dfd723919819049a39b445ea55904f\": rpc error: code = NotFound desc = could not find container \"dd7ad8497736491ff86002b539ccd73a88dfd723919819049a39b445ea55904f\": container with ID starting with dd7ad8497736491ff86002b539ccd73a88dfd723919819049a39b445ea55904f not found: ID does not exist"
Mar 20 15:54:21 crc kubenswrapper[4730]: I0320 15:54:21.050377    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-9kgl8"]
Mar 20 15:54:21 crc kubenswrapper[4730]: I0320 15:54:21.055446    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-9kgl8"]
Mar 20 15:54:21 crc kubenswrapper[4730]: I0320 15:54:21.542220    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5edbd5a9-6c8b-4ef8-950f-58deaecf36ee" path="/var/lib/kubelet/pods/5edbd5a9-6c8b-4ef8-950f-58deaecf36ee/volumes"
Mar 20 15:54:22 crc kubenswrapper[4730]: I0320 15:54:22.340631    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck"
Mar 20 15:54:22 crc kubenswrapper[4730]: I0320 15:54:22.523018    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aac6cea5-e666-44b1-9507-f57de2361c40-util\") pod \"aac6cea5-e666-44b1-9507-f57de2361c40\" (UID: \"aac6cea5-e666-44b1-9507-f57de2361c40\") "
Mar 20 15:54:22 crc kubenswrapper[4730]: I0320 15:54:22.523063    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aac6cea5-e666-44b1-9507-f57de2361c40-bundle\") pod \"aac6cea5-e666-44b1-9507-f57de2361c40\" (UID: \"aac6cea5-e666-44b1-9507-f57de2361c40\") "
Mar 20 15:54:22 crc kubenswrapper[4730]: I0320 15:54:22.523138    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ntq6\" (UniqueName: \"kubernetes.io/projected/aac6cea5-e666-44b1-9507-f57de2361c40-kube-api-access-8ntq6\") pod \"aac6cea5-e666-44b1-9507-f57de2361c40\" (UID: \"aac6cea5-e666-44b1-9507-f57de2361c40\") "
Mar 20 15:54:22 crc kubenswrapper[4730]: I0320 15:54:22.524072    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aac6cea5-e666-44b1-9507-f57de2361c40-bundle" (OuterVolumeSpecName: "bundle") pod "aac6cea5-e666-44b1-9507-f57de2361c40" (UID: "aac6cea5-e666-44b1-9507-f57de2361c40"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:54:22 crc kubenswrapper[4730]: I0320 15:54:22.527772    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aac6cea5-e666-44b1-9507-f57de2361c40-kube-api-access-8ntq6" (OuterVolumeSpecName: "kube-api-access-8ntq6") pod "aac6cea5-e666-44b1-9507-f57de2361c40" (UID: "aac6cea5-e666-44b1-9507-f57de2361c40"). InnerVolumeSpecName "kube-api-access-8ntq6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:54:22 crc kubenswrapper[4730]: I0320 15:54:22.558124    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aac6cea5-e666-44b1-9507-f57de2361c40-util" (OuterVolumeSpecName: "util") pod "aac6cea5-e666-44b1-9507-f57de2361c40" (UID: "aac6cea5-e666-44b1-9507-f57de2361c40"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:54:22 crc kubenswrapper[4730]: I0320 15:54:22.624986    4730 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aac6cea5-e666-44b1-9507-f57de2361c40-util\") on node \"crc\" DevicePath \"\""
Mar 20 15:54:22 crc kubenswrapper[4730]: I0320 15:54:22.625021    4730 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aac6cea5-e666-44b1-9507-f57de2361c40-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:54:22 crc kubenswrapper[4730]: I0320 15:54:22.625030    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ntq6\" (UniqueName: \"kubernetes.io/projected/aac6cea5-e666-44b1-9507-f57de2361c40-kube-api-access-8ntq6\") on node \"crc\" DevicePath \"\""
Mar 20 15:54:23 crc kubenswrapper[4730]: I0320 15:54:23.025289    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck" event={"ID":"aac6cea5-e666-44b1-9507-f57de2361c40","Type":"ContainerDied","Data":"02bc4b30e192f9e7bce0c5703d33941a824f5102e9a1fc90651580ea747816ca"}
Mar 20 15:54:23 crc kubenswrapper[4730]: I0320 15:54:23.025559    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02bc4b30e192f9e7bce0c5703d33941a824f5102e9a1fc90651580ea747816ca"
Mar 20 15:54:23 crc kubenswrapper[4730]: I0320 15:54:23.025327    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck"
Mar 20 15:54:28 crc kubenswrapper[4730]: I0320 15:54:28.416515    4730 scope.go:117] "RemoveContainer" containerID="44d76ae85c164cacb0e0982473fd32dc59c0d37d2af3868ef4b22b1a51c8b024"
Mar 20 15:54:31 crc kubenswrapper[4730]: I0320 15:54:31.958609    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-85db46595-g556k"]
Mar 20 15:54:31 crc kubenswrapper[4730]: E0320 15:54:31.959465    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5edbd5a9-6c8b-4ef8-950f-58deaecf36ee" containerName="console"
Mar 20 15:54:31 crc kubenswrapper[4730]: I0320 15:54:31.959482    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="5edbd5a9-6c8b-4ef8-950f-58deaecf36ee" containerName="console"
Mar 20 15:54:31 crc kubenswrapper[4730]: E0320 15:54:31.959492    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aac6cea5-e666-44b1-9507-f57de2361c40" containerName="pull"
Mar 20 15:54:31 crc kubenswrapper[4730]: I0320 15:54:31.959500    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="aac6cea5-e666-44b1-9507-f57de2361c40" containerName="pull"
Mar 20 15:54:31 crc kubenswrapper[4730]: E0320 15:54:31.959518    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aac6cea5-e666-44b1-9507-f57de2361c40" containerName="extract"
Mar 20 15:54:31 crc kubenswrapper[4730]: I0320 15:54:31.959526    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="aac6cea5-e666-44b1-9507-f57de2361c40" containerName="extract"
Mar 20 15:54:31 crc kubenswrapper[4730]: E0320 15:54:31.959545    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aac6cea5-e666-44b1-9507-f57de2361c40" containerName="util"
Mar 20 15:54:31 crc kubenswrapper[4730]: I0320 15:54:31.959553    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="aac6cea5-e666-44b1-9507-f57de2361c40" containerName="util"
Mar 20 15:54:31 crc kubenswrapper[4730]: I0320 15:54:31.959678    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="aac6cea5-e666-44b1-9507-f57de2361c40" containerName="extract"
Mar 20 15:54:31 crc kubenswrapper[4730]: I0320 15:54:31.959696    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="5edbd5a9-6c8b-4ef8-950f-58deaecf36ee" containerName="console"
Mar 20 15:54:31 crc kubenswrapper[4730]: I0320 15:54:31.960202    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-85db46595-g556k"
Mar 20 15:54:31 crc kubenswrapper[4730]: I0320 15:54:31.962637    4730 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Mar 20 15:54:31 crc kubenswrapper[4730]: I0320 15:54:31.962847    4730 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Mar 20 15:54:31 crc kubenswrapper[4730]: I0320 15:54:31.964018    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Mar 20 15:54:31 crc kubenswrapper[4730]: I0320 15:54:31.964090    4730 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-x4dwh"
Mar 20 15:54:31 crc kubenswrapper[4730]: I0320 15:54:31.965773    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Mar 20 15:54:31 crc kubenswrapper[4730]: I0320 15:54:31.980798    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-85db46595-g556k"]
Mar 20 15:54:32 crc kubenswrapper[4730]: I0320 15:54:32.039310    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b41d974a-1e37-48ae-afdc-48c682c73637-apiservice-cert\") pod \"metallb-operator-controller-manager-85db46595-g556k\" (UID: \"b41d974a-1e37-48ae-afdc-48c682c73637\") " pod="metallb-system/metallb-operator-controller-manager-85db46595-g556k"
Mar 20 15:54:32 crc kubenswrapper[4730]: I0320 15:54:32.039616    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phhpv\" (UniqueName: \"kubernetes.io/projected/b41d974a-1e37-48ae-afdc-48c682c73637-kube-api-access-phhpv\") pod \"metallb-operator-controller-manager-85db46595-g556k\" (UID: \"b41d974a-1e37-48ae-afdc-48c682c73637\") " pod="metallb-system/metallb-operator-controller-manager-85db46595-g556k"
Mar 20 15:54:32 crc kubenswrapper[4730]: I0320 15:54:32.039719    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b41d974a-1e37-48ae-afdc-48c682c73637-webhook-cert\") pod \"metallb-operator-controller-manager-85db46595-g556k\" (UID: \"b41d974a-1e37-48ae-afdc-48c682c73637\") " pod="metallb-system/metallb-operator-controller-manager-85db46595-g556k"
Mar 20 15:54:32 crc kubenswrapper[4730]: I0320 15:54:32.141482    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phhpv\" (UniqueName: \"kubernetes.io/projected/b41d974a-1e37-48ae-afdc-48c682c73637-kube-api-access-phhpv\") pod \"metallb-operator-controller-manager-85db46595-g556k\" (UID: \"b41d974a-1e37-48ae-afdc-48c682c73637\") " pod="metallb-system/metallb-operator-controller-manager-85db46595-g556k"
Mar 20 15:54:32 crc kubenswrapper[4730]: I0320 15:54:32.142289    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b41d974a-1e37-48ae-afdc-48c682c73637-webhook-cert\") pod \"metallb-operator-controller-manager-85db46595-g556k\" (UID: \"b41d974a-1e37-48ae-afdc-48c682c73637\") " pod="metallb-system/metallb-operator-controller-manager-85db46595-g556k"
Mar 20 15:54:32 crc kubenswrapper[4730]: I0320 15:54:32.142564    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b41d974a-1e37-48ae-afdc-48c682c73637-apiservice-cert\") pod \"metallb-operator-controller-manager-85db46595-g556k\" (UID: \"b41d974a-1e37-48ae-afdc-48c682c73637\") " pod="metallb-system/metallb-operator-controller-manager-85db46595-g556k"
Mar 20 15:54:32 crc kubenswrapper[4730]: I0320 15:54:32.153464    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b41d974a-1e37-48ae-afdc-48c682c73637-webhook-cert\") pod \"metallb-operator-controller-manager-85db46595-g556k\" (UID: \"b41d974a-1e37-48ae-afdc-48c682c73637\") " pod="metallb-system/metallb-operator-controller-manager-85db46595-g556k"
Mar 20 15:54:32 crc kubenswrapper[4730]: I0320 15:54:32.153470    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b41d974a-1e37-48ae-afdc-48c682c73637-apiservice-cert\") pod \"metallb-operator-controller-manager-85db46595-g556k\" (UID: \"b41d974a-1e37-48ae-afdc-48c682c73637\") " pod="metallb-system/metallb-operator-controller-manager-85db46595-g556k"
Mar 20 15:54:32 crc kubenswrapper[4730]: I0320 15:54:32.171153    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phhpv\" (UniqueName: \"kubernetes.io/projected/b41d974a-1e37-48ae-afdc-48c682c73637-kube-api-access-phhpv\") pod \"metallb-operator-controller-manager-85db46595-g556k\" (UID: \"b41d974a-1e37-48ae-afdc-48c682c73637\") " pod="metallb-system/metallb-operator-controller-manager-85db46595-g556k"
Mar 20 15:54:32 crc kubenswrapper[4730]: I0320 15:54:32.198481    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5f9794bdc6-ccfwn"]
Mar 20 15:54:32 crc kubenswrapper[4730]: I0320 15:54:32.199401    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5f9794bdc6-ccfwn"
Mar 20 15:54:32 crc kubenswrapper[4730]: I0320 15:54:32.201742    4730 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Mar 20 15:54:32 crc kubenswrapper[4730]: I0320 15:54:32.202055    4730 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Mar 20 15:54:32 crc kubenswrapper[4730]: I0320 15:54:32.202176    4730 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-s2pjt"
Mar 20 15:54:32 crc kubenswrapper[4730]: I0320 15:54:32.217924    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5f9794bdc6-ccfwn"]
Mar 20 15:54:32 crc kubenswrapper[4730]: I0320 15:54:32.279591    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-85db46595-g556k"
Mar 20 15:54:32 crc kubenswrapper[4730]: I0320 15:54:32.344462    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9dfba7eb-850f-4e34-a875-8ef219c8c783-webhook-cert\") pod \"metallb-operator-webhook-server-5f9794bdc6-ccfwn\" (UID: \"9dfba7eb-850f-4e34-a875-8ef219c8c783\") " pod="metallb-system/metallb-operator-webhook-server-5f9794bdc6-ccfwn"
Mar 20 15:54:32 crc kubenswrapper[4730]: I0320 15:54:32.344827    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9dfba7eb-850f-4e34-a875-8ef219c8c783-apiservice-cert\") pod \"metallb-operator-webhook-server-5f9794bdc6-ccfwn\" (UID: \"9dfba7eb-850f-4e34-a875-8ef219c8c783\") " pod="metallb-system/metallb-operator-webhook-server-5f9794bdc6-ccfwn"
Mar 20 15:54:32 crc kubenswrapper[4730]: I0320 15:54:32.344864    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9ftl\" (UniqueName: \"kubernetes.io/projected/9dfba7eb-850f-4e34-a875-8ef219c8c783-kube-api-access-f9ftl\") pod \"metallb-operator-webhook-server-5f9794bdc6-ccfwn\" (UID: \"9dfba7eb-850f-4e34-a875-8ef219c8c783\") " pod="metallb-system/metallb-operator-webhook-server-5f9794bdc6-ccfwn"
Mar 20 15:54:32 crc kubenswrapper[4730]: I0320 15:54:32.451889    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9ftl\" (UniqueName: \"kubernetes.io/projected/9dfba7eb-850f-4e34-a875-8ef219c8c783-kube-api-access-f9ftl\") pod \"metallb-operator-webhook-server-5f9794bdc6-ccfwn\" (UID: \"9dfba7eb-850f-4e34-a875-8ef219c8c783\") " pod="metallb-system/metallb-operator-webhook-server-5f9794bdc6-ccfwn"
Mar 20 15:54:32 crc kubenswrapper[4730]: I0320 15:54:32.451991    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9dfba7eb-850f-4e34-a875-8ef219c8c783-webhook-cert\") pod \"metallb-operator-webhook-server-5f9794bdc6-ccfwn\" (UID: \"9dfba7eb-850f-4e34-a875-8ef219c8c783\") " pod="metallb-system/metallb-operator-webhook-server-5f9794bdc6-ccfwn"
Mar 20 15:54:32 crc kubenswrapper[4730]: I0320 15:54:32.452013    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9dfba7eb-850f-4e34-a875-8ef219c8c783-apiservice-cert\") pod \"metallb-operator-webhook-server-5f9794bdc6-ccfwn\" (UID: \"9dfba7eb-850f-4e34-a875-8ef219c8c783\") " pod="metallb-system/metallb-operator-webhook-server-5f9794bdc6-ccfwn"
Mar 20 15:54:32 crc kubenswrapper[4730]: I0320 15:54:32.464061    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9dfba7eb-850f-4e34-a875-8ef219c8c783-apiservice-cert\") pod \"metallb-operator-webhook-server-5f9794bdc6-ccfwn\" (UID: \"9dfba7eb-850f-4e34-a875-8ef219c8c783\") " pod="metallb-system/metallb-operator-webhook-server-5f9794bdc6-ccfwn"
Mar 20 15:54:32 crc kubenswrapper[4730]: I0320 15:54:32.464206    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9dfba7eb-850f-4e34-a875-8ef219c8c783-webhook-cert\") pod \"metallb-operator-webhook-server-5f9794bdc6-ccfwn\" (UID: \"9dfba7eb-850f-4e34-a875-8ef219c8c783\") " pod="metallb-system/metallb-operator-webhook-server-5f9794bdc6-ccfwn"
Mar 20 15:54:32 crc kubenswrapper[4730]: I0320 15:54:32.472943    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9ftl\" (UniqueName: \"kubernetes.io/projected/9dfba7eb-850f-4e34-a875-8ef219c8c783-kube-api-access-f9ftl\") pod \"metallb-operator-webhook-server-5f9794bdc6-ccfwn\" (UID: \"9dfba7eb-850f-4e34-a875-8ef219c8c783\") " pod="metallb-system/metallb-operator-webhook-server-5f9794bdc6-ccfwn"
Mar 20 15:54:32 crc kubenswrapper[4730]: I0320 15:54:32.514621    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5f9794bdc6-ccfwn"
Mar 20 15:54:32 crc kubenswrapper[4730]: I0320 15:54:32.561536    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-85db46595-g556k"]
Mar 20 15:54:32 crc kubenswrapper[4730]: W0320 15:54:32.571180    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb41d974a_1e37_48ae_afdc_48c682c73637.slice/crio-5dbb3b3778d921b9ff6cedf1a3638ca001a5a37b829e140f387e1673142465ed WatchSource:0}: Error finding container 5dbb3b3778d921b9ff6cedf1a3638ca001a5a37b829e140f387e1673142465ed: Status 404 returned error can't find the container with id 5dbb3b3778d921b9ff6cedf1a3638ca001a5a37b829e140f387e1673142465ed
Mar 20 15:54:32 crc kubenswrapper[4730]: I0320 15:54:32.913132    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5f9794bdc6-ccfwn"]
Mar 20 15:54:32 crc kubenswrapper[4730]: W0320 15:54:32.915974    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9dfba7eb_850f_4e34_a875_8ef219c8c783.slice/crio-8d8074f09deca60877d122fc28a31c0de4c928d762f1a39d0e3f320b72eb652d WatchSource:0}: Error finding container 8d8074f09deca60877d122fc28a31c0de4c928d762f1a39d0e3f320b72eb652d: Status 404 returned error can't find the container with id 8d8074f09deca60877d122fc28a31c0de4c928d762f1a39d0e3f320b72eb652d
Mar 20 15:54:33 crc kubenswrapper[4730]: I0320 15:54:33.084982    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5f9794bdc6-ccfwn" event={"ID":"9dfba7eb-850f-4e34-a875-8ef219c8c783","Type":"ContainerStarted","Data":"8d8074f09deca60877d122fc28a31c0de4c928d762f1a39d0e3f320b72eb652d"}
Mar 20 15:54:33 crc kubenswrapper[4730]: I0320 15:54:33.086231    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-85db46595-g556k" event={"ID":"b41d974a-1e37-48ae-afdc-48c682c73637","Type":"ContainerStarted","Data":"5dbb3b3778d921b9ff6cedf1a3638ca001a5a37b829e140f387e1673142465ed"}
Mar 20 15:54:36 crc kubenswrapper[4730]: I0320 15:54:36.106264    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-85db46595-g556k" event={"ID":"b41d974a-1e37-48ae-afdc-48c682c73637","Type":"ContainerStarted","Data":"38328a93498c64211d3b290ad967186496858da8ab053fbb924892938b402136"}
Mar 20 15:54:36 crc kubenswrapper[4730]: I0320 15:54:36.106904    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-85db46595-g556k"
Mar 20 15:54:36 crc kubenswrapper[4730]: I0320 15:54:36.129434    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-85db46595-g556k" podStartSLOduration=2.116318922 podStartE2EDuration="5.129420201s" podCreationTimestamp="2026-03-20 15:54:31 +0000 UTC" firstStartedPulling="2026-03-20 15:54:32.574573977 +0000 UTC m=+931.787945346" lastFinishedPulling="2026-03-20 15:54:35.587675256 +0000 UTC m=+934.801046625" observedRunningTime="2026-03-20 15:54:36.128396652 +0000 UTC m=+935.341768021" watchObservedRunningTime="2026-03-20 15:54:36.129420201 +0000 UTC m=+935.342791570"
Mar 20 15:54:38 crc kubenswrapper[4730]: I0320 15:54:38.122097    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5f9794bdc6-ccfwn" event={"ID":"9dfba7eb-850f-4e34-a875-8ef219c8c783","Type":"ContainerStarted","Data":"0f0ad8601f2bdadc96283021b8decd64790e350286ce991c8a4fb3b19634540d"}
Mar 20 15:54:38 crc kubenswrapper[4730]: I0320 15:54:38.130691    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5f9794bdc6-ccfwn"
Mar 20 15:54:38 crc kubenswrapper[4730]: I0320 15:54:38.152481    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5f9794bdc6-ccfwn" podStartSLOduration=1.584921015 podStartE2EDuration="6.152464155s" podCreationTimestamp="2026-03-20 15:54:32 +0000 UTC" firstStartedPulling="2026-03-20 15:54:32.918533083 +0000 UTC m=+932.131904452" lastFinishedPulling="2026-03-20 15:54:37.486076223 +0000 UTC m=+936.699447592" observedRunningTime="2026-03-20 15:54:38.146880672 +0000 UTC m=+937.360252051" watchObservedRunningTime="2026-03-20 15:54:38.152464155 +0000 UTC m=+937.365835524"
Mar 20 15:54:42 crc kubenswrapper[4730]: I0320 15:54:42.880116    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 15:54:42 crc kubenswrapper[4730]: I0320 15:54:42.880697    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 15:54:42 crc kubenswrapper[4730]: I0320 15:54:42.880740    4730 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf"
Mar 20 15:54:42 crc kubenswrapper[4730]: I0320 15:54:42.881368    4730 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4969adb306e949f48cbf48ac9e1452830c3458afd1750aa781060e2cc0952393"} pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 15:54:42 crc kubenswrapper[4730]: I0320 15:54:42.881430    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" containerID="cri-o://4969adb306e949f48cbf48ac9e1452830c3458afd1750aa781060e2cc0952393" gracePeriod=600
Mar 20 15:54:43 crc kubenswrapper[4730]: I0320 15:54:43.156055    4730 generic.go:334] "Generic (PLEG): container finished" podID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerID="4969adb306e949f48cbf48ac9e1452830c3458afd1750aa781060e2cc0952393" exitCode=0
Mar 20 15:54:43 crc kubenswrapper[4730]: I0320 15:54:43.156126    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerDied","Data":"4969adb306e949f48cbf48ac9e1452830c3458afd1750aa781060e2cc0952393"}
Mar 20 15:54:43 crc kubenswrapper[4730]: I0320 15:54:43.156416    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerStarted","Data":"5a28eadd1ac2eb334876364a020c16296d471cf45645c126a154825ac93c80d5"}
Mar 20 15:54:43 crc kubenswrapper[4730]: I0320 15:54:43.156437    4730 scope.go:117] "RemoveContainer" containerID="44f44ed17252feb14ca678b8fd7bddf96639b37f5ddb8303898a1167aa46bf9c"
Mar 20 15:54:52 crc kubenswrapper[4730]: I0320 15:54:52.520283    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5f9794bdc6-ccfwn"
Mar 20 15:55:12 crc kubenswrapper[4730]: I0320 15:55:12.283116    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-85db46595-g556k"
Mar 20 15:55:12 crc kubenswrapper[4730]: I0320 15:55:12.941397    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-pbr4w"]
Mar 20 15:55:12 crc kubenswrapper[4730]: I0320 15:55:12.944153    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-pbr4w"
Mar 20 15:55:12 crc kubenswrapper[4730]: I0320 15:55:12.947199    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Mar 20 15:55:12 crc kubenswrapper[4730]: I0320 15:55:12.948219    4730 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Mar 20 15:55:12 crc kubenswrapper[4730]: I0320 15:55:12.948235    4730 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-2jvnd"
Mar 20 15:55:12 crc kubenswrapper[4730]: I0320 15:55:12.948588    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-vmgrx"]
Mar 20 15:55:12 crc kubenswrapper[4730]: I0320 15:55:12.949450    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-vmgrx"
Mar 20 15:55:12 crc kubenswrapper[4730]: I0320 15:55:12.952515    4730 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Mar 20 15:55:12 crc kubenswrapper[4730]: I0320 15:55:12.959988    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f-frr-conf\") pod \"frr-k8s-pbr4w\" (UID: \"5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f\") " pod="metallb-system/frr-k8s-pbr4w"
Mar 20 15:55:12 crc kubenswrapper[4730]: I0320 15:55:12.960030    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f-frr-sockets\") pod \"frr-k8s-pbr4w\" (UID: \"5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f\") " pod="metallb-system/frr-k8s-pbr4w"
Mar 20 15:55:12 crc kubenswrapper[4730]: I0320 15:55:12.960075    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f-reloader\") pod \"frr-k8s-pbr4w\" (UID: \"5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f\") " pod="metallb-system/frr-k8s-pbr4w"
Mar 20 15:55:12 crc kubenswrapper[4730]: I0320 15:55:12.960099    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f-frr-startup\") pod \"frr-k8s-pbr4w\" (UID: \"5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f\") " pod="metallb-system/frr-k8s-pbr4w"
Mar 20 15:55:12 crc kubenswrapper[4730]: I0320 15:55:12.960118    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9pbt\" (UniqueName: \"kubernetes.io/projected/5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f-kube-api-access-s9pbt\") pod \"frr-k8s-pbr4w\" (UID: \"5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f\") " pod="metallb-system/frr-k8s-pbr4w"
Mar 20 15:55:12 crc kubenswrapper[4730]: I0320 15:55:12.960144    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70093cb9-bc43-427d-a8e4-5750058e2580-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-vmgrx\" (UID: \"70093cb9-bc43-427d-a8e4-5750058e2580\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-vmgrx"
Mar 20 15:55:12 crc kubenswrapper[4730]: I0320 15:55:12.960175    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f-metrics-certs\") pod \"frr-k8s-pbr4w\" (UID: \"5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f\") " pod="metallb-system/frr-k8s-pbr4w"
Mar 20 15:55:12 crc kubenswrapper[4730]: I0320 15:55:12.960196    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnzdp\" (UniqueName: \"kubernetes.io/projected/70093cb9-bc43-427d-a8e4-5750058e2580-kube-api-access-tnzdp\") pod \"frr-k8s-webhook-server-bcc4b6f68-vmgrx\" (UID: \"70093cb9-bc43-427d-a8e4-5750058e2580\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-vmgrx"
Mar 20 15:55:12 crc kubenswrapper[4730]: I0320 15:55:12.960214    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f-metrics\") pod \"frr-k8s-pbr4w\" (UID: \"5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f\") " pod="metallb-system/frr-k8s-pbr4w"
Mar 20 15:55:12 crc kubenswrapper[4730]: I0320 15:55:12.965601    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-vmgrx"]
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.061089    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f-frr-startup\") pod \"frr-k8s-pbr4w\" (UID: \"5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f\") " pod="metallb-system/frr-k8s-pbr4w"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.061472    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9pbt\" (UniqueName: \"kubernetes.io/projected/5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f-kube-api-access-s9pbt\") pod \"frr-k8s-pbr4w\" (UID: \"5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f\") " pod="metallb-system/frr-k8s-pbr4w"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.061513    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70093cb9-bc43-427d-a8e4-5750058e2580-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-vmgrx\" (UID: \"70093cb9-bc43-427d-a8e4-5750058e2580\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-vmgrx"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.061535    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f-metrics-certs\") pod \"frr-k8s-pbr4w\" (UID: \"5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f\") " pod="metallb-system/frr-k8s-pbr4w"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.061555    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnzdp\" (UniqueName: \"kubernetes.io/projected/70093cb9-bc43-427d-a8e4-5750058e2580-kube-api-access-tnzdp\") pod \"frr-k8s-webhook-server-bcc4b6f68-vmgrx\" (UID: \"70093cb9-bc43-427d-a8e4-5750058e2580\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-vmgrx"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.061582    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f-metrics\") pod \"frr-k8s-pbr4w\" (UID: \"5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f\") " pod="metallb-system/frr-k8s-pbr4w"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.061605    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f-frr-conf\") pod \"frr-k8s-pbr4w\" (UID: \"5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f\") " pod="metallb-system/frr-k8s-pbr4w"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.062042    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f-metrics\") pod \"frr-k8s-pbr4w\" (UID: \"5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f\") " pod="metallb-system/frr-k8s-pbr4w"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.062082    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f-frr-startup\") pod \"frr-k8s-pbr4w\" (UID: \"5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f\") " pod="metallb-system/frr-k8s-pbr4w"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.062185    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f-frr-sockets\") pod \"frr-k8s-pbr4w\" (UID: \"5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f\") " pod="metallb-system/frr-k8s-pbr4w"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.062283    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f-reloader\") pod \"frr-k8s-pbr4w\" (UID: \"5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f\") " pod="metallb-system/frr-k8s-pbr4w"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.062420    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f-frr-sockets\") pod \"frr-k8s-pbr4w\" (UID: \"5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f\") " pod="metallb-system/frr-k8s-pbr4w"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.062185    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f-frr-conf\") pod \"frr-k8s-pbr4w\" (UID: \"5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f\") " pod="metallb-system/frr-k8s-pbr4w"
Mar 20 15:55:13 crc kubenswrapper[4730]: E0320 15:55:13.062502    4730 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found
Mar 20 15:55:13 crc kubenswrapper[4730]: E0320 15:55:13.062547    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70093cb9-bc43-427d-a8e4-5750058e2580-cert podName:70093cb9-bc43-427d-a8e4-5750058e2580 nodeName:}" failed. No retries permitted until 2026-03-20 15:55:13.562530743 +0000 UTC m=+972.775902112 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/70093cb9-bc43-427d-a8e4-5750058e2580-cert") pod "frr-k8s-webhook-server-bcc4b6f68-vmgrx" (UID: "70093cb9-bc43-427d-a8e4-5750058e2580") : secret "frr-k8s-webhook-server-cert" not found
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.062572    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f-reloader\") pod \"frr-k8s-pbr4w\" (UID: \"5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f\") " pod="metallb-system/frr-k8s-pbr4w"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.071270    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f-metrics-certs\") pod \"frr-k8s-pbr4w\" (UID: \"5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f\") " pod="metallb-system/frr-k8s-pbr4w"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.086217    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnzdp\" (UniqueName: \"kubernetes.io/projected/70093cb9-bc43-427d-a8e4-5750058e2580-kube-api-access-tnzdp\") pod \"frr-k8s-webhook-server-bcc4b6f68-vmgrx\" (UID: \"70093cb9-bc43-427d-a8e4-5750058e2580\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-vmgrx"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.086308    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9pbt\" (UniqueName: \"kubernetes.io/projected/5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f-kube-api-access-s9pbt\") pod \"frr-k8s-pbr4w\" (UID: \"5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f\") " pod="metallb-system/frr-k8s-pbr4w"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.100219    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-jdxzq"]
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.101416    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-jdxzq"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.105570    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-tbvnw"]
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.109530    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-tbvnw"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.110340    4730 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.113016    4730 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.113047    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.113186    4730 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-zn4jj"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.113529    4730 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.128798    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-jdxzq"]
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.163137    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/02f5e1af-23a0-43ef-89ad-9c5af9e98cfd-metallb-excludel2\") pod \"speaker-tbvnw\" (UID: \"02f5e1af-23a0-43ef-89ad-9c5af9e98cfd\") " pod="metallb-system/speaker-tbvnw"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.163196    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfkvr\" (UniqueName: \"kubernetes.io/projected/02f5e1af-23a0-43ef-89ad-9c5af9e98cfd-kube-api-access-kfkvr\") pod \"speaker-tbvnw\" (UID: \"02f5e1af-23a0-43ef-89ad-9c5af9e98cfd\") " pod="metallb-system/speaker-tbvnw"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.163226    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a42a5cd0-d730-4d48-8082-2491494e90ff-cert\") pod \"controller-7bb4cc7c98-jdxzq\" (UID: \"a42a5cd0-d730-4d48-8082-2491494e90ff\") " pod="metallb-system/controller-7bb4cc7c98-jdxzq"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.163340    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02f5e1af-23a0-43ef-89ad-9c5af9e98cfd-metrics-certs\") pod \"speaker-tbvnw\" (UID: \"02f5e1af-23a0-43ef-89ad-9c5af9e98cfd\") " pod="metallb-system/speaker-tbvnw"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.163416    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a42a5cd0-d730-4d48-8082-2491494e90ff-metrics-certs\") pod \"controller-7bb4cc7c98-jdxzq\" (UID: \"a42a5cd0-d730-4d48-8082-2491494e90ff\") " pod="metallb-system/controller-7bb4cc7c98-jdxzq"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.163448    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/02f5e1af-23a0-43ef-89ad-9c5af9e98cfd-memberlist\") pod \"speaker-tbvnw\" (UID: \"02f5e1af-23a0-43ef-89ad-9c5af9e98cfd\") " pod="metallb-system/speaker-tbvnw"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.163473    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7tdw\" (UniqueName: \"kubernetes.io/projected/a42a5cd0-d730-4d48-8082-2491494e90ff-kube-api-access-n7tdw\") pod \"controller-7bb4cc7c98-jdxzq\" (UID: \"a42a5cd0-d730-4d48-8082-2491494e90ff\") " pod="metallb-system/controller-7bb4cc7c98-jdxzq"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.264522    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/02f5e1af-23a0-43ef-89ad-9c5af9e98cfd-metallb-excludel2\") pod \"speaker-tbvnw\" (UID: \"02f5e1af-23a0-43ef-89ad-9c5af9e98cfd\") " pod="metallb-system/speaker-tbvnw"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.264598    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfkvr\" (UniqueName: \"kubernetes.io/projected/02f5e1af-23a0-43ef-89ad-9c5af9e98cfd-kube-api-access-kfkvr\") pod \"speaker-tbvnw\" (UID: \"02f5e1af-23a0-43ef-89ad-9c5af9e98cfd\") " pod="metallb-system/speaker-tbvnw"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.264623    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a42a5cd0-d730-4d48-8082-2491494e90ff-cert\") pod \"controller-7bb4cc7c98-jdxzq\" (UID: \"a42a5cd0-d730-4d48-8082-2491494e90ff\") " pod="metallb-system/controller-7bb4cc7c98-jdxzq"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.264702    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02f5e1af-23a0-43ef-89ad-9c5af9e98cfd-metrics-certs\") pod \"speaker-tbvnw\" (UID: \"02f5e1af-23a0-43ef-89ad-9c5af9e98cfd\") " pod="metallb-system/speaker-tbvnw"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.264757    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a42a5cd0-d730-4d48-8082-2491494e90ff-metrics-certs\") pod \"controller-7bb4cc7c98-jdxzq\" (UID: \"a42a5cd0-d730-4d48-8082-2491494e90ff\") " pod="metallb-system/controller-7bb4cc7c98-jdxzq"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.264783    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/02f5e1af-23a0-43ef-89ad-9c5af9e98cfd-memberlist\") pod \"speaker-tbvnw\" (UID: \"02f5e1af-23a0-43ef-89ad-9c5af9e98cfd\") " pod="metallb-system/speaker-tbvnw"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.264806    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7tdw\" (UniqueName: \"kubernetes.io/projected/a42a5cd0-d730-4d48-8082-2491494e90ff-kube-api-access-n7tdw\") pod \"controller-7bb4cc7c98-jdxzq\" (UID: \"a42a5cd0-d730-4d48-8082-2491494e90ff\") " pod="metallb-system/controller-7bb4cc7c98-jdxzq"
Mar 20 15:55:13 crc kubenswrapper[4730]: E0320 15:55:13.265239    4730 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 20 15:55:13 crc kubenswrapper[4730]: E0320 15:55:13.265286    4730 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found
Mar 20 15:55:13 crc kubenswrapper[4730]: E0320 15:55:13.265320    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02f5e1af-23a0-43ef-89ad-9c5af9e98cfd-memberlist podName:02f5e1af-23a0-43ef-89ad-9c5af9e98cfd nodeName:}" failed. No retries permitted until 2026-03-20 15:55:13.765301039 +0000 UTC m=+972.978672408 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/02f5e1af-23a0-43ef-89ad-9c5af9e98cfd-memberlist") pod "speaker-tbvnw" (UID: "02f5e1af-23a0-43ef-89ad-9c5af9e98cfd") : secret "metallb-memberlist" not found
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.265335    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/02f5e1af-23a0-43ef-89ad-9c5af9e98cfd-metallb-excludel2\") pod \"speaker-tbvnw\" (UID: \"02f5e1af-23a0-43ef-89ad-9c5af9e98cfd\") " pod="metallb-system/speaker-tbvnw"
Mar 20 15:55:13 crc kubenswrapper[4730]: E0320 15:55:13.265364    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a42a5cd0-d730-4d48-8082-2491494e90ff-metrics-certs podName:a42a5cd0-d730-4d48-8082-2491494e90ff nodeName:}" failed. No retries permitted until 2026-03-20 15:55:13.76534581 +0000 UTC m=+972.978717179 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a42a5cd0-d730-4d48-8082-2491494e90ff-metrics-certs") pod "controller-7bb4cc7c98-jdxzq" (UID: "a42a5cd0-d730-4d48-8082-2491494e90ff") : secret "controller-certs-secret" not found
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.266822    4730 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.271680    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02f5e1af-23a0-43ef-89ad-9c5af9e98cfd-metrics-certs\") pod \"speaker-tbvnw\" (UID: \"02f5e1af-23a0-43ef-89ad-9c5af9e98cfd\") " pod="metallb-system/speaker-tbvnw"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.278568    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-pbr4w"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.279171    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a42a5cd0-d730-4d48-8082-2491494e90ff-cert\") pod \"controller-7bb4cc7c98-jdxzq\" (UID: \"a42a5cd0-d730-4d48-8082-2491494e90ff\") " pod="metallb-system/controller-7bb4cc7c98-jdxzq"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.285486    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfkvr\" (UniqueName: \"kubernetes.io/projected/02f5e1af-23a0-43ef-89ad-9c5af9e98cfd-kube-api-access-kfkvr\") pod \"speaker-tbvnw\" (UID: \"02f5e1af-23a0-43ef-89ad-9c5af9e98cfd\") " pod="metallb-system/speaker-tbvnw"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.287068    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7tdw\" (UniqueName: \"kubernetes.io/projected/a42a5cd0-d730-4d48-8082-2491494e90ff-kube-api-access-n7tdw\") pod \"controller-7bb4cc7c98-jdxzq\" (UID: \"a42a5cd0-d730-4d48-8082-2491494e90ff\") " pod="metallb-system/controller-7bb4cc7c98-jdxzq"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.568369    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70093cb9-bc43-427d-a8e4-5750058e2580-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-vmgrx\" (UID: \"70093cb9-bc43-427d-a8e4-5750058e2580\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-vmgrx"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.571446    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70093cb9-bc43-427d-a8e4-5750058e2580-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-vmgrx\" (UID: \"70093cb9-bc43-427d-a8e4-5750058e2580\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-vmgrx"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.586468    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-vmgrx"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.771600    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a42a5cd0-d730-4d48-8082-2491494e90ff-metrics-certs\") pod \"controller-7bb4cc7c98-jdxzq\" (UID: \"a42a5cd0-d730-4d48-8082-2491494e90ff\") " pod="metallb-system/controller-7bb4cc7c98-jdxzq"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.771975    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/02f5e1af-23a0-43ef-89ad-9c5af9e98cfd-memberlist\") pod \"speaker-tbvnw\" (UID: \"02f5e1af-23a0-43ef-89ad-9c5af9e98cfd\") " pod="metallb-system/speaker-tbvnw"
Mar 20 15:55:13 crc kubenswrapper[4730]: E0320 15:55:13.772131    4730 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 20 15:55:13 crc kubenswrapper[4730]: E0320 15:55:13.772222    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02f5e1af-23a0-43ef-89ad-9c5af9e98cfd-memberlist podName:02f5e1af-23a0-43ef-89ad-9c5af9e98cfd nodeName:}" failed. No retries permitted until 2026-03-20 15:55:14.772204418 +0000 UTC m=+973.985575787 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/02f5e1af-23a0-43ef-89ad-9c5af9e98cfd-memberlist") pod "speaker-tbvnw" (UID: "02f5e1af-23a0-43ef-89ad-9c5af9e98cfd") : secret "metallb-memberlist" not found
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.777072    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a42a5cd0-d730-4d48-8082-2491494e90ff-metrics-certs\") pod \"controller-7bb4cc7c98-jdxzq\" (UID: \"a42a5cd0-d730-4d48-8082-2491494e90ff\") " pod="metallb-system/controller-7bb4cc7c98-jdxzq"
Mar 20 15:55:13 crc kubenswrapper[4730]: I0320 15:55:13.980104    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-vmgrx"]
Mar 20 15:55:13 crc kubenswrapper[4730]: W0320 15:55:13.989770    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70093cb9_bc43_427d_a8e4_5750058e2580.slice/crio-61cac70bdf88f8c040c18bd00bb7e5a81b732ea8e9e56b85389a3d342bb7c6a7 WatchSource:0}: Error finding container 61cac70bdf88f8c040c18bd00bb7e5a81b732ea8e9e56b85389a3d342bb7c6a7: Status 404 returned error can't find the container with id 61cac70bdf88f8c040c18bd00bb7e5a81b732ea8e9e56b85389a3d342bb7c6a7
Mar 20 15:55:14 crc kubenswrapper[4730]: I0320 15:55:14.040579    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-jdxzq"
Mar 20 15:55:14 crc kubenswrapper[4730]: I0320 15:55:14.330131    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pbr4w" event={"ID":"5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f","Type":"ContainerStarted","Data":"a41420ebc0affd4b2a65376f49566e8ae45051aeba6c79c3c846b89081ef4a08"}
Mar 20 15:55:14 crc kubenswrapper[4730]: I0320 15:55:14.331409    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-vmgrx" event={"ID":"70093cb9-bc43-427d-a8e4-5750058e2580","Type":"ContainerStarted","Data":"61cac70bdf88f8c040c18bd00bb7e5a81b732ea8e9e56b85389a3d342bb7c6a7"}
Mar 20 15:55:14 crc kubenswrapper[4730]: I0320 15:55:14.433703    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-jdxzq"]
Mar 20 15:55:14 crc kubenswrapper[4730]: I0320 15:55:14.785416    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/02f5e1af-23a0-43ef-89ad-9c5af9e98cfd-memberlist\") pod \"speaker-tbvnw\" (UID: \"02f5e1af-23a0-43ef-89ad-9c5af9e98cfd\") " pod="metallb-system/speaker-tbvnw"
Mar 20 15:55:14 crc kubenswrapper[4730]: I0320 15:55:14.792834    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/02f5e1af-23a0-43ef-89ad-9c5af9e98cfd-memberlist\") pod \"speaker-tbvnw\" (UID: \"02f5e1af-23a0-43ef-89ad-9c5af9e98cfd\") " pod="metallb-system/speaker-tbvnw"
Mar 20 15:55:14 crc kubenswrapper[4730]: I0320 15:55:14.947844    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-tbvnw"
Mar 20 15:55:14 crc kubenswrapper[4730]: W0320 15:55:14.969877    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02f5e1af_23a0_43ef_89ad_9c5af9e98cfd.slice/crio-917db0179f9c74d8205f5c4e198e877300d10d61df025b7083c47e3b07ef1ddc WatchSource:0}: Error finding container 917db0179f9c74d8205f5c4e198e877300d10d61df025b7083c47e3b07ef1ddc: Status 404 returned error can't find the container with id 917db0179f9c74d8205f5c4e198e877300d10d61df025b7083c47e3b07ef1ddc
Mar 20 15:55:15 crc kubenswrapper[4730]: I0320 15:55:15.356200    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tbvnw" event={"ID":"02f5e1af-23a0-43ef-89ad-9c5af9e98cfd","Type":"ContainerStarted","Data":"c0c9f70afccc4a389fadabdf265f8983e4aa5217c947cf3302a7e5cf0eda081c"}
Mar 20 15:55:15 crc kubenswrapper[4730]: I0320 15:55:15.356287    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tbvnw" event={"ID":"02f5e1af-23a0-43ef-89ad-9c5af9e98cfd","Type":"ContainerStarted","Data":"917db0179f9c74d8205f5c4e198e877300d10d61df025b7083c47e3b07ef1ddc"}
Mar 20 15:55:15 crc kubenswrapper[4730]: I0320 15:55:15.358869    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-jdxzq" event={"ID":"a42a5cd0-d730-4d48-8082-2491494e90ff","Type":"ContainerStarted","Data":"28336ed68bedb4159591928ffdfad2081cddd80b2953c0212abad857c301462d"}
Mar 20 15:55:15 crc kubenswrapper[4730]: I0320 15:55:15.358901    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-jdxzq" event={"ID":"a42a5cd0-d730-4d48-8082-2491494e90ff","Type":"ContainerStarted","Data":"c88e8ce4fed6c1da66c130a28f982f0606a71a0fd295591ea15f375c11ac5ad2"}
Mar 20 15:55:15 crc kubenswrapper[4730]: I0320 15:55:15.358912    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-jdxzq" event={"ID":"a42a5cd0-d730-4d48-8082-2491494e90ff","Type":"ContainerStarted","Data":"a7884ca535db7dc481e5d6e2b8df9e19ac5dcfce14254770c591f3691aaeca72"}
Mar 20 15:55:15 crc kubenswrapper[4730]: I0320 15:55:15.359933    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-jdxzq"
Mar 20 15:55:15 crc kubenswrapper[4730]: I0320 15:55:15.394423    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-jdxzq" podStartSLOduration=2.394404411 podStartE2EDuration="2.394404411s" podCreationTimestamp="2026-03-20 15:55:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:55:15.393489285 +0000 UTC m=+974.606860694" watchObservedRunningTime="2026-03-20 15:55:15.394404411 +0000 UTC m=+974.607775780"
Mar 20 15:55:16 crc kubenswrapper[4730]: I0320 15:55:16.372062    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tbvnw" event={"ID":"02f5e1af-23a0-43ef-89ad-9c5af9e98cfd","Type":"ContainerStarted","Data":"8e6e0c714389e8537c520f55a5ad4a49fa5ada4cfbea8c26e74e7866e3191408"}
Mar 20 15:55:16 crc kubenswrapper[4730]: I0320 15:55:16.394171    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-tbvnw" podStartSLOduration=3.394152773 podStartE2EDuration="3.394152773s" podCreationTimestamp="2026-03-20 15:55:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:55:16.391013063 +0000 UTC m=+975.604384432" watchObservedRunningTime="2026-03-20 15:55:16.394152773 +0000 UTC m=+975.607524142"
Mar 20 15:55:17 crc kubenswrapper[4730]: I0320 15:55:17.377783    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-tbvnw"
Mar 20 15:55:17 crc kubenswrapper[4730]: I0320 15:55:17.685206    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dczcg"]
Mar 20 15:55:17 crc kubenswrapper[4730]: I0320 15:55:17.689863    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dczcg"
Mar 20 15:55:17 crc kubenswrapper[4730]: I0320 15:55:17.697627    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dczcg"]
Mar 20 15:55:17 crc kubenswrapper[4730]: I0320 15:55:17.755419    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzkfz\" (UniqueName: \"kubernetes.io/projected/146c64ad-b085-471a-a540-7faa5c6e969f-kube-api-access-zzkfz\") pod \"redhat-marketplace-dczcg\" (UID: \"146c64ad-b085-471a-a540-7faa5c6e969f\") " pod="openshift-marketplace/redhat-marketplace-dczcg"
Mar 20 15:55:17 crc kubenswrapper[4730]: I0320 15:55:17.755466    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/146c64ad-b085-471a-a540-7faa5c6e969f-catalog-content\") pod \"redhat-marketplace-dczcg\" (UID: \"146c64ad-b085-471a-a540-7faa5c6e969f\") " pod="openshift-marketplace/redhat-marketplace-dczcg"
Mar 20 15:55:17 crc kubenswrapper[4730]: I0320 15:55:17.755532    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/146c64ad-b085-471a-a540-7faa5c6e969f-utilities\") pod \"redhat-marketplace-dczcg\" (UID: \"146c64ad-b085-471a-a540-7faa5c6e969f\") " pod="openshift-marketplace/redhat-marketplace-dczcg"
Mar 20 15:55:17 crc kubenswrapper[4730]: I0320 15:55:17.857266    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/146c64ad-b085-471a-a540-7faa5c6e969f-catalog-content\") pod \"redhat-marketplace-dczcg\" (UID: \"146c64ad-b085-471a-a540-7faa5c6e969f\") " pod="openshift-marketplace/redhat-marketplace-dczcg"
Mar 20 15:55:17 crc kubenswrapper[4730]: I0320 15:55:17.857656    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/146c64ad-b085-471a-a540-7faa5c6e969f-utilities\") pod \"redhat-marketplace-dczcg\" (UID: \"146c64ad-b085-471a-a540-7faa5c6e969f\") " pod="openshift-marketplace/redhat-marketplace-dczcg"
Mar 20 15:55:17 crc kubenswrapper[4730]: I0320 15:55:17.857713    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzkfz\" (UniqueName: \"kubernetes.io/projected/146c64ad-b085-471a-a540-7faa5c6e969f-kube-api-access-zzkfz\") pod \"redhat-marketplace-dczcg\" (UID: \"146c64ad-b085-471a-a540-7faa5c6e969f\") " pod="openshift-marketplace/redhat-marketplace-dczcg"
Mar 20 15:55:17 crc kubenswrapper[4730]: I0320 15:55:17.858153    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/146c64ad-b085-471a-a540-7faa5c6e969f-catalog-content\") pod \"redhat-marketplace-dczcg\" (UID: \"146c64ad-b085-471a-a540-7faa5c6e969f\") " pod="openshift-marketplace/redhat-marketplace-dczcg"
Mar 20 15:55:17 crc kubenswrapper[4730]: I0320 15:55:17.858667    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/146c64ad-b085-471a-a540-7faa5c6e969f-utilities\") pod \"redhat-marketplace-dczcg\" (UID: \"146c64ad-b085-471a-a540-7faa5c6e969f\") " pod="openshift-marketplace/redhat-marketplace-dczcg"
Mar 20 15:55:17 crc kubenswrapper[4730]: I0320 15:55:17.892628    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzkfz\" (UniqueName: \"kubernetes.io/projected/146c64ad-b085-471a-a540-7faa5c6e969f-kube-api-access-zzkfz\") pod \"redhat-marketplace-dczcg\" (UID: \"146c64ad-b085-471a-a540-7faa5c6e969f\") " pod="openshift-marketplace/redhat-marketplace-dczcg"
Mar 20 15:55:18 crc kubenswrapper[4730]: I0320 15:55:18.008164    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dczcg"
Mar 20 15:55:18 crc kubenswrapper[4730]: I0320 15:55:18.456282    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dczcg"]
Mar 20 15:55:18 crc kubenswrapper[4730]: W0320 15:55:18.465571    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod146c64ad_b085_471a_a540_7faa5c6e969f.slice/crio-aa1e2ed4596b23c9bada0e041e5fe2d2c27f666f02d8278cf4e320c407e509a5 WatchSource:0}: Error finding container aa1e2ed4596b23c9bada0e041e5fe2d2c27f666f02d8278cf4e320c407e509a5: Status 404 returned error can't find the container with id aa1e2ed4596b23c9bada0e041e5fe2d2c27f666f02d8278cf4e320c407e509a5
Mar 20 15:55:19 crc kubenswrapper[4730]: I0320 15:55:19.404098    4730 generic.go:334] "Generic (PLEG): container finished" podID="146c64ad-b085-471a-a540-7faa5c6e969f" containerID="887bc31eff014e32cf26c264c448be6c71dca22cdbc492ad535c3590b7f3e8da" exitCode=0
Mar 20 15:55:19 crc kubenswrapper[4730]: I0320 15:55:19.404210    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dczcg" event={"ID":"146c64ad-b085-471a-a540-7faa5c6e969f","Type":"ContainerDied","Data":"887bc31eff014e32cf26c264c448be6c71dca22cdbc492ad535c3590b7f3e8da"}
Mar 20 15:55:19 crc kubenswrapper[4730]: I0320 15:55:19.404431    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dczcg" event={"ID":"146c64ad-b085-471a-a540-7faa5c6e969f","Type":"ContainerStarted","Data":"aa1e2ed4596b23c9bada0e041e5fe2d2c27f666f02d8278cf4e320c407e509a5"}
Mar 20 15:55:22 crc kubenswrapper[4730]: I0320 15:55:22.427537    4730 generic.go:334] "Generic (PLEG): container finished" podID="146c64ad-b085-471a-a540-7faa5c6e969f" containerID="83d094f34ae090c9eeccb57b8f05b10f573a3244c65881d549b6ba6a62fc367e" exitCode=0
Mar 20 15:55:22 crc kubenswrapper[4730]: I0320 15:55:22.427705    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dczcg" event={"ID":"146c64ad-b085-471a-a540-7faa5c6e969f","Type":"ContainerDied","Data":"83d094f34ae090c9eeccb57b8f05b10f573a3244c65881d549b6ba6a62fc367e"}
Mar 20 15:55:22 crc kubenswrapper[4730]: I0320 15:55:22.429486    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-vmgrx" event={"ID":"70093cb9-bc43-427d-a8e4-5750058e2580","Type":"ContainerStarted","Data":"a8cf3edb8ebd289046edc5c4bdd1333705cc765b9f08dac3ce17d249ec624915"}
Mar 20 15:55:22 crc kubenswrapper[4730]: I0320 15:55:22.430055    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-vmgrx"
Mar 20 15:55:22 crc kubenswrapper[4730]: I0320 15:55:22.432385    4730 generic.go:334] "Generic (PLEG): container finished" podID="5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f" containerID="98a3547c9abda663872929df276239d41f451090d85f020b6dfac556444da1b5" exitCode=0
Mar 20 15:55:22 crc kubenswrapper[4730]: I0320 15:55:22.432413    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pbr4w" event={"ID":"5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f","Type":"ContainerDied","Data":"98a3547c9abda663872929df276239d41f451090d85f020b6dfac556444da1b5"}
Mar 20 15:55:22 crc kubenswrapper[4730]: I0320 15:55:22.495713    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-vmgrx" podStartSLOduration=3.138963081 podStartE2EDuration="10.495695168s" podCreationTimestamp="2026-03-20 15:55:12 +0000 UTC" firstStartedPulling="2026-03-20 15:55:13.991666191 +0000 UTC m=+973.205037550" lastFinishedPulling="2026-03-20 15:55:21.348398268 +0000 UTC m=+980.561769637" observedRunningTime="2026-03-20 15:55:22.492800846 +0000 UTC m=+981.706172215" watchObservedRunningTime="2026-03-20 15:55:22.495695168 +0000 UTC m=+981.709066527"
Mar 20 15:55:23 crc kubenswrapper[4730]: I0320 15:55:23.440020    4730 generic.go:334] "Generic (PLEG): container finished" podID="5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f" containerID="60c71f27837972075d15c3f2fb8934da891bfce388025185cf314ba79c175488" exitCode=0
Mar 20 15:55:23 crc kubenswrapper[4730]: I0320 15:55:23.440108    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pbr4w" event={"ID":"5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f","Type":"ContainerDied","Data":"60c71f27837972075d15c3f2fb8934da891bfce388025185cf314ba79c175488"}
Mar 20 15:55:23 crc kubenswrapper[4730]: I0320 15:55:23.444595    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dczcg" event={"ID":"146c64ad-b085-471a-a540-7faa5c6e969f","Type":"ContainerStarted","Data":"19eabcfbe8c8ecc95b9e7dabd08da3c0a83dd15b9e6ba2626654ee2384904d17"}
Mar 20 15:55:23 crc kubenswrapper[4730]: I0320 15:55:23.488101    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dczcg" podStartSLOduration=4.880344984 podStartE2EDuration="6.48808524s" podCreationTimestamp="2026-03-20 15:55:17 +0000 UTC" firstStartedPulling="2026-03-20 15:55:21.273217343 +0000 UTC m=+980.486588722" lastFinishedPulling="2026-03-20 15:55:22.880957599 +0000 UTC m=+982.094328978" observedRunningTime="2026-03-20 15:55:23.484713774 +0000 UTC m=+982.698085153" watchObservedRunningTime="2026-03-20 15:55:23.48808524 +0000 UTC m=+982.701456599"
Mar 20 15:55:24 crc kubenswrapper[4730]: I0320 15:55:24.050636    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-jdxzq"
Mar 20 15:55:24 crc kubenswrapper[4730]: I0320 15:55:24.450969    4730 generic.go:334] "Generic (PLEG): container finished" podID="5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f" containerID="8dff8d10a59de330a94e20e32ff057bcc85bf5fc56e98ab20b2537a99bfb05be" exitCode=0
Mar 20 15:55:24 crc kubenswrapper[4730]: I0320 15:55:24.451049    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pbr4w" event={"ID":"5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f","Type":"ContainerDied","Data":"8dff8d10a59de330a94e20e32ff057bcc85bf5fc56e98ab20b2537a99bfb05be"}
Mar 20 15:55:25 crc kubenswrapper[4730]: I0320 15:55:25.462132    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pbr4w" event={"ID":"5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f","Type":"ContainerStarted","Data":"fd5bab4984293792a6250d51f252a53fa1ce0338e3285efabcace1946ca454a4"}
Mar 20 15:55:25 crc kubenswrapper[4730]: I0320 15:55:25.462527    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-pbr4w"
Mar 20 15:55:25 crc kubenswrapper[4730]: I0320 15:55:25.462541    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pbr4w" event={"ID":"5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f","Type":"ContainerStarted","Data":"01e81d52130169fe0ae15d0067de3a90f219f6143074655cb14ff2d07033cf9a"}
Mar 20 15:55:25 crc kubenswrapper[4730]: I0320 15:55:25.462553    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pbr4w" event={"ID":"5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f","Type":"ContainerStarted","Data":"876622305eb46edc2eb833d62f52ed83f5ac5ac4685cf4bd0284a002ae0d0a91"}
Mar 20 15:55:25 crc kubenswrapper[4730]: I0320 15:55:25.462587    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pbr4w" event={"ID":"5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f","Type":"ContainerStarted","Data":"714771251e434b57c6f751228bf4f567739a60ee3f80dc96d5cd6dbf2d848f76"}
Mar 20 15:55:25 crc kubenswrapper[4730]: I0320 15:55:25.462597    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pbr4w" event={"ID":"5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f","Type":"ContainerStarted","Data":"4ec8ba38fb2e568025562a8690bbb9b1b92aec199792851980dba74f4875592a"}
Mar 20 15:55:25 crc kubenswrapper[4730]: I0320 15:55:25.462606    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pbr4w" event={"ID":"5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f","Type":"ContainerStarted","Data":"0d8b6c8b9f596c16cbd5269ba63455b8069330b3a5b27540a07a26f562be3cc8"}
Mar 20 15:55:25 crc kubenswrapper[4730]: I0320 15:55:25.491950    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-pbr4w" podStartSLOduration=5.560453518 podStartE2EDuration="13.491929215s" podCreationTimestamp="2026-03-20 15:55:12 +0000 UTC" firstStartedPulling="2026-03-20 15:55:13.429340425 +0000 UTC m=+972.642711794" lastFinishedPulling="2026-03-20 15:55:21.360816122 +0000 UTC m=+980.574187491" observedRunningTime="2026-03-20 15:55:25.488525568 +0000 UTC m=+984.701896967" watchObservedRunningTime="2026-03-20 15:55:25.491929215 +0000 UTC m=+984.705300584"
Mar 20 15:55:28 crc kubenswrapper[4730]: I0320 15:55:28.009301    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dczcg"
Mar 20 15:55:28 crc kubenswrapper[4730]: I0320 15:55:28.009617    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dczcg"
Mar 20 15:55:28 crc kubenswrapper[4730]: I0320 15:55:28.048391    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dczcg"
Mar 20 15:55:28 crc kubenswrapper[4730]: I0320 15:55:28.279295    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-pbr4w"
Mar 20 15:55:28 crc kubenswrapper[4730]: I0320 15:55:28.318555    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-pbr4w"
Mar 20 15:55:28 crc kubenswrapper[4730]: I0320 15:55:28.527774    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dczcg"
Mar 20 15:55:28 crc kubenswrapper[4730]: I0320 15:55:28.568714    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dczcg"]
Mar 20 15:55:30 crc kubenswrapper[4730]: I0320 15:55:30.491444    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dczcg" podUID="146c64ad-b085-471a-a540-7faa5c6e969f" containerName="registry-server" containerID="cri-o://19eabcfbe8c8ecc95b9e7dabd08da3c0a83dd15b9e6ba2626654ee2384904d17" gracePeriod=2
Mar 20 15:55:30 crc kubenswrapper[4730]: I0320 15:55:30.726725    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ccfvp"]
Mar 20 15:55:30 crc kubenswrapper[4730]: I0320 15:55:30.727961    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ccfvp"
Mar 20 15:55:30 crc kubenswrapper[4730]: I0320 15:55:30.750331    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ccfvp"]
Mar 20 15:55:30 crc kubenswrapper[4730]: I0320 15:55:30.857514    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67135f13-b182-4e78-b64d-59e924cc6d06-utilities\") pod \"community-operators-ccfvp\" (UID: \"67135f13-b182-4e78-b64d-59e924cc6d06\") " pod="openshift-marketplace/community-operators-ccfvp"
Mar 20 15:55:30 crc kubenswrapper[4730]: I0320 15:55:30.857561    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67135f13-b182-4e78-b64d-59e924cc6d06-catalog-content\") pod \"community-operators-ccfvp\" (UID: \"67135f13-b182-4e78-b64d-59e924cc6d06\") " pod="openshift-marketplace/community-operators-ccfvp"
Mar 20 15:55:30 crc kubenswrapper[4730]: I0320 15:55:30.857606    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzjls\" (UniqueName: \"kubernetes.io/projected/67135f13-b182-4e78-b64d-59e924cc6d06-kube-api-access-kzjls\") pod \"community-operators-ccfvp\" (UID: \"67135f13-b182-4e78-b64d-59e924cc6d06\") " pod="openshift-marketplace/community-operators-ccfvp"
Mar 20 15:55:30 crc kubenswrapper[4730]: I0320 15:55:30.862557    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dczcg"
Mar 20 15:55:30 crc kubenswrapper[4730]: I0320 15:55:30.958557    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/146c64ad-b085-471a-a540-7faa5c6e969f-catalog-content\") pod \"146c64ad-b085-471a-a540-7faa5c6e969f\" (UID: \"146c64ad-b085-471a-a540-7faa5c6e969f\") "
Mar 20 15:55:30 crc kubenswrapper[4730]: I0320 15:55:30.958596    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzkfz\" (UniqueName: \"kubernetes.io/projected/146c64ad-b085-471a-a540-7faa5c6e969f-kube-api-access-zzkfz\") pod \"146c64ad-b085-471a-a540-7faa5c6e969f\" (UID: \"146c64ad-b085-471a-a540-7faa5c6e969f\") "
Mar 20 15:55:30 crc kubenswrapper[4730]: I0320 15:55:30.958728    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/146c64ad-b085-471a-a540-7faa5c6e969f-utilities\") pod \"146c64ad-b085-471a-a540-7faa5c6e969f\" (UID: \"146c64ad-b085-471a-a540-7faa5c6e969f\") "
Mar 20 15:55:30 crc kubenswrapper[4730]: I0320 15:55:30.958890    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67135f13-b182-4e78-b64d-59e924cc6d06-utilities\") pod \"community-operators-ccfvp\" (UID: \"67135f13-b182-4e78-b64d-59e924cc6d06\") " pod="openshift-marketplace/community-operators-ccfvp"
Mar 20 15:55:30 crc kubenswrapper[4730]: I0320 15:55:30.958923    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67135f13-b182-4e78-b64d-59e924cc6d06-catalog-content\") pod \"community-operators-ccfvp\" (UID: \"67135f13-b182-4e78-b64d-59e924cc6d06\") " pod="openshift-marketplace/community-operators-ccfvp"
Mar 20 15:55:30 crc kubenswrapper[4730]: I0320 15:55:30.958966    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzjls\" (UniqueName: \"kubernetes.io/projected/67135f13-b182-4e78-b64d-59e924cc6d06-kube-api-access-kzjls\") pod \"community-operators-ccfvp\" (UID: \"67135f13-b182-4e78-b64d-59e924cc6d06\") " pod="openshift-marketplace/community-operators-ccfvp"
Mar 20 15:55:30 crc kubenswrapper[4730]: I0320 15:55:30.959529    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67135f13-b182-4e78-b64d-59e924cc6d06-catalog-content\") pod \"community-operators-ccfvp\" (UID: \"67135f13-b182-4e78-b64d-59e924cc6d06\") " pod="openshift-marketplace/community-operators-ccfvp"
Mar 20 15:55:30 crc kubenswrapper[4730]: I0320 15:55:30.959720    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/146c64ad-b085-471a-a540-7faa5c6e969f-utilities" (OuterVolumeSpecName: "utilities") pod "146c64ad-b085-471a-a540-7faa5c6e969f" (UID: "146c64ad-b085-471a-a540-7faa5c6e969f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:55:30 crc kubenswrapper[4730]: I0320 15:55:30.959920    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67135f13-b182-4e78-b64d-59e924cc6d06-utilities\") pod \"community-operators-ccfvp\" (UID: \"67135f13-b182-4e78-b64d-59e924cc6d06\") " pod="openshift-marketplace/community-operators-ccfvp"
Mar 20 15:55:30 crc kubenswrapper[4730]: I0320 15:55:30.965469    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/146c64ad-b085-471a-a540-7faa5c6e969f-kube-api-access-zzkfz" (OuterVolumeSpecName: "kube-api-access-zzkfz") pod "146c64ad-b085-471a-a540-7faa5c6e969f" (UID: "146c64ad-b085-471a-a540-7faa5c6e969f"). InnerVolumeSpecName "kube-api-access-zzkfz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:55:30 crc kubenswrapper[4730]: I0320 15:55:30.978640    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzjls\" (UniqueName: \"kubernetes.io/projected/67135f13-b182-4e78-b64d-59e924cc6d06-kube-api-access-kzjls\") pod \"community-operators-ccfvp\" (UID: \"67135f13-b182-4e78-b64d-59e924cc6d06\") " pod="openshift-marketplace/community-operators-ccfvp"
Mar 20 15:55:30 crc kubenswrapper[4730]: I0320 15:55:30.988652    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/146c64ad-b085-471a-a540-7faa5c6e969f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "146c64ad-b085-471a-a540-7faa5c6e969f" (UID: "146c64ad-b085-471a-a540-7faa5c6e969f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:55:31 crc kubenswrapper[4730]: I0320 15:55:31.056572    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ccfvp"
Mar 20 15:55:31 crc kubenswrapper[4730]: I0320 15:55:31.060937    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/146c64ad-b085-471a-a540-7faa5c6e969f-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 15:55:31 crc kubenswrapper[4730]: I0320 15:55:31.061079    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/146c64ad-b085-471a-a540-7faa5c6e969f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 15:55:31 crc kubenswrapper[4730]: I0320 15:55:31.061166    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzkfz\" (UniqueName: \"kubernetes.io/projected/146c64ad-b085-471a-a540-7faa5c6e969f-kube-api-access-zzkfz\") on node \"crc\" DevicePath \"\""
Mar 20 15:55:31 crc kubenswrapper[4730]: I0320 15:55:31.498952    4730 generic.go:334] "Generic (PLEG): container finished" podID="146c64ad-b085-471a-a540-7faa5c6e969f" containerID="19eabcfbe8c8ecc95b9e7dabd08da3c0a83dd15b9e6ba2626654ee2384904d17" exitCode=0
Mar 20 15:55:31 crc kubenswrapper[4730]: I0320 15:55:31.498993    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dczcg" event={"ID":"146c64ad-b085-471a-a540-7faa5c6e969f","Type":"ContainerDied","Data":"19eabcfbe8c8ecc95b9e7dabd08da3c0a83dd15b9e6ba2626654ee2384904d17"}
Mar 20 15:55:31 crc kubenswrapper[4730]: I0320 15:55:31.499020    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dczcg" event={"ID":"146c64ad-b085-471a-a540-7faa5c6e969f","Type":"ContainerDied","Data":"aa1e2ed4596b23c9bada0e041e5fe2d2c27f666f02d8278cf4e320c407e509a5"}
Mar 20 15:55:31 crc kubenswrapper[4730]: I0320 15:55:31.499038    4730 scope.go:117] "RemoveContainer" containerID="19eabcfbe8c8ecc95b9e7dabd08da3c0a83dd15b9e6ba2626654ee2384904d17"
Mar 20 15:55:31 crc kubenswrapper[4730]: I0320 15:55:31.499091    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dczcg"
Mar 20 15:55:31 crc kubenswrapper[4730]: I0320 15:55:31.515033    4730 scope.go:117] "RemoveContainer" containerID="83d094f34ae090c9eeccb57b8f05b10f573a3244c65881d549b6ba6a62fc367e"
Mar 20 15:55:31 crc kubenswrapper[4730]: I0320 15:55:31.531305    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dczcg"]
Mar 20 15:55:31 crc kubenswrapper[4730]: I0320 15:55:31.536826    4730 scope.go:117] "RemoveContainer" containerID="887bc31eff014e32cf26c264c448be6c71dca22cdbc492ad535c3590b7f3e8da"
Mar 20 15:55:31 crc kubenswrapper[4730]: I0320 15:55:31.544932    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dczcg"]
Mar 20 15:55:31 crc kubenswrapper[4730]: I0320 15:55:31.553471    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ccfvp"]
Mar 20 15:55:31 crc kubenswrapper[4730]: I0320 15:55:31.563781    4730 scope.go:117] "RemoveContainer" containerID="19eabcfbe8c8ecc95b9e7dabd08da3c0a83dd15b9e6ba2626654ee2384904d17"
Mar 20 15:55:31 crc kubenswrapper[4730]: E0320 15:55:31.564663    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19eabcfbe8c8ecc95b9e7dabd08da3c0a83dd15b9e6ba2626654ee2384904d17\": container with ID starting with 19eabcfbe8c8ecc95b9e7dabd08da3c0a83dd15b9e6ba2626654ee2384904d17 not found: ID does not exist" containerID="19eabcfbe8c8ecc95b9e7dabd08da3c0a83dd15b9e6ba2626654ee2384904d17"
Mar 20 15:55:31 crc kubenswrapper[4730]: I0320 15:55:31.564697    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19eabcfbe8c8ecc95b9e7dabd08da3c0a83dd15b9e6ba2626654ee2384904d17"} err="failed to get container status \"19eabcfbe8c8ecc95b9e7dabd08da3c0a83dd15b9e6ba2626654ee2384904d17\": rpc error: code = NotFound desc = could not find container \"19eabcfbe8c8ecc95b9e7dabd08da3c0a83dd15b9e6ba2626654ee2384904d17\": container with ID starting with 19eabcfbe8c8ecc95b9e7dabd08da3c0a83dd15b9e6ba2626654ee2384904d17 not found: ID does not exist"
Mar 20 15:55:31 crc kubenswrapper[4730]: I0320 15:55:31.564721    4730 scope.go:117] "RemoveContainer" containerID="83d094f34ae090c9eeccb57b8f05b10f573a3244c65881d549b6ba6a62fc367e"
Mar 20 15:55:31 crc kubenswrapper[4730]: E0320 15:55:31.565369    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83d094f34ae090c9eeccb57b8f05b10f573a3244c65881d549b6ba6a62fc367e\": container with ID starting with 83d094f34ae090c9eeccb57b8f05b10f573a3244c65881d549b6ba6a62fc367e not found: ID does not exist" containerID="83d094f34ae090c9eeccb57b8f05b10f573a3244c65881d549b6ba6a62fc367e"
Mar 20 15:55:31 crc kubenswrapper[4730]: I0320 15:55:31.565437    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83d094f34ae090c9eeccb57b8f05b10f573a3244c65881d549b6ba6a62fc367e"} err="failed to get container status \"83d094f34ae090c9eeccb57b8f05b10f573a3244c65881d549b6ba6a62fc367e\": rpc error: code = NotFound desc = could not find container \"83d094f34ae090c9eeccb57b8f05b10f573a3244c65881d549b6ba6a62fc367e\": container with ID starting with 83d094f34ae090c9eeccb57b8f05b10f573a3244c65881d549b6ba6a62fc367e not found: ID does not exist"
Mar 20 15:55:31 crc kubenswrapper[4730]: I0320 15:55:31.565464    4730 scope.go:117] "RemoveContainer" containerID="887bc31eff014e32cf26c264c448be6c71dca22cdbc492ad535c3590b7f3e8da"
Mar 20 15:55:31 crc kubenswrapper[4730]: E0320 15:55:31.565906    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"887bc31eff014e32cf26c264c448be6c71dca22cdbc492ad535c3590b7f3e8da\": container with ID starting with 887bc31eff014e32cf26c264c448be6c71dca22cdbc492ad535c3590b7f3e8da not found: ID does not exist" containerID="887bc31eff014e32cf26c264c448be6c71dca22cdbc492ad535c3590b7f3e8da"
Mar 20 15:55:31 crc kubenswrapper[4730]: I0320 15:55:31.565940    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"887bc31eff014e32cf26c264c448be6c71dca22cdbc492ad535c3590b7f3e8da"} err="failed to get container status \"887bc31eff014e32cf26c264c448be6c71dca22cdbc492ad535c3590b7f3e8da\": rpc error: code = NotFound desc = could not find container \"887bc31eff014e32cf26c264c448be6c71dca22cdbc492ad535c3590b7f3e8da\": container with ID starting with 887bc31eff014e32cf26c264c448be6c71dca22cdbc492ad535c3590b7f3e8da not found: ID does not exist"
Mar 20 15:55:31 crc kubenswrapper[4730]: W0320 15:55:31.567616    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67135f13_b182_4e78_b64d_59e924cc6d06.slice/crio-658a195d5ec687dd420baf8249b97d002c1f423d4390763b728a9e13cc7604b3 WatchSource:0}: Error finding container 658a195d5ec687dd420baf8249b97d002c1f423d4390763b728a9e13cc7604b3: Status 404 returned error can't find the container with id 658a195d5ec687dd420baf8249b97d002c1f423d4390763b728a9e13cc7604b3
Mar 20 15:55:32 crc kubenswrapper[4730]: I0320 15:55:32.508328    4730 generic.go:334] "Generic (PLEG): container finished" podID="67135f13-b182-4e78-b64d-59e924cc6d06" containerID="0248869be7a45daf83343efdfb1d43f4b7d10b0b6b268922404b998ed0f2cd9e" exitCode=0
Mar 20 15:55:32 crc kubenswrapper[4730]: I0320 15:55:32.508382    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ccfvp" event={"ID":"67135f13-b182-4e78-b64d-59e924cc6d06","Type":"ContainerDied","Data":"0248869be7a45daf83343efdfb1d43f4b7d10b0b6b268922404b998ed0f2cd9e"}
Mar 20 15:55:32 crc kubenswrapper[4730]: I0320 15:55:32.508704    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ccfvp" event={"ID":"67135f13-b182-4e78-b64d-59e924cc6d06","Type":"ContainerStarted","Data":"658a195d5ec687dd420baf8249b97d002c1f423d4390763b728a9e13cc7604b3"}
Mar 20 15:55:33 crc kubenswrapper[4730]: I0320 15:55:33.516795    4730 generic.go:334] "Generic (PLEG): container finished" podID="67135f13-b182-4e78-b64d-59e924cc6d06" containerID="303fadbe91512351177d280b599a2b8cc89d2b5b2c6d9ee58ec15258affc5eb8" exitCode=0
Mar 20 15:55:33 crc kubenswrapper[4730]: I0320 15:55:33.516875    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ccfvp" event={"ID":"67135f13-b182-4e78-b64d-59e924cc6d06","Type":"ContainerDied","Data":"303fadbe91512351177d280b599a2b8cc89d2b5b2c6d9ee58ec15258affc5eb8"}
Mar 20 15:55:33 crc kubenswrapper[4730]: I0320 15:55:33.542746    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="146c64ad-b085-471a-a540-7faa5c6e969f" path="/var/lib/kubelet/pods/146c64ad-b085-471a-a540-7faa5c6e969f/volumes"
Mar 20 15:55:33 crc kubenswrapper[4730]: I0320 15:55:33.591832    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-vmgrx"
Mar 20 15:55:34 crc kubenswrapper[4730]: I0320 15:55:34.952618    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-tbvnw"
Mar 20 15:55:35 crc kubenswrapper[4730]: I0320 15:55:35.540862    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ccfvp" event={"ID":"67135f13-b182-4e78-b64d-59e924cc6d06","Type":"ContainerStarted","Data":"6f9edd02be6f2dfb433a49be2863aba5cd96a3fbb691894959161dfee263cc66"}
Mar 20 15:55:35 crc kubenswrapper[4730]: I0320 15:55:35.565599    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ccfvp" podStartSLOduration=3.023705403 podStartE2EDuration="5.565583858s" podCreationTimestamp="2026-03-20 15:55:30 +0000 UTC" firstStartedPulling="2026-03-20 15:55:32.510221615 +0000 UTC m=+991.723592994" lastFinishedPulling="2026-03-20 15:55:35.05210009 +0000 UTC m=+994.265471449" observedRunningTime="2026-03-20 15:55:35.558776014 +0000 UTC m=+994.772147383" watchObservedRunningTime="2026-03-20 15:55:35.565583858 +0000 UTC m=+994.778955217"
Mar 20 15:55:37 crc kubenswrapper[4730]: I0320 15:55:37.825694    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-6f444"]
Mar 20 15:55:37 crc kubenswrapper[4730]: E0320 15:55:37.826157    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="146c64ad-b085-471a-a540-7faa5c6e969f" containerName="extract-utilities"
Mar 20 15:55:37 crc kubenswrapper[4730]: I0320 15:55:37.826168    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="146c64ad-b085-471a-a540-7faa5c6e969f" containerName="extract-utilities"
Mar 20 15:55:37 crc kubenswrapper[4730]: E0320 15:55:37.826176    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="146c64ad-b085-471a-a540-7faa5c6e969f" containerName="extract-content"
Mar 20 15:55:37 crc kubenswrapper[4730]: I0320 15:55:37.826183    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="146c64ad-b085-471a-a540-7faa5c6e969f" containerName="extract-content"
Mar 20 15:55:37 crc kubenswrapper[4730]: E0320 15:55:37.826196    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="146c64ad-b085-471a-a540-7faa5c6e969f" containerName="registry-server"
Mar 20 15:55:37 crc kubenswrapper[4730]: I0320 15:55:37.826201    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="146c64ad-b085-471a-a540-7faa5c6e969f" containerName="registry-server"
Mar 20 15:55:37 crc kubenswrapper[4730]: I0320 15:55:37.826321    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="146c64ad-b085-471a-a540-7faa5c6e969f" containerName="registry-server"
Mar 20 15:55:37 crc kubenswrapper[4730]: I0320 15:55:37.826781    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-6f444"
Mar 20 15:55:37 crc kubenswrapper[4730]: I0320 15:55:37.828752    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-xkpgj"
Mar 20 15:55:37 crc kubenswrapper[4730]: I0320 15:55:37.828893    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Mar 20 15:55:37 crc kubenswrapper[4730]: I0320 15:55:37.828892    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Mar 20 15:55:37 crc kubenswrapper[4730]: I0320 15:55:37.847847    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-6f444"]
Mar 20 15:55:37 crc kubenswrapper[4730]: I0320 15:55:37.978047    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhtb8\" (UniqueName: \"kubernetes.io/projected/ed34ebb3-3d6a-4ddf-8364-fd5b7baa6953-kube-api-access-fhtb8\") pod \"openstack-operator-index-6f444\" (UID: \"ed34ebb3-3d6a-4ddf-8364-fd5b7baa6953\") " pod="openstack-operators/openstack-operator-index-6f444"
Mar 20 15:55:38 crc kubenswrapper[4730]: I0320 15:55:38.079413    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhtb8\" (UniqueName: \"kubernetes.io/projected/ed34ebb3-3d6a-4ddf-8364-fd5b7baa6953-kube-api-access-fhtb8\") pod \"openstack-operator-index-6f444\" (UID: \"ed34ebb3-3d6a-4ddf-8364-fd5b7baa6953\") " pod="openstack-operators/openstack-operator-index-6f444"
Mar 20 15:55:38 crc kubenswrapper[4730]: I0320 15:55:38.113148    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhtb8\" (UniqueName: \"kubernetes.io/projected/ed34ebb3-3d6a-4ddf-8364-fd5b7baa6953-kube-api-access-fhtb8\") pod \"openstack-operator-index-6f444\" (UID: \"ed34ebb3-3d6a-4ddf-8364-fd5b7baa6953\") " pod="openstack-operators/openstack-operator-index-6f444"
Mar 20 15:55:38 crc kubenswrapper[4730]: I0320 15:55:38.143291    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-6f444"
Mar 20 15:55:38 crc kubenswrapper[4730]: I0320 15:55:38.567013    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-6f444"]
Mar 20 15:55:38 crc kubenswrapper[4730]: W0320 15:55:38.570194    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded34ebb3_3d6a_4ddf_8364_fd5b7baa6953.slice/crio-32941a4fdec1fcd9654c63bca33d3659cc2cf1d8de808659c098b3722cb53ead WatchSource:0}: Error finding container 32941a4fdec1fcd9654c63bca33d3659cc2cf1d8de808659c098b3722cb53ead: Status 404 returned error can't find the container with id 32941a4fdec1fcd9654c63bca33d3659cc2cf1d8de808659c098b3722cb53ead
Mar 20 15:55:39 crc kubenswrapper[4730]: I0320 15:55:39.561552    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6f444" event={"ID":"ed34ebb3-3d6a-4ddf-8364-fd5b7baa6953","Type":"ContainerStarted","Data":"32941a4fdec1fcd9654c63bca33d3659cc2cf1d8de808659c098b3722cb53ead"}
Mar 20 15:55:41 crc kubenswrapper[4730]: I0320 15:55:41.056745    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ccfvp"
Mar 20 15:55:41 crc kubenswrapper[4730]: I0320 15:55:41.056898    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ccfvp"
Mar 20 15:55:41 crc kubenswrapper[4730]: I0320 15:55:41.105120    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ccfvp"
Mar 20 15:55:41 crc kubenswrapper[4730]: I0320 15:55:41.574620    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6f444" event={"ID":"ed34ebb3-3d6a-4ddf-8364-fd5b7baa6953","Type":"ContainerStarted","Data":"36dbac8f732d4df3db5ae5bc7263b80ff9feb79f642eaa9edc01e075cd51f146"}
Mar 20 15:55:41 crc kubenswrapper[4730]: I0320 15:55:41.601352    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-6f444" podStartSLOduration=2.202520615 podStartE2EDuration="4.601332889s" podCreationTimestamp="2026-03-20 15:55:37 +0000 UTC" firstStartedPulling="2026-03-20 15:55:38.571930765 +0000 UTC m=+997.785302144" lastFinishedPulling="2026-03-20 15:55:40.970743049 +0000 UTC m=+1000.184114418" observedRunningTime="2026-03-20 15:55:41.59960695 +0000 UTC m=+1000.812978319" watchObservedRunningTime="2026-03-20 15:55:41.601332889 +0000 UTC m=+1000.814704258"
Mar 20 15:55:41 crc kubenswrapper[4730]: I0320 15:55:41.693232    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-6f444"]
Mar 20 15:55:41 crc kubenswrapper[4730]: I0320 15:55:41.702377    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ccfvp"
Mar 20 15:55:42 crc kubenswrapper[4730]: I0320 15:55:42.499822    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-f9xcd"]
Mar 20 15:55:42 crc kubenswrapper[4730]: I0320 15:55:42.501732    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-f9xcd"
Mar 20 15:55:42 crc kubenswrapper[4730]: I0320 15:55:42.510433    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-f9xcd"]
Mar 20 15:55:42 crc kubenswrapper[4730]: I0320 15:55:42.645249    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgg5s\" (UniqueName: \"kubernetes.io/projected/d125a115-3173-4a52-8794-2832951fa428-kube-api-access-kgg5s\") pod \"openstack-operator-index-f9xcd\" (UID: \"d125a115-3173-4a52-8794-2832951fa428\") " pod="openstack-operators/openstack-operator-index-f9xcd"
Mar 20 15:55:42 crc kubenswrapper[4730]: I0320 15:55:42.748289    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgg5s\" (UniqueName: \"kubernetes.io/projected/d125a115-3173-4a52-8794-2832951fa428-kube-api-access-kgg5s\") pod \"openstack-operator-index-f9xcd\" (UID: \"d125a115-3173-4a52-8794-2832951fa428\") " pod="openstack-operators/openstack-operator-index-f9xcd"
Mar 20 15:55:42 crc kubenswrapper[4730]: I0320 15:55:42.773566    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgg5s\" (UniqueName: \"kubernetes.io/projected/d125a115-3173-4a52-8794-2832951fa428-kube-api-access-kgg5s\") pod \"openstack-operator-index-f9xcd\" (UID: \"d125a115-3173-4a52-8794-2832951fa428\") " pod="openstack-operators/openstack-operator-index-f9xcd"
Mar 20 15:55:42 crc kubenswrapper[4730]: I0320 15:55:42.823817    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-f9xcd"
Mar 20 15:55:43 crc kubenswrapper[4730]: I0320 15:55:43.288385    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-f9xcd"]
Mar 20 15:55:43 crc kubenswrapper[4730]: I0320 15:55:43.288858    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-pbr4w"
Mar 20 15:55:43 crc kubenswrapper[4730]: I0320 15:55:43.588890    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-f9xcd" event={"ID":"d125a115-3173-4a52-8794-2832951fa428","Type":"ContainerStarted","Data":"0bc71ca5af35eb6dd545efbdcabf0b6ae657161f87c08e6fd6b7d141c6a33724"}
Mar 20 15:55:43 crc kubenswrapper[4730]: I0320 15:55:43.589307    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-f9xcd" event={"ID":"d125a115-3173-4a52-8794-2832951fa428","Type":"ContainerStarted","Data":"9686c2d37b01c0de22f81c267b06b28451fc2e46e0983435c2d9281955e8f5ff"}
Mar 20 15:55:43 crc kubenswrapper[4730]: I0320 15:55:43.588956    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-6f444" podUID="ed34ebb3-3d6a-4ddf-8364-fd5b7baa6953" containerName="registry-server" containerID="cri-o://36dbac8f732d4df3db5ae5bc7263b80ff9feb79f642eaa9edc01e075cd51f146" gracePeriod=2
Mar 20 15:55:43 crc kubenswrapper[4730]: I0320 15:55:43.614193    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-f9xcd" podStartSLOduration=1.546950824 podStartE2EDuration="1.614172472s" podCreationTimestamp="2026-03-20 15:55:42 +0000 UTC" firstStartedPulling="2026-03-20 15:55:43.292473164 +0000 UTC m=+1002.505844573" lastFinishedPulling="2026-03-20 15:55:43.359694852 +0000 UTC m=+1002.573066221" observedRunningTime="2026-03-20 15:55:43.608961753 +0000 UTC m=+1002.822333152" watchObservedRunningTime="2026-03-20 15:55:43.614172472 +0000 UTC m=+1002.827543851"
Mar 20 15:55:44 crc kubenswrapper[4730]: I0320 15:55:44.058067    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-6f444"
Mar 20 15:55:44 crc kubenswrapper[4730]: I0320 15:55:44.177359    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhtb8\" (UniqueName: \"kubernetes.io/projected/ed34ebb3-3d6a-4ddf-8364-fd5b7baa6953-kube-api-access-fhtb8\") pod \"ed34ebb3-3d6a-4ddf-8364-fd5b7baa6953\" (UID: \"ed34ebb3-3d6a-4ddf-8364-fd5b7baa6953\") "
Mar 20 15:55:44 crc kubenswrapper[4730]: I0320 15:55:44.185494    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed34ebb3-3d6a-4ddf-8364-fd5b7baa6953-kube-api-access-fhtb8" (OuterVolumeSpecName: "kube-api-access-fhtb8") pod "ed34ebb3-3d6a-4ddf-8364-fd5b7baa6953" (UID: "ed34ebb3-3d6a-4ddf-8364-fd5b7baa6953"). InnerVolumeSpecName "kube-api-access-fhtb8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:55:44 crc kubenswrapper[4730]: I0320 15:55:44.278915    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhtb8\" (UniqueName: \"kubernetes.io/projected/ed34ebb3-3d6a-4ddf-8364-fd5b7baa6953-kube-api-access-fhtb8\") on node \"crc\" DevicePath \"\""
Mar 20 15:55:44 crc kubenswrapper[4730]: I0320 15:55:44.596153    4730 generic.go:334] "Generic (PLEG): container finished" podID="ed34ebb3-3d6a-4ddf-8364-fd5b7baa6953" containerID="36dbac8f732d4df3db5ae5bc7263b80ff9feb79f642eaa9edc01e075cd51f146" exitCode=0
Mar 20 15:55:44 crc kubenswrapper[4730]: I0320 15:55:44.596865    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-6f444"
Mar 20 15:55:44 crc kubenswrapper[4730]: I0320 15:55:44.598329    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6f444" event={"ID":"ed34ebb3-3d6a-4ddf-8364-fd5b7baa6953","Type":"ContainerDied","Data":"36dbac8f732d4df3db5ae5bc7263b80ff9feb79f642eaa9edc01e075cd51f146"}
Mar 20 15:55:44 crc kubenswrapper[4730]: I0320 15:55:44.598378    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6f444" event={"ID":"ed34ebb3-3d6a-4ddf-8364-fd5b7baa6953","Type":"ContainerDied","Data":"32941a4fdec1fcd9654c63bca33d3659cc2cf1d8de808659c098b3722cb53ead"}
Mar 20 15:55:44 crc kubenswrapper[4730]: I0320 15:55:44.598399    4730 scope.go:117] "RemoveContainer" containerID="36dbac8f732d4df3db5ae5bc7263b80ff9feb79f642eaa9edc01e075cd51f146"
Mar 20 15:55:44 crc kubenswrapper[4730]: I0320 15:55:44.620501    4730 scope.go:117] "RemoveContainer" containerID="36dbac8f732d4df3db5ae5bc7263b80ff9feb79f642eaa9edc01e075cd51f146"
Mar 20 15:55:44 crc kubenswrapper[4730]: E0320 15:55:44.621761    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36dbac8f732d4df3db5ae5bc7263b80ff9feb79f642eaa9edc01e075cd51f146\": container with ID starting with 36dbac8f732d4df3db5ae5bc7263b80ff9feb79f642eaa9edc01e075cd51f146 not found: ID does not exist" containerID="36dbac8f732d4df3db5ae5bc7263b80ff9feb79f642eaa9edc01e075cd51f146"
Mar 20 15:55:44 crc kubenswrapper[4730]: I0320 15:55:44.621807    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36dbac8f732d4df3db5ae5bc7263b80ff9feb79f642eaa9edc01e075cd51f146"} err="failed to get container status \"36dbac8f732d4df3db5ae5bc7263b80ff9feb79f642eaa9edc01e075cd51f146\": rpc error: code = NotFound desc = could not find container \"36dbac8f732d4df3db5ae5bc7263b80ff9feb79f642eaa9edc01e075cd51f146\": container with ID starting with 36dbac8f732d4df3db5ae5bc7263b80ff9feb79f642eaa9edc01e075cd51f146 not found: ID does not exist"
Mar 20 15:55:44 crc kubenswrapper[4730]: I0320 15:55:44.633199    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-6f444"]
Mar 20 15:55:44 crc kubenswrapper[4730]: I0320 15:55:44.645818    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-6f444"]
Mar 20 15:55:45 crc kubenswrapper[4730]: I0320 15:55:45.486153    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ccfvp"]
Mar 20 15:55:45 crc kubenswrapper[4730]: I0320 15:55:45.486754    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ccfvp" podUID="67135f13-b182-4e78-b64d-59e924cc6d06" containerName="registry-server" containerID="cri-o://6f9edd02be6f2dfb433a49be2863aba5cd96a3fbb691894959161dfee263cc66" gracePeriod=2
Mar 20 15:55:45 crc kubenswrapper[4730]: I0320 15:55:45.542176    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed34ebb3-3d6a-4ddf-8364-fd5b7baa6953" path="/var/lib/kubelet/pods/ed34ebb3-3d6a-4ddf-8364-fd5b7baa6953/volumes"
Mar 20 15:55:45 crc kubenswrapper[4730]: I0320 15:55:45.611429    4730 generic.go:334] "Generic (PLEG): container finished" podID="67135f13-b182-4e78-b64d-59e924cc6d06" containerID="6f9edd02be6f2dfb433a49be2863aba5cd96a3fbb691894959161dfee263cc66" exitCode=0
Mar 20 15:55:45 crc kubenswrapper[4730]: I0320 15:55:45.611479    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ccfvp" event={"ID":"67135f13-b182-4e78-b64d-59e924cc6d06","Type":"ContainerDied","Data":"6f9edd02be6f2dfb433a49be2863aba5cd96a3fbb691894959161dfee263cc66"}
Mar 20 15:55:45 crc kubenswrapper[4730]: I0320 15:55:45.828806    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ccfvp"
Mar 20 15:55:45 crc kubenswrapper[4730]: I0320 15:55:45.904858    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67135f13-b182-4e78-b64d-59e924cc6d06-utilities\") pod \"67135f13-b182-4e78-b64d-59e924cc6d06\" (UID: \"67135f13-b182-4e78-b64d-59e924cc6d06\") "
Mar 20 15:55:45 crc kubenswrapper[4730]: I0320 15:55:45.904945    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67135f13-b182-4e78-b64d-59e924cc6d06-catalog-content\") pod \"67135f13-b182-4e78-b64d-59e924cc6d06\" (UID: \"67135f13-b182-4e78-b64d-59e924cc6d06\") "
Mar 20 15:55:45 crc kubenswrapper[4730]: I0320 15:55:45.905018    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzjls\" (UniqueName: \"kubernetes.io/projected/67135f13-b182-4e78-b64d-59e924cc6d06-kube-api-access-kzjls\") pod \"67135f13-b182-4e78-b64d-59e924cc6d06\" (UID: \"67135f13-b182-4e78-b64d-59e924cc6d06\") "
Mar 20 15:55:45 crc kubenswrapper[4730]: I0320 15:55:45.906016    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67135f13-b182-4e78-b64d-59e924cc6d06-utilities" (OuterVolumeSpecName: "utilities") pod "67135f13-b182-4e78-b64d-59e924cc6d06" (UID: "67135f13-b182-4e78-b64d-59e924cc6d06"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:55:45 crc kubenswrapper[4730]: I0320 15:55:45.920425    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67135f13-b182-4e78-b64d-59e924cc6d06-kube-api-access-kzjls" (OuterVolumeSpecName: "kube-api-access-kzjls") pod "67135f13-b182-4e78-b64d-59e924cc6d06" (UID: "67135f13-b182-4e78-b64d-59e924cc6d06"). InnerVolumeSpecName "kube-api-access-kzjls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:55:45 crc kubenswrapper[4730]: I0320 15:55:45.964079    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67135f13-b182-4e78-b64d-59e924cc6d06-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "67135f13-b182-4e78-b64d-59e924cc6d06" (UID: "67135f13-b182-4e78-b64d-59e924cc6d06"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:55:46 crc kubenswrapper[4730]: I0320 15:55:46.006358    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzjls\" (UniqueName: \"kubernetes.io/projected/67135f13-b182-4e78-b64d-59e924cc6d06-kube-api-access-kzjls\") on node \"crc\" DevicePath \"\""
Mar 20 15:55:46 crc kubenswrapper[4730]: I0320 15:55:46.006593    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67135f13-b182-4e78-b64d-59e924cc6d06-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 15:55:46 crc kubenswrapper[4730]: I0320 15:55:46.006652    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67135f13-b182-4e78-b64d-59e924cc6d06-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 15:55:46 crc kubenswrapper[4730]: I0320 15:55:46.623022    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ccfvp" event={"ID":"67135f13-b182-4e78-b64d-59e924cc6d06","Type":"ContainerDied","Data":"658a195d5ec687dd420baf8249b97d002c1f423d4390763b728a9e13cc7604b3"}
Mar 20 15:55:46 crc kubenswrapper[4730]: I0320 15:55:46.623094    4730 scope.go:117] "RemoveContainer" containerID="6f9edd02be6f2dfb433a49be2863aba5cd96a3fbb691894959161dfee263cc66"
Mar 20 15:55:46 crc kubenswrapper[4730]: I0320 15:55:46.623177    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ccfvp"
Mar 20 15:55:46 crc kubenswrapper[4730]: I0320 15:55:46.655585    4730 scope.go:117] "RemoveContainer" containerID="303fadbe91512351177d280b599a2b8cc89d2b5b2c6d9ee58ec15258affc5eb8"
Mar 20 15:55:46 crc kubenswrapper[4730]: I0320 15:55:46.677539    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ccfvp"]
Mar 20 15:55:46 crc kubenswrapper[4730]: I0320 15:55:46.678581    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ccfvp"]
Mar 20 15:55:46 crc kubenswrapper[4730]: I0320 15:55:46.700571    4730 scope.go:117] "RemoveContainer" containerID="0248869be7a45daf83343efdfb1d43f4b7d10b0b6b268922404b998ed0f2cd9e"
Mar 20 15:55:47 crc kubenswrapper[4730]: I0320 15:55:47.542582    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67135f13-b182-4e78-b64d-59e924cc6d06" path="/var/lib/kubelet/pods/67135f13-b182-4e78-b64d-59e924cc6d06/volumes"
Mar 20 15:55:52 crc kubenswrapper[4730]: I0320 15:55:52.824841    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-f9xcd"
Mar 20 15:55:52 crc kubenswrapper[4730]: I0320 15:55:52.825305    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-f9xcd"
Mar 20 15:55:52 crc kubenswrapper[4730]: I0320 15:55:52.863578    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-f9xcd"
Mar 20 15:55:53 crc kubenswrapper[4730]: I0320 15:55:53.706892    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-f9xcd"
Mar 20 15:56:00 crc kubenswrapper[4730]: I0320 15:56:00.132883    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567036-mnkd8"]
Mar 20 15:56:00 crc kubenswrapper[4730]: E0320 15:56:00.133659    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed34ebb3-3d6a-4ddf-8364-fd5b7baa6953" containerName="registry-server"
Mar 20 15:56:00 crc kubenswrapper[4730]: I0320 15:56:00.133671    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed34ebb3-3d6a-4ddf-8364-fd5b7baa6953" containerName="registry-server"
Mar 20 15:56:00 crc kubenswrapper[4730]: E0320 15:56:00.133679    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67135f13-b182-4e78-b64d-59e924cc6d06" containerName="extract-content"
Mar 20 15:56:00 crc kubenswrapper[4730]: I0320 15:56:00.133685    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="67135f13-b182-4e78-b64d-59e924cc6d06" containerName="extract-content"
Mar 20 15:56:00 crc kubenswrapper[4730]: E0320 15:56:00.133696    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67135f13-b182-4e78-b64d-59e924cc6d06" containerName="extract-utilities"
Mar 20 15:56:00 crc kubenswrapper[4730]: I0320 15:56:00.133703    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="67135f13-b182-4e78-b64d-59e924cc6d06" containerName="extract-utilities"
Mar 20 15:56:00 crc kubenswrapper[4730]: E0320 15:56:00.133713    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67135f13-b182-4e78-b64d-59e924cc6d06" containerName="registry-server"
Mar 20 15:56:00 crc kubenswrapper[4730]: I0320 15:56:00.133718    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="67135f13-b182-4e78-b64d-59e924cc6d06" containerName="registry-server"
Mar 20 15:56:00 crc kubenswrapper[4730]: I0320 15:56:00.133816    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="67135f13-b182-4e78-b64d-59e924cc6d06" containerName="registry-server"
Mar 20 15:56:00 crc kubenswrapper[4730]: I0320 15:56:00.133827    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed34ebb3-3d6a-4ddf-8364-fd5b7baa6953" containerName="registry-server"
Mar 20 15:56:00 crc kubenswrapper[4730]: I0320 15:56:00.134550    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567036-mnkd8"
Mar 20 15:56:00 crc kubenswrapper[4730]: I0320 15:56:00.136929    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 15:56:00 crc kubenswrapper[4730]: I0320 15:56:00.136998    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 15:56:00 crc kubenswrapper[4730]: I0320 15:56:00.137140    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567036-mnkd8"]
Mar 20 15:56:00 crc kubenswrapper[4730]: I0320 15:56:00.137228    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 15:56:00 crc kubenswrapper[4730]: I0320 15:56:00.200290    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw9d9\" (UniqueName: \"kubernetes.io/projected/889da12d-843a-4c71-8d48-cbb0360b024a-kube-api-access-xw9d9\") pod \"auto-csr-approver-29567036-mnkd8\" (UID: \"889da12d-843a-4c71-8d48-cbb0360b024a\") " pod="openshift-infra/auto-csr-approver-29567036-mnkd8"
Mar 20 15:56:00 crc kubenswrapper[4730]: I0320 15:56:00.301768    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw9d9\" (UniqueName: \"kubernetes.io/projected/889da12d-843a-4c71-8d48-cbb0360b024a-kube-api-access-xw9d9\") pod \"auto-csr-approver-29567036-mnkd8\" (UID: \"889da12d-843a-4c71-8d48-cbb0360b024a\") " pod="openshift-infra/auto-csr-approver-29567036-mnkd8"
Mar 20 15:56:00 crc kubenswrapper[4730]: I0320 15:56:00.318544    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw9d9\" (UniqueName: \"kubernetes.io/projected/889da12d-843a-4c71-8d48-cbb0360b024a-kube-api-access-xw9d9\") pod \"auto-csr-approver-29567036-mnkd8\" (UID: \"889da12d-843a-4c71-8d48-cbb0360b024a\") " pod="openshift-infra/auto-csr-approver-29567036-mnkd8"
Mar 20 15:56:00 crc kubenswrapper[4730]: I0320 15:56:00.450558    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567036-mnkd8"
Mar 20 15:56:00 crc kubenswrapper[4730]: I0320 15:56:00.870921    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567036-mnkd8"]
Mar 20 15:56:01 crc kubenswrapper[4730]: I0320 15:56:01.162216    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt"]
Mar 20 15:56:01 crc kubenswrapper[4730]: I0320 15:56:01.167622    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt"]
Mar 20 15:56:01 crc kubenswrapper[4730]: I0320 15:56:01.167718    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt"
Mar 20 15:56:01 crc kubenswrapper[4730]: I0320 15:56:01.169984    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-bznvq"
Mar 20 15:56:01 crc kubenswrapper[4730]: I0320 15:56:01.319375    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a-bundle\") pod \"6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt\" (UID: \"8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a\") " pod="openstack-operators/6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt"
Mar 20 15:56:01 crc kubenswrapper[4730]: I0320 15:56:01.319469    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a-util\") pod \"6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt\" (UID: \"8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a\") " pod="openstack-operators/6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt"
Mar 20 15:56:01 crc kubenswrapper[4730]: I0320 15:56:01.319593    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27dcc\" (UniqueName: \"kubernetes.io/projected/8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a-kube-api-access-27dcc\") pod \"6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt\" (UID: \"8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a\") " pod="openstack-operators/6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt"
Mar 20 15:56:01 crc kubenswrapper[4730]: I0320 15:56:01.420867    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a-bundle\") pod \"6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt\" (UID: \"8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a\") " pod="openstack-operators/6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt"
Mar 20 15:56:01 crc kubenswrapper[4730]: I0320 15:56:01.421145    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a-util\") pod \"6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt\" (UID: \"8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a\") " pod="openstack-operators/6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt"
Mar 20 15:56:01 crc kubenswrapper[4730]: I0320 15:56:01.421300    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27dcc\" (UniqueName: \"kubernetes.io/projected/8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a-kube-api-access-27dcc\") pod \"6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt\" (UID: \"8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a\") " pod="openstack-operators/6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt"
Mar 20 15:56:01 crc kubenswrapper[4730]: I0320 15:56:01.421768    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a-util\") pod \"6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt\" (UID: \"8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a\") " pod="openstack-operators/6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt"
Mar 20 15:56:01 crc kubenswrapper[4730]: I0320 15:56:01.421945    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a-bundle\") pod \"6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt\" (UID: \"8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a\") " pod="openstack-operators/6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt"
Mar 20 15:56:01 crc kubenswrapper[4730]: I0320 15:56:01.447017    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27dcc\" (UniqueName: \"kubernetes.io/projected/8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a-kube-api-access-27dcc\") pod \"6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt\" (UID: \"8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a\") " pod="openstack-operators/6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt"
Mar 20 15:56:01 crc kubenswrapper[4730]: I0320 15:56:01.488582    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-bznvq"
Mar 20 15:56:01 crc kubenswrapper[4730]: I0320 15:56:01.497630    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt"
Mar 20 15:56:01 crc kubenswrapper[4730]: I0320 15:56:01.716160    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt"]
Mar 20 15:56:01 crc kubenswrapper[4730]: W0320 15:56:01.722837    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b2334c1_9644_4fd3_9ea3_984ebcd8dc5a.slice/crio-39f50a0bde24a3fc4c9b9f752f5c3f4ddc1714b08bce4063cca2dec5f9e70995 WatchSource:0}: Error finding container 39f50a0bde24a3fc4c9b9f752f5c3f4ddc1714b08bce4063cca2dec5f9e70995: Status 404 returned error can't find the container with id 39f50a0bde24a3fc4c9b9f752f5c3f4ddc1714b08bce4063cca2dec5f9e70995
Mar 20 15:56:01 crc kubenswrapper[4730]: I0320 15:56:01.732945    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567036-mnkd8" event={"ID":"889da12d-843a-4c71-8d48-cbb0360b024a","Type":"ContainerStarted","Data":"1b28fd03418b6f353e6627eaef64c3d1916825ee1c8a46df8c737da3eb3b606a"}
Mar 20 15:56:02 crc kubenswrapper[4730]: I0320 15:56:02.739315    4730 generic.go:334] "Generic (PLEG): container finished" podID="8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a" containerID="8cf0199ce2da0157ec10f0f6d25409b2da9972c181577bd215ad3f0cf65e50f1" exitCode=0
Mar 20 15:56:02 crc kubenswrapper[4730]: I0320 15:56:02.739410    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt" event={"ID":"8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a","Type":"ContainerDied","Data":"8cf0199ce2da0157ec10f0f6d25409b2da9972c181577bd215ad3f0cf65e50f1"}
Mar 20 15:56:02 crc kubenswrapper[4730]: I0320 15:56:02.739694    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt" event={"ID":"8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a","Type":"ContainerStarted","Data":"39f50a0bde24a3fc4c9b9f752f5c3f4ddc1714b08bce4063cca2dec5f9e70995"}
Mar 20 15:56:02 crc kubenswrapper[4730]: I0320 15:56:02.743667    4730 generic.go:334] "Generic (PLEG): container finished" podID="889da12d-843a-4c71-8d48-cbb0360b024a" containerID="2ec54c009b326db4c49da642b8ab1232405aacb430ead248fe894a34dfe7c452" exitCode=0
Mar 20 15:56:02 crc kubenswrapper[4730]: I0320 15:56:02.743706    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567036-mnkd8" event={"ID":"889da12d-843a-4c71-8d48-cbb0360b024a","Type":"ContainerDied","Data":"2ec54c009b326db4c49da642b8ab1232405aacb430ead248fe894a34dfe7c452"}
Mar 20 15:56:03 crc kubenswrapper[4730]: I0320 15:56:03.752016    4730 generic.go:334] "Generic (PLEG): container finished" podID="8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a" containerID="94f44e77a5dcfdbd97bb6208822f02fb0b29dd74e63ad2a09b0cde739252bf64" exitCode=0
Mar 20 15:56:03 crc kubenswrapper[4730]: I0320 15:56:03.752068    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt" event={"ID":"8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a","Type":"ContainerDied","Data":"94f44e77a5dcfdbd97bb6208822f02fb0b29dd74e63ad2a09b0cde739252bf64"}
Mar 20 15:56:04 crc kubenswrapper[4730]: I0320 15:56:04.013087    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567036-mnkd8"
Mar 20 15:56:04 crc kubenswrapper[4730]: I0320 15:56:04.158513    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xw9d9\" (UniqueName: \"kubernetes.io/projected/889da12d-843a-4c71-8d48-cbb0360b024a-kube-api-access-xw9d9\") pod \"889da12d-843a-4c71-8d48-cbb0360b024a\" (UID: \"889da12d-843a-4c71-8d48-cbb0360b024a\") "
Mar 20 15:56:04 crc kubenswrapper[4730]: I0320 15:56:04.163879    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/889da12d-843a-4c71-8d48-cbb0360b024a-kube-api-access-xw9d9" (OuterVolumeSpecName: "kube-api-access-xw9d9") pod "889da12d-843a-4c71-8d48-cbb0360b024a" (UID: "889da12d-843a-4c71-8d48-cbb0360b024a"). InnerVolumeSpecName "kube-api-access-xw9d9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:56:04 crc kubenswrapper[4730]: I0320 15:56:04.260107    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xw9d9\" (UniqueName: \"kubernetes.io/projected/889da12d-843a-4c71-8d48-cbb0360b024a-kube-api-access-xw9d9\") on node \"crc\" DevicePath \"\""
Mar 20 15:56:04 crc kubenswrapper[4730]: I0320 15:56:04.761380    4730 generic.go:334] "Generic (PLEG): container finished" podID="8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a" containerID="30e4a30153a37b946ecc906e059164d0866e21ff8e26907ff74c60a8683c99e7" exitCode=0
Mar 20 15:56:04 crc kubenswrapper[4730]: I0320 15:56:04.761478    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt" event={"ID":"8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a","Type":"ContainerDied","Data":"30e4a30153a37b946ecc906e059164d0866e21ff8e26907ff74c60a8683c99e7"}
Mar 20 15:56:04 crc kubenswrapper[4730]: I0320 15:56:04.764705    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567036-mnkd8" event={"ID":"889da12d-843a-4c71-8d48-cbb0360b024a","Type":"ContainerDied","Data":"1b28fd03418b6f353e6627eaef64c3d1916825ee1c8a46df8c737da3eb3b606a"}
Mar 20 15:56:04 crc kubenswrapper[4730]: I0320 15:56:04.764740    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567036-mnkd8"
Mar 20 15:56:04 crc kubenswrapper[4730]: I0320 15:56:04.764752    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b28fd03418b6f353e6627eaef64c3d1916825ee1c8a46df8c737da3eb3b606a"
Mar 20 15:56:05 crc kubenswrapper[4730]: I0320 15:56:05.074292    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567030-pwdln"]
Mar 20 15:56:05 crc kubenswrapper[4730]: I0320 15:56:05.074963    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567030-pwdln"]
Mar 20 15:56:05 crc kubenswrapper[4730]: I0320 15:56:05.539537    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7dcd73b-be94-4b96-b001-593d2fd56aa3" path="/var/lib/kubelet/pods/b7dcd73b-be94-4b96-b001-593d2fd56aa3/volumes"
Mar 20 15:56:06 crc kubenswrapper[4730]: I0320 15:56:06.008347    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt"
Mar 20 15:56:06 crc kubenswrapper[4730]: I0320 15:56:06.083040    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27dcc\" (UniqueName: \"kubernetes.io/projected/8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a-kube-api-access-27dcc\") pod \"8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a\" (UID: \"8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a\") "
Mar 20 15:56:06 crc kubenswrapper[4730]: I0320 15:56:06.083203    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a-util\") pod \"8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a\" (UID: \"8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a\") "
Mar 20 15:56:06 crc kubenswrapper[4730]: I0320 15:56:06.083240    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a-bundle\") pod \"8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a\" (UID: \"8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a\") "
Mar 20 15:56:06 crc kubenswrapper[4730]: I0320 15:56:06.084010    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a-bundle" (OuterVolumeSpecName: "bundle") pod "8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a" (UID: "8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:56:06 crc kubenswrapper[4730]: I0320 15:56:06.093556    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a-kube-api-access-27dcc" (OuterVolumeSpecName: "kube-api-access-27dcc") pod "8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a" (UID: "8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a"). InnerVolumeSpecName "kube-api-access-27dcc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:56:06 crc kubenswrapper[4730]: I0320 15:56:06.096674    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a-util" (OuterVolumeSpecName: "util") pod "8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a" (UID: "8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:56:06 crc kubenswrapper[4730]: I0320 15:56:06.184838    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27dcc\" (UniqueName: \"kubernetes.io/projected/8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a-kube-api-access-27dcc\") on node \"crc\" DevicePath \"\""
Mar 20 15:56:06 crc kubenswrapper[4730]: I0320 15:56:06.184869    4730 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a-util\") on node \"crc\" DevicePath \"\""
Mar 20 15:56:06 crc kubenswrapper[4730]: I0320 15:56:06.184880    4730 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:56:06 crc kubenswrapper[4730]: I0320 15:56:06.779230    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt" event={"ID":"8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a","Type":"ContainerDied","Data":"39f50a0bde24a3fc4c9b9f752f5c3f4ddc1714b08bce4063cca2dec5f9e70995"}
Mar 20 15:56:06 crc kubenswrapper[4730]: I0320 15:56:06.779315    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39f50a0bde24a3fc4c9b9f752f5c3f4ddc1714b08bce4063cca2dec5f9e70995"
Mar 20 15:56:06 crc kubenswrapper[4730]: I0320 15:56:06.779362    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt"
Mar 20 15:56:13 crc kubenswrapper[4730]: I0320 15:56:13.829003    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-646f48576b-5p6h9"]
Mar 20 15:56:13 crc kubenswrapper[4730]: E0320 15:56:13.829583    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a" containerName="extract"
Mar 20 15:56:13 crc kubenswrapper[4730]: I0320 15:56:13.829596    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a" containerName="extract"
Mar 20 15:56:13 crc kubenswrapper[4730]: E0320 15:56:13.829614    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a" containerName="util"
Mar 20 15:56:13 crc kubenswrapper[4730]: I0320 15:56:13.829620    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a" containerName="util"
Mar 20 15:56:13 crc kubenswrapper[4730]: E0320 15:56:13.829633    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="889da12d-843a-4c71-8d48-cbb0360b024a" containerName="oc"
Mar 20 15:56:13 crc kubenswrapper[4730]: I0320 15:56:13.829639    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="889da12d-843a-4c71-8d48-cbb0360b024a" containerName="oc"
Mar 20 15:56:13 crc kubenswrapper[4730]: E0320 15:56:13.829651    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a" containerName="pull"
Mar 20 15:56:13 crc kubenswrapper[4730]: I0320 15:56:13.829656    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a" containerName="pull"
Mar 20 15:56:13 crc kubenswrapper[4730]: I0320 15:56:13.829755    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="889da12d-843a-4c71-8d48-cbb0360b024a" containerName="oc"
Mar 20 15:56:13 crc kubenswrapper[4730]: I0320 15:56:13.829767    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a" containerName="extract"
Mar 20 15:56:13 crc kubenswrapper[4730]: I0320 15:56:13.830138    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-646f48576b-5p6h9"
Mar 20 15:56:13 crc kubenswrapper[4730]: I0320 15:56:13.833232    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-pnth6"
Mar 20 15:56:13 crc kubenswrapper[4730]: I0320 15:56:13.855870    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-646f48576b-5p6h9"]
Mar 20 15:56:14 crc kubenswrapper[4730]: I0320 15:56:14.026663    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s9rh\" (UniqueName: \"kubernetes.io/projected/d85bb2c7-8dba-4091-a6cf-12cf58bf64a9-kube-api-access-5s9rh\") pod \"openstack-operator-controller-init-646f48576b-5p6h9\" (UID: \"d85bb2c7-8dba-4091-a6cf-12cf58bf64a9\") " pod="openstack-operators/openstack-operator-controller-init-646f48576b-5p6h9"
Mar 20 15:56:14 crc kubenswrapper[4730]: I0320 15:56:14.127662    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s9rh\" (UniqueName: \"kubernetes.io/projected/d85bb2c7-8dba-4091-a6cf-12cf58bf64a9-kube-api-access-5s9rh\") pod \"openstack-operator-controller-init-646f48576b-5p6h9\" (UID: \"d85bb2c7-8dba-4091-a6cf-12cf58bf64a9\") " pod="openstack-operators/openstack-operator-controller-init-646f48576b-5p6h9"
Mar 20 15:56:14 crc kubenswrapper[4730]: I0320 15:56:14.154483    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s9rh\" (UniqueName: \"kubernetes.io/projected/d85bb2c7-8dba-4091-a6cf-12cf58bf64a9-kube-api-access-5s9rh\") pod \"openstack-operator-controller-init-646f48576b-5p6h9\" (UID: \"d85bb2c7-8dba-4091-a6cf-12cf58bf64a9\") " pod="openstack-operators/openstack-operator-controller-init-646f48576b-5p6h9"
Mar 20 15:56:14 crc kubenswrapper[4730]: I0320 15:56:14.220932    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-646f48576b-5p6h9"
Mar 20 15:56:14 crc kubenswrapper[4730]: I0320 15:56:14.660609    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-646f48576b-5p6h9"]
Mar 20 15:56:14 crc kubenswrapper[4730]: W0320 15:56:14.677636    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd85bb2c7_8dba_4091_a6cf_12cf58bf64a9.slice/crio-6d30eeb068a680ff0fac19656d50ead28e6e692ed91f698dd2150db25654db6b WatchSource:0}: Error finding container 6d30eeb068a680ff0fac19656d50ead28e6e692ed91f698dd2150db25654db6b: Status 404 returned error can't find the container with id 6d30eeb068a680ff0fac19656d50ead28e6e692ed91f698dd2150db25654db6b
Mar 20 15:56:14 crc kubenswrapper[4730]: I0320 15:56:14.827360    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-646f48576b-5p6h9" event={"ID":"d85bb2c7-8dba-4091-a6cf-12cf58bf64a9","Type":"ContainerStarted","Data":"6d30eeb068a680ff0fac19656d50ead28e6e692ed91f698dd2150db25654db6b"}
Mar 20 15:56:18 crc kubenswrapper[4730]: I0320 15:56:18.854209    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-646f48576b-5p6h9" event={"ID":"d85bb2c7-8dba-4091-a6cf-12cf58bf64a9","Type":"ContainerStarted","Data":"ef52146127814dd76e89f2192a0ab5609649d658f1ece4c6ac4cec32b6563640"}
Mar 20 15:56:18 crc kubenswrapper[4730]: I0320 15:56:18.854774    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-646f48576b-5p6h9"
Mar 20 15:56:18 crc kubenswrapper[4730]: I0320 15:56:18.907197    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-646f48576b-5p6h9" podStartSLOduration=2.347632943 podStartE2EDuration="5.907176492s" podCreationTimestamp="2026-03-20 15:56:13 +0000 UTC" firstStartedPulling="2026-03-20 15:56:14.680970405 +0000 UTC m=+1033.894341774" lastFinishedPulling="2026-03-20 15:56:18.240513944 +0000 UTC m=+1037.453885323" observedRunningTime="2026-03-20 15:56:18.90184332 +0000 UTC m=+1038.115214689" watchObservedRunningTime="2026-03-20 15:56:18.907176492 +0000 UTC m=+1038.120547861"
Mar 20 15:56:24 crc kubenswrapper[4730]: I0320 15:56:24.224784    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-646f48576b-5p6h9"
Mar 20 15:56:28 crc kubenswrapper[4730]: I0320 15:56:28.498810    4730 scope.go:117] "RemoveContainer" containerID="db53fcef559ab1b37329ca537473be13177cc4e3055c12b3c5b8536921ff4616"
Mar 20 15:56:40 crc kubenswrapper[4730]: I0320 15:56:40.223659    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kgdl9"]
Mar 20 15:56:40 crc kubenswrapper[4730]: I0320 15:56:40.225749    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kgdl9"
Mar 20 15:56:40 crc kubenswrapper[4730]: I0320 15:56:40.236570    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kgdl9"]
Mar 20 15:56:40 crc kubenswrapper[4730]: I0320 15:56:40.314668    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2133f9df-adc5-426d-87eb-b229d518b130-utilities\") pod \"certified-operators-kgdl9\" (UID: \"2133f9df-adc5-426d-87eb-b229d518b130\") " pod="openshift-marketplace/certified-operators-kgdl9"
Mar 20 15:56:40 crc kubenswrapper[4730]: I0320 15:56:40.314712    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfpff\" (UniqueName: \"kubernetes.io/projected/2133f9df-adc5-426d-87eb-b229d518b130-kube-api-access-qfpff\") pod \"certified-operators-kgdl9\" (UID: \"2133f9df-adc5-426d-87eb-b229d518b130\") " pod="openshift-marketplace/certified-operators-kgdl9"
Mar 20 15:56:40 crc kubenswrapper[4730]: I0320 15:56:40.314732    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2133f9df-adc5-426d-87eb-b229d518b130-catalog-content\") pod \"certified-operators-kgdl9\" (UID: \"2133f9df-adc5-426d-87eb-b229d518b130\") " pod="openshift-marketplace/certified-operators-kgdl9"
Mar 20 15:56:40 crc kubenswrapper[4730]: I0320 15:56:40.415792    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2133f9df-adc5-426d-87eb-b229d518b130-utilities\") pod \"certified-operators-kgdl9\" (UID: \"2133f9df-adc5-426d-87eb-b229d518b130\") " pod="openshift-marketplace/certified-operators-kgdl9"
Mar 20 15:56:40 crc kubenswrapper[4730]: I0320 15:56:40.415845    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfpff\" (UniqueName: \"kubernetes.io/projected/2133f9df-adc5-426d-87eb-b229d518b130-kube-api-access-qfpff\") pod \"certified-operators-kgdl9\" (UID: \"2133f9df-adc5-426d-87eb-b229d518b130\") " pod="openshift-marketplace/certified-operators-kgdl9"
Mar 20 15:56:40 crc kubenswrapper[4730]: I0320 15:56:40.415866    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2133f9df-adc5-426d-87eb-b229d518b130-catalog-content\") pod \"certified-operators-kgdl9\" (UID: \"2133f9df-adc5-426d-87eb-b229d518b130\") " pod="openshift-marketplace/certified-operators-kgdl9"
Mar 20 15:56:40 crc kubenswrapper[4730]: I0320 15:56:40.416352    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2133f9df-adc5-426d-87eb-b229d518b130-utilities\") pod \"certified-operators-kgdl9\" (UID: \"2133f9df-adc5-426d-87eb-b229d518b130\") " pod="openshift-marketplace/certified-operators-kgdl9"
Mar 20 15:56:40 crc kubenswrapper[4730]: I0320 15:56:40.416381    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2133f9df-adc5-426d-87eb-b229d518b130-catalog-content\") pod \"certified-operators-kgdl9\" (UID: \"2133f9df-adc5-426d-87eb-b229d518b130\") " pod="openshift-marketplace/certified-operators-kgdl9"
Mar 20 15:56:40 crc kubenswrapper[4730]: I0320 15:56:40.439806    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfpff\" (UniqueName: \"kubernetes.io/projected/2133f9df-adc5-426d-87eb-b229d518b130-kube-api-access-qfpff\") pod \"certified-operators-kgdl9\" (UID: \"2133f9df-adc5-426d-87eb-b229d518b130\") " pod="openshift-marketplace/certified-operators-kgdl9"
Mar 20 15:56:40 crc kubenswrapper[4730]: I0320 15:56:40.543936    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kgdl9"
Mar 20 15:56:40 crc kubenswrapper[4730]: I0320 15:56:40.797217    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kgdl9"]
Mar 20 15:56:41 crc kubenswrapper[4730]: I0320 15:56:41.376934    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgdl9" event={"ID":"2133f9df-adc5-426d-87eb-b229d518b130","Type":"ContainerStarted","Data":"eb5c3ed708196e8c37c652b11aa1ad313db538615cda523b4337b38046c73b0f"}
Mar 20 15:56:42 crc kubenswrapper[4730]: I0320 15:56:42.383814    4730 generic.go:334] "Generic (PLEG): container finished" podID="2133f9df-adc5-426d-87eb-b229d518b130" containerID="48b55984ef597f25366eb00045958af43e53d6257da11c9de40c0685dc1b9bca" exitCode=0
Mar 20 15:56:42 crc kubenswrapper[4730]: I0320 15:56:42.383861    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgdl9" event={"ID":"2133f9df-adc5-426d-87eb-b229d518b130","Type":"ContainerDied","Data":"48b55984ef597f25366eb00045958af43e53d6257da11c9de40c0685dc1b9bca"}
Mar 20 15:56:43 crc kubenswrapper[4730]: I0320 15:56:43.394339    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgdl9" event={"ID":"2133f9df-adc5-426d-87eb-b229d518b130","Type":"ContainerStarted","Data":"f47a9232cf9f18f060d1f87bcd94f5bf265453d693e22fe8ffad53c140c9a15d"}
Mar 20 15:56:44 crc kubenswrapper[4730]: I0320 15:56:44.415333    4730 generic.go:334] "Generic (PLEG): container finished" podID="2133f9df-adc5-426d-87eb-b229d518b130" containerID="f47a9232cf9f18f060d1f87bcd94f5bf265453d693e22fe8ffad53c140c9a15d" exitCode=0
Mar 20 15:56:44 crc kubenswrapper[4730]: I0320 15:56:44.415562    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgdl9" event={"ID":"2133f9df-adc5-426d-87eb-b229d518b130","Type":"ContainerDied","Data":"f47a9232cf9f18f060d1f87bcd94f5bf265453d693e22fe8ffad53c140c9a15d"}
Mar 20 15:56:45 crc kubenswrapper[4730]: I0320 15:56:45.425290    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgdl9" event={"ID":"2133f9df-adc5-426d-87eb-b229d518b130","Type":"ContainerStarted","Data":"e30d27703acd54f286baa7767ed6b870660616b2cea2e85eb4333f75a1d87f96"}
Mar 20 15:56:45 crc kubenswrapper[4730]: I0320 15:56:45.446086    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kgdl9" podStartSLOduration=2.960585725 podStartE2EDuration="5.446066691s" podCreationTimestamp="2026-03-20 15:56:40 +0000 UTC" firstStartedPulling="2026-03-20 15:56:42.385415966 +0000 UTC m=+1061.598787335" lastFinishedPulling="2026-03-20 15:56:44.870896932 +0000 UTC m=+1064.084268301" observedRunningTime="2026-03-20 15:56:45.445594898 +0000 UTC m=+1064.658966267" watchObservedRunningTime="2026-03-20 15:56:45.446066691 +0000 UTC m=+1064.659438060"
Mar 20 15:56:50 crc kubenswrapper[4730]: I0320 15:56:50.544877    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kgdl9"
Mar 20 15:56:50 crc kubenswrapper[4730]: I0320 15:56:50.545324    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kgdl9"
Mar 20 15:56:50 crc kubenswrapper[4730]: I0320 15:56:50.646698    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kgdl9"
Mar 20 15:56:51 crc kubenswrapper[4730]: I0320 15:56:51.513182    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kgdl9"
Mar 20 15:56:51 crc kubenswrapper[4730]: I0320 15:56:51.566219    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kgdl9"]
Mar 20 15:56:53 crc kubenswrapper[4730]: I0320 15:56:53.482843    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kgdl9" podUID="2133f9df-adc5-426d-87eb-b229d518b130" containerName="registry-server" containerID="cri-o://e30d27703acd54f286baa7767ed6b870660616b2cea2e85eb4333f75a1d87f96" gracePeriod=2
Mar 20 15:56:53 crc kubenswrapper[4730]: E0320 15:56:53.610368    4730 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2133f9df_adc5_426d_87eb_b229d518b130.slice/crio-e30d27703acd54f286baa7767ed6b870660616b2cea2e85eb4333f75a1d87f96.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2133f9df_adc5_426d_87eb_b229d518b130.slice/crio-conmon-e30d27703acd54f286baa7767ed6b870660616b2cea2e85eb4333f75a1d87f96.scope\": RecentStats: unable to find data in memory cache]"
Mar 20 15:56:54 crc kubenswrapper[4730]: I0320 15:56:54.015515    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kgdl9"
Mar 20 15:56:54 crc kubenswrapper[4730]: I0320 15:56:54.116068    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2133f9df-adc5-426d-87eb-b229d518b130-catalog-content\") pod \"2133f9df-adc5-426d-87eb-b229d518b130\" (UID: \"2133f9df-adc5-426d-87eb-b229d518b130\") "
Mar 20 15:56:54 crc kubenswrapper[4730]: I0320 15:56:54.116177    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2133f9df-adc5-426d-87eb-b229d518b130-utilities\") pod \"2133f9df-adc5-426d-87eb-b229d518b130\" (UID: \"2133f9df-adc5-426d-87eb-b229d518b130\") "
Mar 20 15:56:54 crc kubenswrapper[4730]: I0320 15:56:54.116200    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfpff\" (UniqueName: \"kubernetes.io/projected/2133f9df-adc5-426d-87eb-b229d518b130-kube-api-access-qfpff\") pod \"2133f9df-adc5-426d-87eb-b229d518b130\" (UID: \"2133f9df-adc5-426d-87eb-b229d518b130\") "
Mar 20 15:56:54 crc kubenswrapper[4730]: I0320 15:56:54.117152    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2133f9df-adc5-426d-87eb-b229d518b130-utilities" (OuterVolumeSpecName: "utilities") pod "2133f9df-adc5-426d-87eb-b229d518b130" (UID: "2133f9df-adc5-426d-87eb-b229d518b130"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:56:54 crc kubenswrapper[4730]: I0320 15:56:54.125752    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2133f9df-adc5-426d-87eb-b229d518b130-kube-api-access-qfpff" (OuterVolumeSpecName: "kube-api-access-qfpff") pod "2133f9df-adc5-426d-87eb-b229d518b130" (UID: "2133f9df-adc5-426d-87eb-b229d518b130"). InnerVolumeSpecName "kube-api-access-qfpff". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:56:54 crc kubenswrapper[4730]: I0320 15:56:54.217352    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2133f9df-adc5-426d-87eb-b229d518b130-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 15:56:54 crc kubenswrapper[4730]: I0320 15:56:54.217384    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfpff\" (UniqueName: \"kubernetes.io/projected/2133f9df-adc5-426d-87eb-b229d518b130-kube-api-access-qfpff\") on node \"crc\" DevicePath \"\""
Mar 20 15:56:54 crc kubenswrapper[4730]: I0320 15:56:54.489972    4730 generic.go:334] "Generic (PLEG): container finished" podID="2133f9df-adc5-426d-87eb-b229d518b130" containerID="e30d27703acd54f286baa7767ed6b870660616b2cea2e85eb4333f75a1d87f96" exitCode=0
Mar 20 15:56:54 crc kubenswrapper[4730]: I0320 15:56:54.490014    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgdl9" event={"ID":"2133f9df-adc5-426d-87eb-b229d518b130","Type":"ContainerDied","Data":"e30d27703acd54f286baa7767ed6b870660616b2cea2e85eb4333f75a1d87f96"}
Mar 20 15:56:54 crc kubenswrapper[4730]: I0320 15:56:54.490065    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgdl9" event={"ID":"2133f9df-adc5-426d-87eb-b229d518b130","Type":"ContainerDied","Data":"eb5c3ed708196e8c37c652b11aa1ad313db538615cda523b4337b38046c73b0f"}
Mar 20 15:56:54 crc kubenswrapper[4730]: I0320 15:56:54.490064    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kgdl9"
Mar 20 15:56:54 crc kubenswrapper[4730]: I0320 15:56:54.490087    4730 scope.go:117] "RemoveContainer" containerID="e30d27703acd54f286baa7767ed6b870660616b2cea2e85eb4333f75a1d87f96"
Mar 20 15:56:54 crc kubenswrapper[4730]: I0320 15:56:54.508102    4730 scope.go:117] "RemoveContainer" containerID="f47a9232cf9f18f060d1f87bcd94f5bf265453d693e22fe8ffad53c140c9a15d"
Mar 20 15:56:54 crc kubenswrapper[4730]: I0320 15:56:54.523679    4730 scope.go:117] "RemoveContainer" containerID="48b55984ef597f25366eb00045958af43e53d6257da11c9de40c0685dc1b9bca"
Mar 20 15:56:54 crc kubenswrapper[4730]: I0320 15:56:54.545448    4730 scope.go:117] "RemoveContainer" containerID="e30d27703acd54f286baa7767ed6b870660616b2cea2e85eb4333f75a1d87f96"
Mar 20 15:56:54 crc kubenswrapper[4730]: E0320 15:56:54.546849    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e30d27703acd54f286baa7767ed6b870660616b2cea2e85eb4333f75a1d87f96\": container with ID starting with e30d27703acd54f286baa7767ed6b870660616b2cea2e85eb4333f75a1d87f96 not found: ID does not exist" containerID="e30d27703acd54f286baa7767ed6b870660616b2cea2e85eb4333f75a1d87f96"
Mar 20 15:56:54 crc kubenswrapper[4730]: I0320 15:56:54.546909    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e30d27703acd54f286baa7767ed6b870660616b2cea2e85eb4333f75a1d87f96"} err="failed to get container status \"e30d27703acd54f286baa7767ed6b870660616b2cea2e85eb4333f75a1d87f96\": rpc error: code = NotFound desc = could not find container \"e30d27703acd54f286baa7767ed6b870660616b2cea2e85eb4333f75a1d87f96\": container with ID starting with e30d27703acd54f286baa7767ed6b870660616b2cea2e85eb4333f75a1d87f96 not found: ID does not exist"
Mar 20 15:56:54 crc kubenswrapper[4730]: I0320 15:56:54.546944    4730 scope.go:117] "RemoveContainer" containerID="f47a9232cf9f18f060d1f87bcd94f5bf265453d693e22fe8ffad53c140c9a15d"
Mar 20 15:56:54 crc kubenswrapper[4730]: E0320 15:56:54.547411    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f47a9232cf9f18f060d1f87bcd94f5bf265453d693e22fe8ffad53c140c9a15d\": container with ID starting with f47a9232cf9f18f060d1f87bcd94f5bf265453d693e22fe8ffad53c140c9a15d not found: ID does not exist" containerID="f47a9232cf9f18f060d1f87bcd94f5bf265453d693e22fe8ffad53c140c9a15d"
Mar 20 15:56:54 crc kubenswrapper[4730]: I0320 15:56:54.547447    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f47a9232cf9f18f060d1f87bcd94f5bf265453d693e22fe8ffad53c140c9a15d"} err="failed to get container status \"f47a9232cf9f18f060d1f87bcd94f5bf265453d693e22fe8ffad53c140c9a15d\": rpc error: code = NotFound desc = could not find container \"f47a9232cf9f18f060d1f87bcd94f5bf265453d693e22fe8ffad53c140c9a15d\": container with ID starting with f47a9232cf9f18f060d1f87bcd94f5bf265453d693e22fe8ffad53c140c9a15d not found: ID does not exist"
Mar 20 15:56:54 crc kubenswrapper[4730]: I0320 15:56:54.547472    4730 scope.go:117] "RemoveContainer" containerID="48b55984ef597f25366eb00045958af43e53d6257da11c9de40c0685dc1b9bca"
Mar 20 15:56:54 crc kubenswrapper[4730]: E0320 15:56:54.547831    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48b55984ef597f25366eb00045958af43e53d6257da11c9de40c0685dc1b9bca\": container with ID starting with 48b55984ef597f25366eb00045958af43e53d6257da11c9de40c0685dc1b9bca not found: ID does not exist" containerID="48b55984ef597f25366eb00045958af43e53d6257da11c9de40c0685dc1b9bca"
Mar 20 15:56:54 crc kubenswrapper[4730]: I0320 15:56:54.547855    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48b55984ef597f25366eb00045958af43e53d6257da11c9de40c0685dc1b9bca"} err="failed to get container status \"48b55984ef597f25366eb00045958af43e53d6257da11c9de40c0685dc1b9bca\": rpc error: code = NotFound desc = could not find container \"48b55984ef597f25366eb00045958af43e53d6257da11c9de40c0685dc1b9bca\": container with ID starting with 48b55984ef597f25366eb00045958af43e53d6257da11c9de40c0685dc1b9bca not found: ID does not exist"
Mar 20 15:56:54 crc kubenswrapper[4730]: I0320 15:56:54.924068    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2133f9df-adc5-426d-87eb-b229d518b130-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2133f9df-adc5-426d-87eb-b229d518b130" (UID: "2133f9df-adc5-426d-87eb-b229d518b130"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:56:54 crc kubenswrapper[4730]: I0320 15:56:54.929930    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2133f9df-adc5-426d-87eb-b229d518b130-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 15:56:55 crc kubenswrapper[4730]: I0320 15:56:55.127346    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kgdl9"]
Mar 20 15:56:55 crc kubenswrapper[4730]: I0320 15:56:55.133391    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kgdl9"]
Mar 20 15:56:55 crc kubenswrapper[4730]: I0320 15:56:55.539869    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2133f9df-adc5-426d-87eb-b229d518b130" path="/var/lib/kubelet/pods/2133f9df-adc5-426d-87eb-b229d518b130/volumes"
Mar 20 15:56:59 crc kubenswrapper[4730]: I0320 15:56:59.907907    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-wqqnd"]
Mar 20 15:56:59 crc kubenswrapper[4730]: E0320 15:56:59.908190    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2133f9df-adc5-426d-87eb-b229d518b130" containerName="extract-utilities"
Mar 20 15:56:59 crc kubenswrapper[4730]: I0320 15:56:59.908206    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="2133f9df-adc5-426d-87eb-b229d518b130" containerName="extract-utilities"
Mar 20 15:56:59 crc kubenswrapper[4730]: E0320 15:56:59.908226    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2133f9df-adc5-426d-87eb-b229d518b130" containerName="extract-content"
Mar 20 15:56:59 crc kubenswrapper[4730]: I0320 15:56:59.908234    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="2133f9df-adc5-426d-87eb-b229d518b130" containerName="extract-content"
Mar 20 15:56:59 crc kubenswrapper[4730]: E0320 15:56:59.908264    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2133f9df-adc5-426d-87eb-b229d518b130" containerName="registry-server"
Mar 20 15:56:59 crc kubenswrapper[4730]: I0320 15:56:59.908273    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="2133f9df-adc5-426d-87eb-b229d518b130" containerName="registry-server"
Mar 20 15:56:59 crc kubenswrapper[4730]: I0320 15:56:59.908430    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="2133f9df-adc5-426d-87eb-b229d518b130" containerName="registry-server"
Mar 20 15:56:59 crc kubenswrapper[4730]: I0320 15:56:59.908936    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wqqnd"
Mar 20 15:56:59 crc kubenswrapper[4730]: I0320 15:56:59.910995    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-jtrs9"
Mar 20 15:56:59 crc kubenswrapper[4730]: I0320 15:56:59.913370    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-dmd8z"]
Mar 20 15:56:59 crc kubenswrapper[4730]: I0320 15:56:59.914278    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-dmd8z"
Mar 20 15:56:59 crc kubenswrapper[4730]: I0320 15:56:59.920813    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-tph25"
Mar 20 15:56:59 crc kubenswrapper[4730]: I0320 15:56:59.921424    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-wqqnd"]
Mar 20 15:56:59 crc kubenswrapper[4730]: I0320 15:56:59.925998    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-dmd8z"]
Mar 20 15:56:59 crc kubenswrapper[4730]: I0320 15:56:59.936355    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-nwwzc"]
Mar 20 15:56:59 crc kubenswrapper[4730]: I0320 15:56:59.937182    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-nwwzc"
Mar 20 15:56:59 crc kubenswrapper[4730]: I0320 15:56:59.943827    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-llp6b"]
Mar 20 15:56:59 crc kubenswrapper[4730]: I0320 15:56:59.946023    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-llp6b"
Mar 20 15:56:59 crc kubenswrapper[4730]: I0320 15:56:59.946617    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-lf45j"
Mar 20 15:56:59 crc kubenswrapper[4730]: I0320 15:56:59.947776    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-22gcs"
Mar 20 15:56:59 crc kubenswrapper[4730]: I0320 15:56:59.962625    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-nwwzc"]
Mar 20 15:56:59 crc kubenswrapper[4730]: I0320 15:56:59.972026    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-llp6b"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.000322    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-v96m5"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.001185    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-v96m5"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.006863    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-xddkf"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.016298    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-v96m5"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.046948    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-pf8sw"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.047786    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-pf8sw"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.047872    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-pf8sw"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.050477    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-k8gx9"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.066583    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-4pkr9"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.069002    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-4pkr9"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.078417    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.078684    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-hg8js"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.086197    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-9k6lh"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.087154    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-9k6lh"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.091853    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-rddvd"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.092681    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cgmv\" (UniqueName: \"kubernetes.io/projected/d658514c-f369-4ce2-ad50-d055fd208694-kube-api-access-7cgmv\") pod \"glance-operator-controller-manager-79df6bcc97-llp6b\" (UID: \"d658514c-f369-4ce2-ad50-d055fd208694\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-llp6b"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.092745    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfw6d\" (UniqueName: \"kubernetes.io/projected/e8ad6f56-863f-473b-a4d4-d4f70d9489a4-kube-api-access-nfw6d\") pod \"designate-operator-controller-manager-588d4d986b-nwwzc\" (UID: \"e8ad6f56-863f-473b-a4d4-d4f70d9489a4\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-nwwzc"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.092805    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fndw\" (UniqueName: \"kubernetes.io/projected/4fb51ed6-04e3-40db-ab21-eb0fe66442fe-kube-api-access-5fndw\") pod \"barbican-operator-controller-manager-59bc569d95-dmd8z\" (UID: \"4fb51ed6-04e3-40db-ab21-eb0fe66442fe\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-dmd8z"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.092847    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r4ml\" (UniqueName: \"kubernetes.io/projected/c5aaa9e9-aebc-4daa-b7ab-c6064b5a78ef-kube-api-access-6r4ml\") pod \"cinder-operator-controller-manager-8d58dc466-wqqnd\" (UID: \"c5aaa9e9-aebc-4daa-b7ab-c6064b5a78ef\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wqqnd"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.092884    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74dbz\" (UniqueName: \"kubernetes.io/projected/acffaecc-dd6c-4819-91cf-99c5d0154143-kube-api-access-74dbz\") pod \"heat-operator-controller-manager-67dd5f86f5-v96m5\" (UID: \"acffaecc-dd6c-4819-91cf-99c5d0154143\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-v96m5"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.103827    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-4pkr9"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.120446    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-9k6lh"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.131059    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-g4kgd"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.131879    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-g4kgd"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.134778    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-2cwsv"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.148370    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-g4kgd"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.160154    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-bqjxs"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.161514    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-bqjxs"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.166826    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-pvtt9"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.169346    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-rnx2d"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.172931    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-rnx2d"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.174746    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-647b7"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.191086    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-rnx2d"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.220195    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-xw6kk"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.220825    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krfh6\" (UniqueName: \"kubernetes.io/projected/24280954-941c-445f-aa52-e360ce544046-kube-api-access-krfh6\") pod \"ironic-operator-controller-manager-6f787dddc9-9k6lh\" (UID: \"24280954-941c-445f-aa52-e360ce544046\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-9k6lh"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.220908    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fndw\" (UniqueName: \"kubernetes.io/projected/4fb51ed6-04e3-40db-ab21-eb0fe66442fe-kube-api-access-5fndw\") pod \"barbican-operator-controller-manager-59bc569d95-dmd8z\" (UID: \"4fb51ed6-04e3-40db-ab21-eb0fe66442fe\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-dmd8z"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.221067    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c48t4\" (UniqueName: \"kubernetes.io/projected/d8b68e41-b53d-4fb3-8a86-0c604cda0e46-kube-api-access-c48t4\") pod \"infra-operator-controller-manager-7b9c774f96-4pkr9\" (UID: \"d8b68e41-b53d-4fb3-8a86-0c604cda0e46\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-4pkr9"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.221221    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r4ml\" (UniqueName: \"kubernetes.io/projected/c5aaa9e9-aebc-4daa-b7ab-c6064b5a78ef-kube-api-access-6r4ml\") pod \"cinder-operator-controller-manager-8d58dc466-wqqnd\" (UID: \"c5aaa9e9-aebc-4daa-b7ab-c6064b5a78ef\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wqqnd"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.221908    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74dbz\" (UniqueName: \"kubernetes.io/projected/acffaecc-dd6c-4819-91cf-99c5d0154143-kube-api-access-74dbz\") pod \"heat-operator-controller-manager-67dd5f86f5-v96m5\" (UID: \"acffaecc-dd6c-4819-91cf-99c5d0154143\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-v96m5"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.222126    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cgmv\" (UniqueName: \"kubernetes.io/projected/d658514c-f369-4ce2-ad50-d055fd208694-kube-api-access-7cgmv\") pod \"glance-operator-controller-manager-79df6bcc97-llp6b\" (UID: \"d658514c-f369-4ce2-ad50-d055fd208694\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-llp6b"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.222370    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rdf7\" (UniqueName: \"kubernetes.io/projected/f733406e-5258-4cfe-870d-4fb86152363e-kube-api-access-2rdf7\") pod \"horizon-operator-controller-manager-8464cc45fb-pf8sw\" (UID: \"f733406e-5258-4cfe-870d-4fb86152363e\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-pf8sw"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.222578    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfw6d\" (UniqueName: \"kubernetes.io/projected/e8ad6f56-863f-473b-a4d4-d4f70d9489a4-kube-api-access-nfw6d\") pod \"designate-operator-controller-manager-588d4d986b-nwwzc\" (UID: \"e8ad6f56-863f-473b-a4d4-d4f70d9489a4\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-nwwzc"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.222737    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8b68e41-b53d-4fb3-8a86-0c604cda0e46-cert\") pod \"infra-operator-controller-manager-7b9c774f96-4pkr9\" (UID: \"d8b68e41-b53d-4fb3-8a86-0c604cda0e46\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-4pkr9"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.243649    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xw6kk"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.257140    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-7cctn"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.263231    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fndw\" (UniqueName: \"kubernetes.io/projected/4fb51ed6-04e3-40db-ab21-eb0fe66442fe-kube-api-access-5fndw\") pod \"barbican-operator-controller-manager-59bc569d95-dmd8z\" (UID: \"4fb51ed6-04e3-40db-ab21-eb0fe66442fe\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-dmd8z"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.264162    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-dmd8z"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.266936    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74dbz\" (UniqueName: \"kubernetes.io/projected/acffaecc-dd6c-4819-91cf-99c5d0154143-kube-api-access-74dbz\") pod \"heat-operator-controller-manager-67dd5f86f5-v96m5\" (UID: \"acffaecc-dd6c-4819-91cf-99c5d0154143\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-v96m5"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.272854    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r4ml\" (UniqueName: \"kubernetes.io/projected/c5aaa9e9-aebc-4daa-b7ab-c6064b5a78ef-kube-api-access-6r4ml\") pod \"cinder-operator-controller-manager-8d58dc466-wqqnd\" (UID: \"c5aaa9e9-aebc-4daa-b7ab-c6064b5a78ef\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wqqnd"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.287139    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-bqjxs"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.296677    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfw6d\" (UniqueName: \"kubernetes.io/projected/e8ad6f56-863f-473b-a4d4-d4f70d9489a4-kube-api-access-nfw6d\") pod \"designate-operator-controller-manager-588d4d986b-nwwzc\" (UID: \"e8ad6f56-863f-473b-a4d4-d4f70d9489a4\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-nwwzc"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.299126    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-xw6kk"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.301758    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cgmv\" (UniqueName: \"kubernetes.io/projected/d658514c-f369-4ce2-ad50-d055fd208694-kube-api-access-7cgmv\") pod \"glance-operator-controller-manager-79df6bcc97-llp6b\" (UID: \"d658514c-f369-4ce2-ad50-d055fd208694\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-llp6b"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.322752    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-l7v9q"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.324077    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-l7v9q"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.325605    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsqkt\" (UniqueName: \"kubernetes.io/projected/cf3ded14-d81b-4384-93e4-e51cde6a31ec-kube-api-access-vsqkt\") pod \"keystone-operator-controller-manager-768b96df4c-g4kgd\" (UID: \"cf3ded14-d81b-4384-93e4-e51cde6a31ec\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-g4kgd"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.325674    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krfh6\" (UniqueName: \"kubernetes.io/projected/24280954-941c-445f-aa52-e360ce544046-kube-api-access-krfh6\") pod \"ironic-operator-controller-manager-6f787dddc9-9k6lh\" (UID: \"24280954-941c-445f-aa52-e360ce544046\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-9k6lh"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.325711    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kv24\" (UniqueName: \"kubernetes.io/projected/87b37583-ab1d-4f9e-98e9-8cb9bdcc5165-kube-api-access-8kv24\") pod \"manila-operator-controller-manager-55f864c847-bqjxs\" (UID: \"87b37583-ab1d-4f9e-98e9-8cb9bdcc5165\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-bqjxs"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.325733    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvkw4\" (UniqueName: \"kubernetes.io/projected/19a5ba3c-9f89-43f6-bd55-6998df2e3533-kube-api-access-tvkw4\") pod \"mariadb-operator-controller-manager-67ccfc9778-rnx2d\" (UID: \"19a5ba3c-9f89-43f6-bd55-6998df2e3533\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-rnx2d"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.325769    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c48t4\" (UniqueName: \"kubernetes.io/projected/d8b68e41-b53d-4fb3-8a86-0c604cda0e46-kube-api-access-c48t4\") pod \"infra-operator-controller-manager-7b9c774f96-4pkr9\" (UID: \"d8b68e41-b53d-4fb3-8a86-0c604cda0e46\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-4pkr9"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.325834    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rdf7\" (UniqueName: \"kubernetes.io/projected/f733406e-5258-4cfe-870d-4fb86152363e-kube-api-access-2rdf7\") pod \"horizon-operator-controller-manager-8464cc45fb-pf8sw\" (UID: \"f733406e-5258-4cfe-870d-4fb86152363e\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-pf8sw"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.325868    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8b68e41-b53d-4fb3-8a86-0c604cda0e46-cert\") pod \"infra-operator-controller-manager-7b9c774f96-4pkr9\" (UID: \"d8b68e41-b53d-4fb3-8a86-0c604cda0e46\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-4pkr9"
Mar 20 15:57:00 crc kubenswrapper[4730]: E0320 15:57:00.326975    4730 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 20 15:57:00 crc kubenswrapper[4730]: E0320 15:57:00.327049    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8b68e41-b53d-4fb3-8a86-0c604cda0e46-cert podName:d8b68e41-b53d-4fb3-8a86-0c604cda0e46 nodeName:}" failed. No retries permitted until 2026-03-20 15:57:00.82702877 +0000 UTC m=+1080.040400139 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8b68e41-b53d-4fb3-8a86-0c604cda0e46-cert") pod "infra-operator-controller-manager-7b9c774f96-4pkr9" (UID: "d8b68e41-b53d-4fb3-8a86-0c604cda0e46") : secret "infra-operator-webhook-server-cert" not found
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.328858    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-v96m5"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.336065    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-w8x5z"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.336938    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-w8x5z"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.343133    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-l7v9q"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.343881    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-lk442"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.345993    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-7d5bf"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.347706    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krfh6\" (UniqueName: \"kubernetes.io/projected/24280954-941c-445f-aa52-e360ce544046-kube-api-access-krfh6\") pod \"ironic-operator-controller-manager-6f787dddc9-9k6lh\" (UID: \"24280954-941c-445f-aa52-e360ce544046\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-9k6lh"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.347767    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-w8x5z"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.366893    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c48t4\" (UniqueName: \"kubernetes.io/projected/d8b68e41-b53d-4fb3-8a86-0c604cda0e46-kube-api-access-c48t4\") pod \"infra-operator-controller-manager-7b9c774f96-4pkr9\" (UID: \"d8b68e41-b53d-4fb3-8a86-0c604cda0e46\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-4pkr9"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.374214    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rdf7\" (UniqueName: \"kubernetes.io/projected/f733406e-5258-4cfe-870d-4fb86152363e-kube-api-access-2rdf7\") pod \"horizon-operator-controller-manager-8464cc45fb-pf8sw\" (UID: \"f733406e-5258-4cfe-870d-4fb86152363e\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-pf8sw"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.378981    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-pf8sw"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.383203    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-f8l2x"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.384424    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-f8l2x"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.388140    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.391043    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-57nkr"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.400909    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-t7kkm"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.402059    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-t7kkm"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.405956    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-g8rlc"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.416753    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-lt49w"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.417876    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-lt49w"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.420420    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-qsg4k"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.424531    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-9k6lh"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.433210    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-t7kkm"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.428135    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzrlg\" (UniqueName: \"kubernetes.io/projected/61755ffd-de91-4a38-a174-fe1a4c57dfd0-kube-api-access-zzrlg\") pod \"neutron-operator-controller-manager-767865f676-xw6kk\" (UID: \"61755ffd-de91-4a38-a174-fe1a4c57dfd0\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-xw6kk"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.434911    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsqkt\" (UniqueName: \"kubernetes.io/projected/cf3ded14-d81b-4384-93e4-e51cde6a31ec-kube-api-access-vsqkt\") pod \"keystone-operator-controller-manager-768b96df4c-g4kgd\" (UID: \"cf3ded14-d81b-4384-93e4-e51cde6a31ec\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-g4kgd"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.434978    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kv24\" (UniqueName: \"kubernetes.io/projected/87b37583-ab1d-4f9e-98e9-8cb9bdcc5165-kube-api-access-8kv24\") pod \"manila-operator-controller-manager-55f864c847-bqjxs\" (UID: \"87b37583-ab1d-4f9e-98e9-8cb9bdcc5165\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-bqjxs"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.434996    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2gcf\" (UniqueName: \"kubernetes.io/projected/36dd23cb-43b2-4c25-9e24-3e2f69f93eff-kube-api-access-l2gcf\") pod \"nova-operator-controller-manager-5d488d59fb-l7v9q\" (UID: \"36dd23cb-43b2-4c25-9e24-3e2f69f93eff\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-l7v9q"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.435017    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvkw4\" (UniqueName: \"kubernetes.io/projected/19a5ba3c-9f89-43f6-bd55-6998df2e3533-kube-api-access-tvkw4\") pod \"mariadb-operator-controller-manager-67ccfc9778-rnx2d\" (UID: \"19a5ba3c-9f89-43f6-bd55-6998df2e3533\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-rnx2d"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.435131    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkwmj\" (UniqueName: \"kubernetes.io/projected/d7ad408f-56db-4b5b-bea9-ba821eae2b80-kube-api-access-kkwmj\") pod \"octavia-operator-controller-manager-5b9f45d989-w8x5z\" (UID: \"d7ad408f-56db-4b5b-bea9-ba821eae2b80\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-w8x5z"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.455219    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-lt49w"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.460569    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kv24\" (UniqueName: \"kubernetes.io/projected/87b37583-ab1d-4f9e-98e9-8cb9bdcc5165-kube-api-access-8kv24\") pod \"manila-operator-controller-manager-55f864c847-bqjxs\" (UID: \"87b37583-ab1d-4f9e-98e9-8cb9bdcc5165\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-bqjxs"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.460640    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-6f2w8"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.461601    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6f2w8"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.463004    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-4znx8"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.464987    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvkw4\" (UniqueName: \"kubernetes.io/projected/19a5ba3c-9f89-43f6-bd55-6998df2e3533-kube-api-access-tvkw4\") pod \"mariadb-operator-controller-manager-67ccfc9778-rnx2d\" (UID: \"19a5ba3c-9f89-43f6-bd55-6998df2e3533\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-rnx2d"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.468027    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsqkt\" (UniqueName: \"kubernetes.io/projected/cf3ded14-d81b-4384-93e4-e51cde6a31ec-kube-api-access-vsqkt\") pod \"keystone-operator-controller-manager-768b96df4c-g4kgd\" (UID: \"cf3ded14-d81b-4384-93e4-e51cde6a31ec\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-g4kgd"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.475063    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-f8l2x"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.488466    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-bqjxs"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.510963    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-6f2w8"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.522153    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-rnx2d"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.527358    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-lrpjm"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.528204    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-lrpjm"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.530583    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-gbr7k"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.536151    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkwmj\" (UniqueName: \"kubernetes.io/projected/d7ad408f-56db-4b5b-bea9-ba821eae2b80-kube-api-access-kkwmj\") pod \"octavia-operator-controller-manager-5b9f45d989-w8x5z\" (UID: \"d7ad408f-56db-4b5b-bea9-ba821eae2b80\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-w8x5z"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.536191    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpdpg\" (UniqueName: \"kubernetes.io/projected/6944c865-92a4-441c-907b-27424898cb99-kube-api-access-rpdpg\") pod \"ovn-operator-controller-manager-884679f54-t7kkm\" (UID: \"6944c865-92a4-441c-907b-27424898cb99\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-t7kkm"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.536229    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8f74be61-d309-417c-90a3-2962b57071c4-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-f8l2x\" (UID: \"8f74be61-d309-417c-90a3-2962b57071c4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-f8l2x"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.536270    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzrlg\" (UniqueName: \"kubernetes.io/projected/61755ffd-de91-4a38-a174-fe1a4c57dfd0-kube-api-access-zzrlg\") pod \"neutron-operator-controller-manager-767865f676-xw6kk\" (UID: \"61755ffd-de91-4a38-a174-fe1a4c57dfd0\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-xw6kk"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.536303    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm4mm\" (UniqueName: \"kubernetes.io/projected/9944d85d-4f1c-4312-ac57-49ee75a8fd16-kube-api-access-xm4mm\") pod \"placement-operator-controller-manager-5784578c99-lt49w\" (UID: \"9944d85d-4f1c-4312-ac57-49ee75a8fd16\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-lt49w"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.536332    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2gcf\" (UniqueName: \"kubernetes.io/projected/36dd23cb-43b2-4c25-9e24-3e2f69f93eff-kube-api-access-l2gcf\") pod \"nova-operator-controller-manager-5d488d59fb-l7v9q\" (UID: \"36dd23cb-43b2-4c25-9e24-3e2f69f93eff\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-l7v9q"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.536374    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wrrk\" (UniqueName: \"kubernetes.io/projected/8f74be61-d309-417c-90a3-2962b57071c4-kube-api-access-7wrrk\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-f8l2x\" (UID: \"8f74be61-d309-417c-90a3-2962b57071c4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-f8l2x"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.536917    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wqqnd"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.542920    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-lrpjm"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.569199    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-bm7hr"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.572579    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-bm7hr"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.573223    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-nwwzc"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.578700    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-2nljx"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.578839    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2gcf\" (UniqueName: \"kubernetes.io/projected/36dd23cb-43b2-4c25-9e24-3e2f69f93eff-kube-api-access-l2gcf\") pod \"nova-operator-controller-manager-5d488d59fb-l7v9q\" (UID: \"36dd23cb-43b2-4c25-9e24-3e2f69f93eff\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-l7v9q"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.579816    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-bm7hr"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.580561    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzrlg\" (UniqueName: \"kubernetes.io/projected/61755ffd-de91-4a38-a174-fe1a4c57dfd0-kube-api-access-zzrlg\") pod \"neutron-operator-controller-manager-767865f676-xw6kk\" (UID: \"61755ffd-de91-4a38-a174-fe1a4c57dfd0\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-xw6kk"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.583532    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-llp6b"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.593952    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkwmj\" (UniqueName: \"kubernetes.io/projected/d7ad408f-56db-4b5b-bea9-ba821eae2b80-kube-api-access-kkwmj\") pod \"octavia-operator-controller-manager-5b9f45d989-w8x5z\" (UID: \"d7ad408f-56db-4b5b-bea9-ba821eae2b80\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-w8x5z"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.637890    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wrrk\" (UniqueName: \"kubernetes.io/projected/8f74be61-d309-417c-90a3-2962b57071c4-kube-api-access-7wrrk\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-f8l2x\" (UID: \"8f74be61-d309-417c-90a3-2962b57071c4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-f8l2x"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.637979    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpdpg\" (UniqueName: \"kubernetes.io/projected/6944c865-92a4-441c-907b-27424898cb99-kube-api-access-rpdpg\") pod \"ovn-operator-controller-manager-884679f54-t7kkm\" (UID: \"6944c865-92a4-441c-907b-27424898cb99\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-t7kkm"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.638044    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8f74be61-d309-417c-90a3-2962b57071c4-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-f8l2x\" (UID: \"8f74be61-d309-417c-90a3-2962b57071c4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-f8l2x"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.638147    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm4mm\" (UniqueName: \"kubernetes.io/projected/9944d85d-4f1c-4312-ac57-49ee75a8fd16-kube-api-access-xm4mm\") pod \"placement-operator-controller-manager-5784578c99-lt49w\" (UID: \"9944d85d-4f1c-4312-ac57-49ee75a8fd16\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-lt49w"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.638181    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jrvs\" (UniqueName: \"kubernetes.io/projected/db4a9305-eefd-4804-ac7a-4d811bd928f5-kube-api-access-5jrvs\") pod \"swift-operator-controller-manager-c674c5965-6f2w8\" (UID: \"db4a9305-eefd-4804-ac7a-4d811bd928f5\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-6f2w8"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.638239    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5h4l\" (UniqueName: \"kubernetes.io/projected/cdbd62c8-9960-4257-87d9-d4923c7ef8dd-kube-api-access-d5h4l\") pod \"telemetry-operator-controller-manager-d6b694c5-lrpjm\" (UID: \"cdbd62c8-9960-4257-87d9-d4923c7ef8dd\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-lrpjm"
Mar 20 15:57:00 crc kubenswrapper[4730]: E0320 15:57:00.638827    4730 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 20 15:57:00 crc kubenswrapper[4730]: E0320 15:57:00.638881    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f74be61-d309-417c-90a3-2962b57071c4-cert podName:8f74be61-d309-417c-90a3-2962b57071c4 nodeName:}" failed. No retries permitted until 2026-03-20 15:57:01.138864376 +0000 UTC m=+1080.352235745 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8f74be61-d309-417c-90a3-2962b57071c4-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-f8l2x" (UID: "8f74be61-d309-417c-90a3-2962b57071c4") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.676074    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wrrk\" (UniqueName: \"kubernetes.io/projected/8f74be61-d309-417c-90a3-2962b57071c4-kube-api-access-7wrrk\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-f8l2x\" (UID: \"8f74be61-d309-417c-90a3-2962b57071c4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-f8l2x"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.679051    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpdpg\" (UniqueName: \"kubernetes.io/projected/6944c865-92a4-441c-907b-27424898cb99-kube-api-access-rpdpg\") pod \"ovn-operator-controller-manager-884679f54-t7kkm\" (UID: \"6944c865-92a4-441c-907b-27424898cb99\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-t7kkm"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.691163    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm4mm\" (UniqueName: \"kubernetes.io/projected/9944d85d-4f1c-4312-ac57-49ee75a8fd16-kube-api-access-xm4mm\") pod \"placement-operator-controller-manager-5784578c99-lt49w\" (UID: \"9944d85d-4f1c-4312-ac57-49ee75a8fd16\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-lt49w"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.722053    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c5858c67b-cfmtk"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.723379    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c5858c67b-cfmtk"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.730827    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xw6kk"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.743869    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jrvs\" (UniqueName: \"kubernetes.io/projected/db4a9305-eefd-4804-ac7a-4d811bd928f5-kube-api-access-5jrvs\") pod \"swift-operator-controller-manager-c674c5965-6f2w8\" (UID: \"db4a9305-eefd-4804-ac7a-4d811bd928f5\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-6f2w8"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.743913    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5h4l\" (UniqueName: \"kubernetes.io/projected/cdbd62c8-9960-4257-87d9-d4923c7ef8dd-kube-api-access-d5h4l\") pod \"telemetry-operator-controller-manager-d6b694c5-lrpjm\" (UID: \"cdbd62c8-9960-4257-87d9-d4923c7ef8dd\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-lrpjm"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.743947    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkvtm\" (UniqueName: \"kubernetes.io/projected/92c29eff-b9ab-4420-86c6-6b388cfc87af-kube-api-access-tkvtm\") pod \"test-operator-controller-manager-5c5cb9c4d7-bm7hr\" (UID: \"92c29eff-b9ab-4420-86c6-6b388cfc87af\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-bm7hr"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.754381    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c5858c67b-cfmtk"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.754641    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-g4kgd"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.757352    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-n27l7"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.785991    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-l7v9q"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.822815    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jrvs\" (UniqueName: \"kubernetes.io/projected/db4a9305-eefd-4804-ac7a-4d811bd928f5-kube-api-access-5jrvs\") pod \"swift-operator-controller-manager-c674c5965-6f2w8\" (UID: \"db4a9305-eefd-4804-ac7a-4d811bd928f5\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-6f2w8"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.846166    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6f58c59cbb-76ssq"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.847774    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6f58c59cbb-76ssq"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.858485    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w6bk\" (UniqueName: \"kubernetes.io/projected/f00b4813-358d-49c4-bf9d-486e35f5a94f-kube-api-access-7w6bk\") pod \"watcher-operator-controller-manager-6c5858c67b-cfmtk\" (UID: \"f00b4813-358d-49c4-bf9d-486e35f5a94f\") " pod="openstack-operators/watcher-operator-controller-manager-6c5858c67b-cfmtk"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.858593    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8b68e41-b53d-4fb3-8a86-0c604cda0e46-cert\") pod \"infra-operator-controller-manager-7b9c774f96-4pkr9\" (UID: \"d8b68e41-b53d-4fb3-8a86-0c604cda0e46\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-4pkr9"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.858660    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkvtm\" (UniqueName: \"kubernetes.io/projected/92c29eff-b9ab-4420-86c6-6b388cfc87af-kube-api-access-tkvtm\") pod \"test-operator-controller-manager-5c5cb9c4d7-bm7hr\" (UID: \"92c29eff-b9ab-4420-86c6-6b388cfc87af\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-bm7hr"
Mar 20 15:57:00 crc kubenswrapper[4730]: E0320 15:57:00.859659    4730 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 20 15:57:00 crc kubenswrapper[4730]: E0320 15:57:00.859730    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8b68e41-b53d-4fb3-8a86-0c604cda0e46-cert podName:d8b68e41-b53d-4fb3-8a86-0c604cda0e46 nodeName:}" failed. No retries permitted until 2026-03-20 15:57:01.859711007 +0000 UTC m=+1081.073082376 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8b68e41-b53d-4fb3-8a86-0c604cda0e46-cert") pod "infra-operator-controller-manager-7b9c774f96-4pkr9" (UID: "d8b68e41-b53d-4fb3-8a86-0c604cda0e46") : secret "infra-operator-webhook-server-cert" not found
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.862919    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.869870    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-w8x5z"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.870328    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-nctfr"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.878851    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-t7kkm"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.870361    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.859672    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5h4l\" (UniqueName: \"kubernetes.io/projected/cdbd62c8-9960-4257-87d9-d4923c7ef8dd-kube-api-access-d5h4l\") pod \"telemetry-operator-controller-manager-d6b694c5-lrpjm\" (UID: \"cdbd62c8-9960-4257-87d9-d4923c7ef8dd\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-lrpjm"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.909921    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkvtm\" (UniqueName: \"kubernetes.io/projected/92c29eff-b9ab-4420-86c6-6b388cfc87af-kube-api-access-tkvtm\") pod \"test-operator-controller-manager-5c5cb9c4d7-bm7hr\" (UID: \"92c29eff-b9ab-4420-86c6-6b388cfc87af\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-bm7hr"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.910079    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-lt49w"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.944554    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6f58c59cbb-76ssq"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.974390    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6f2w8"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.975427    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-metrics-certs\") pod \"openstack-operator-controller-manager-6f58c59cbb-76ssq\" (UID: \"c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008\") " pod="openstack-operators/openstack-operator-controller-manager-6f58c59cbb-76ssq"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.975463    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-webhook-certs\") pod \"openstack-operator-controller-manager-6f58c59cbb-76ssq\" (UID: \"c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008\") " pod="openstack-operators/openstack-operator-controller-manager-6f58c59cbb-76ssq"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.975511    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w6bk\" (UniqueName: \"kubernetes.io/projected/f00b4813-358d-49c4-bf9d-486e35f5a94f-kube-api-access-7w6bk\") pod \"watcher-operator-controller-manager-6c5858c67b-cfmtk\" (UID: \"f00b4813-358d-49c4-bf9d-486e35f5a94f\") " pod="openstack-operators/watcher-operator-controller-manager-6c5858c67b-cfmtk"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.975564    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99p9c\" (UniqueName: \"kubernetes.io/projected/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-kube-api-access-99p9c\") pod \"openstack-operator-controller-manager-6f58c59cbb-76ssq\" (UID: \"c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008\") " pod="openstack-operators/openstack-operator-controller-manager-6f58c59cbb-76ssq"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.977513    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mdzv5"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.978419    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mdzv5"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.983724    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mdzv5"]
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.991618    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-lrpjm"
Mar 20 15:57:00 crc kubenswrapper[4730]: I0320 15:57:00.994539    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-7j9w6"
Mar 20 15:57:01 crc kubenswrapper[4730]: I0320 15:57:00.997371    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-dmd8z"]
Mar 20 15:57:01 crc kubenswrapper[4730]: I0320 15:57:01.003748    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w6bk\" (UniqueName: \"kubernetes.io/projected/f00b4813-358d-49c4-bf9d-486e35f5a94f-kube-api-access-7w6bk\") pod \"watcher-operator-controller-manager-6c5858c67b-cfmtk\" (UID: \"f00b4813-358d-49c4-bf9d-486e35f5a94f\") " pod="openstack-operators/watcher-operator-controller-manager-6c5858c67b-cfmtk"
Mar 20 15:57:01 crc kubenswrapper[4730]: I0320 15:57:01.022462    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-bm7hr"
Mar 20 15:57:01 crc kubenswrapper[4730]: I0320 15:57:01.061796    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c5858c67b-cfmtk"
Mar 20 15:57:01 crc kubenswrapper[4730]: I0320 15:57:01.079021    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-metrics-certs\") pod \"openstack-operator-controller-manager-6f58c59cbb-76ssq\" (UID: \"c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008\") " pod="openstack-operators/openstack-operator-controller-manager-6f58c59cbb-76ssq"
Mar 20 15:57:01 crc kubenswrapper[4730]: I0320 15:57:01.079077    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-webhook-certs\") pod \"openstack-operator-controller-manager-6f58c59cbb-76ssq\" (UID: \"c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008\") " pod="openstack-operators/openstack-operator-controller-manager-6f58c59cbb-76ssq"
Mar 20 15:57:01 crc kubenswrapper[4730]: I0320 15:57:01.079161    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz65q\" (UniqueName: \"kubernetes.io/projected/82cae974-2029-42c3-81bf-e9bee167e991-kube-api-access-hz65q\") pod \"rabbitmq-cluster-operator-manager-668c99d594-mdzv5\" (UID: \"82cae974-2029-42c3-81bf-e9bee167e991\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mdzv5"
Mar 20 15:57:01 crc kubenswrapper[4730]: I0320 15:57:01.079228    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99p9c\" (UniqueName: \"kubernetes.io/projected/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-kube-api-access-99p9c\") pod \"openstack-operator-controller-manager-6f58c59cbb-76ssq\" (UID: \"c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008\") " pod="openstack-operators/openstack-operator-controller-manager-6f58c59cbb-76ssq"
Mar 20 15:57:01 crc kubenswrapper[4730]: E0320 15:57:01.079391    4730 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 20 15:57:01 crc kubenswrapper[4730]: E0320 15:57:01.079455    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-webhook-certs podName:c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008 nodeName:}" failed. No retries permitted until 2026-03-20 15:57:01.579438515 +0000 UTC m=+1080.792809884 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-webhook-certs") pod "openstack-operator-controller-manager-6f58c59cbb-76ssq" (UID: "c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008") : secret "webhook-server-cert" not found
Mar 20 15:57:01 crc kubenswrapper[4730]: E0320 15:57:01.079174    4730 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 20 15:57:01 crc kubenswrapper[4730]: E0320 15:57:01.079691    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-metrics-certs podName:c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008 nodeName:}" failed. No retries permitted until 2026-03-20 15:57:01.579670662 +0000 UTC m=+1080.793042031 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-metrics-certs") pod "openstack-operator-controller-manager-6f58c59cbb-76ssq" (UID: "c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008") : secret "metrics-server-cert" not found
Mar 20 15:57:01 crc kubenswrapper[4730]: I0320 15:57:01.120500    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99p9c\" (UniqueName: \"kubernetes.io/projected/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-kube-api-access-99p9c\") pod \"openstack-operator-controller-manager-6f58c59cbb-76ssq\" (UID: \"c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008\") " pod="openstack-operators/openstack-operator-controller-manager-6f58c59cbb-76ssq"
Mar 20 15:57:01 crc kubenswrapper[4730]: I0320 15:57:01.180862    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8f74be61-d309-417c-90a3-2962b57071c4-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-f8l2x\" (UID: \"8f74be61-d309-417c-90a3-2962b57071c4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-f8l2x"
Mar 20 15:57:01 crc kubenswrapper[4730]: I0320 15:57:01.180916    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz65q\" (UniqueName: \"kubernetes.io/projected/82cae974-2029-42c3-81bf-e9bee167e991-kube-api-access-hz65q\") pod \"rabbitmq-cluster-operator-manager-668c99d594-mdzv5\" (UID: \"82cae974-2029-42c3-81bf-e9bee167e991\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mdzv5"
Mar 20 15:57:01 crc kubenswrapper[4730]: E0320 15:57:01.181358    4730 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 20 15:57:01 crc kubenswrapper[4730]: E0320 15:57:01.181434    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f74be61-d309-417c-90a3-2962b57071c4-cert podName:8f74be61-d309-417c-90a3-2962b57071c4 nodeName:}" failed. No retries permitted until 2026-03-20 15:57:02.181416495 +0000 UTC m=+1081.394787864 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8f74be61-d309-417c-90a3-2962b57071c4-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-f8l2x" (UID: "8f74be61-d309-417c-90a3-2962b57071c4") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 20 15:57:01 crc kubenswrapper[4730]: I0320 15:57:01.204807    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz65q\" (UniqueName: \"kubernetes.io/projected/82cae974-2029-42c3-81bf-e9bee167e991-kube-api-access-hz65q\") pod \"rabbitmq-cluster-operator-manager-668c99d594-mdzv5\" (UID: \"82cae974-2029-42c3-81bf-e9bee167e991\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mdzv5"
Mar 20 15:57:01 crc kubenswrapper[4730]: I0320 15:57:01.239726    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-v96m5"]
Mar 20 15:57:01 crc kubenswrapper[4730]: I0320 15:57:01.349958    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-pf8sw"]
Mar 20 15:57:01 crc kubenswrapper[4730]: I0320 15:57:01.377808    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mdzv5"
Mar 20 15:57:01 crc kubenswrapper[4730]: I0320 15:57:01.584142    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-v96m5" event={"ID":"acffaecc-dd6c-4819-91cf-99c5d0154143","Type":"ContainerStarted","Data":"13d001283395a603cd311d6bbb865c68956f3919996ffd420b77272abed644bc"}
Mar 20 15:57:01 crc kubenswrapper[4730]: I0320 15:57:01.584504    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-pf8sw" event={"ID":"f733406e-5258-4cfe-870d-4fb86152363e","Type":"ContainerStarted","Data":"c06c8dafbc5f1431a6c3449ac140f7b0107a38e11283f93df1b4982555086cf7"}
Mar 20 15:57:01 crc kubenswrapper[4730]: I0320 15:57:01.585635    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-dmd8z" event={"ID":"4fb51ed6-04e3-40db-ab21-eb0fe66442fe","Type":"ContainerStarted","Data":"d1a9c5ffc6adbb662f6b3059cb05c6bc3d05624788e0fc7aed8e7aca54dff196"}
Mar 20 15:57:01 crc kubenswrapper[4730]: I0320 15:57:01.590695    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-metrics-certs\") pod \"openstack-operator-controller-manager-6f58c59cbb-76ssq\" (UID: \"c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008\") " pod="openstack-operators/openstack-operator-controller-manager-6f58c59cbb-76ssq"
Mar 20 15:57:01 crc kubenswrapper[4730]: I0320 15:57:01.590738    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-webhook-certs\") pod \"openstack-operator-controller-manager-6f58c59cbb-76ssq\" (UID: \"c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008\") " pod="openstack-operators/openstack-operator-controller-manager-6f58c59cbb-76ssq"
Mar 20 15:57:01 crc kubenswrapper[4730]: E0320 15:57:01.590932    4730 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 20 15:57:01 crc kubenswrapper[4730]: E0320 15:57:01.591000    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-webhook-certs podName:c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008 nodeName:}" failed. No retries permitted until 2026-03-20 15:57:02.590961377 +0000 UTC m=+1081.804332746 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-webhook-certs") pod "openstack-operator-controller-manager-6f58c59cbb-76ssq" (UID: "c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008") : secret "webhook-server-cert" not found
Mar 20 15:57:01 crc kubenswrapper[4730]: E0320 15:57:01.591073    4730 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 20 15:57:01 crc kubenswrapper[4730]: E0320 15:57:01.591099    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-metrics-certs podName:c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008 nodeName:}" failed. No retries permitted until 2026-03-20 15:57:02.591092361 +0000 UTC m=+1081.804463730 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-metrics-certs") pod "openstack-operator-controller-manager-6f58c59cbb-76ssq" (UID: "c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008") : secret "metrics-server-cert" not found
Mar 20 15:57:01 crc kubenswrapper[4730]: I0320 15:57:01.905483    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8b68e41-b53d-4fb3-8a86-0c604cda0e46-cert\") pod \"infra-operator-controller-manager-7b9c774f96-4pkr9\" (UID: \"d8b68e41-b53d-4fb3-8a86-0c604cda0e46\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-4pkr9"
Mar 20 15:57:01 crc kubenswrapper[4730]: E0320 15:57:01.905697    4730 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 20 15:57:01 crc kubenswrapper[4730]: E0320 15:57:01.905752    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8b68e41-b53d-4fb3-8a86-0c604cda0e46-cert podName:d8b68e41-b53d-4fb3-8a86-0c604cda0e46 nodeName:}" failed. No retries permitted until 2026-03-20 15:57:03.905737857 +0000 UTC m=+1083.119109226 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8b68e41-b53d-4fb3-8a86-0c604cda0e46-cert") pod "infra-operator-controller-manager-7b9c774f96-4pkr9" (UID: "d8b68e41-b53d-4fb3-8a86-0c604cda0e46") : secret "infra-operator-webhook-server-cert" not found
Mar 20 15:57:02 crc kubenswrapper[4730]: I0320 15:57:01.997891    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-9k6lh"]
Mar 20 15:57:02 crc kubenswrapper[4730]: I0320 15:57:02.017874    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-bqjxs"]
Mar 20 15:57:02 crc kubenswrapper[4730]: W0320 15:57:02.029276    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87b37583_ab1d_4f9e_98e9_8cb9bdcc5165.slice/crio-e8d7eff02b8b76372a54743ad9d731405d54eb260e6f785ad1cadc3653509179 WatchSource:0}: Error finding container e8d7eff02b8b76372a54743ad9d731405d54eb260e6f785ad1cadc3653509179: Status 404 returned error can't find the container with id e8d7eff02b8b76372a54743ad9d731405d54eb260e6f785ad1cadc3653509179
Mar 20 15:57:02 crc kubenswrapper[4730]: I0320 15:57:02.227434    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8f74be61-d309-417c-90a3-2962b57071c4-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-f8l2x\" (UID: \"8f74be61-d309-417c-90a3-2962b57071c4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-f8l2x"
Mar 20 15:57:02 crc kubenswrapper[4730]: E0320 15:57:02.227617    4730 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 20 15:57:02 crc kubenswrapper[4730]: E0320 15:57:02.227699    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f74be61-d309-417c-90a3-2962b57071c4-cert podName:8f74be61-d309-417c-90a3-2962b57071c4 nodeName:}" failed. No retries permitted until 2026-03-20 15:57:04.227675062 +0000 UTC m=+1083.441046481 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8f74be61-d309-417c-90a3-2962b57071c4-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-f8l2x" (UID: "8f74be61-d309-417c-90a3-2962b57071c4") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 20 15:57:02 crc kubenswrapper[4730]: I0320 15:57:02.388675    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-t7kkm"]
Mar 20 15:57:02 crc kubenswrapper[4730]: I0320 15:57:02.431442    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-xw6kk"]
Mar 20 15:57:02 crc kubenswrapper[4730]: I0320 15:57:02.471183    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-g4kgd"]
Mar 20 15:57:02 crc kubenswrapper[4730]: I0320 15:57:02.482877    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-rnx2d"]
Mar 20 15:57:02 crc kubenswrapper[4730]: I0320 15:57:02.497981    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-llp6b"]
Mar 20 15:57:02 crc kubenswrapper[4730]: I0320 15:57:02.520386    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-lt49w"]
Mar 20 15:57:02 crc kubenswrapper[4730]: I0320 15:57:02.539740    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-lrpjm"]
Mar 20 15:57:02 crc kubenswrapper[4730]: I0320 15:57:02.563362    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-nwwzc"]
Mar 20 15:57:02 crc kubenswrapper[4730]: I0320 15:57:02.577348    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-l7v9q"]
Mar 20 15:57:02 crc kubenswrapper[4730]: I0320 15:57:02.588703    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-6f2w8"]
Mar 20 15:57:02 crc kubenswrapper[4730]: I0320 15:57:02.611954    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-w8x5z"]
Mar 20 15:57:02 crc kubenswrapper[4730]: I0320 15:57:02.624327    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-bm7hr"]
Mar 20 15:57:02 crc kubenswrapper[4730]: I0320 15:57:02.624394    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-wqqnd"]
Mar 20 15:57:02 crc kubenswrapper[4730]: W0320 15:57:02.630843    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb4a9305_eefd_4804_ac7a_4d811bd928f5.slice/crio-9cb30f7561968d0045bfd9720752e9e4b74aedd5bd67f8abf662c7010c4d21fd WatchSource:0}: Error finding container 9cb30f7561968d0045bfd9720752e9e4b74aedd5bd67f8abf662c7010c4d21fd: Status 404 returned error can't find the container with id 9cb30f7561968d0045bfd9720752e9e4b74aedd5bd67f8abf662c7010c4d21fd
Mar 20 15:57:02 crc kubenswrapper[4730]: I0320 15:57:02.633549    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-lt49w" event={"ID":"9944d85d-4f1c-4312-ac57-49ee75a8fd16","Type":"ContainerStarted","Data":"e8fdc344e225b28c02f725d643a077fdf7d7f73b111b57f938236e92ee02ba2f"}
Mar 20 15:57:02 crc kubenswrapper[4730]: I0320 15:57:02.636338    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c5858c67b-cfmtk"]
Mar 20 15:57:02 crc kubenswrapper[4730]: I0320 15:57:02.642161    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-metrics-certs\") pod \"openstack-operator-controller-manager-6f58c59cbb-76ssq\" (UID: \"c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008\") " pod="openstack-operators/openstack-operator-controller-manager-6f58c59cbb-76ssq"
Mar 20 15:57:02 crc kubenswrapper[4730]: I0320 15:57:02.642210    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-webhook-certs\") pod \"openstack-operator-controller-manager-6f58c59cbb-76ssq\" (UID: \"c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008\") " pod="openstack-operators/openstack-operator-controller-manager-6f58c59cbb-76ssq"
Mar 20 15:57:02 crc kubenswrapper[4730]: E0320 15:57:02.643029    4730 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 20 15:57:02 crc kubenswrapper[4730]: E0320 15:57:02.643108    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-webhook-certs podName:c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008 nodeName:}" failed. No retries permitted until 2026-03-20 15:57:04.643088223 +0000 UTC m=+1083.856459592 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-webhook-certs") pod "openstack-operator-controller-manager-6f58c59cbb-76ssq" (UID: "c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008") : secret "webhook-server-cert" not found
Mar 20 15:57:02 crc kubenswrapper[4730]: E0320 15:57:02.643156    4730 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 20 15:57:02 crc kubenswrapper[4730]: E0320 15:57:02.643178    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-metrics-certs podName:c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008 nodeName:}" failed. No retries permitted until 2026-03-20 15:57:04.643170455 +0000 UTC m=+1083.856541824 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-metrics-certs") pod "openstack-operator-controller-manager-6f58c59cbb-76ssq" (UID: "c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008") : secret "metrics-server-cert" not found
Mar 20 15:57:02 crc kubenswrapper[4730]: I0320 15:57:02.644150    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-9k6lh" event={"ID":"24280954-941c-445f-aa52-e360ce544046","Type":"ContainerStarted","Data":"6c59770b79346d41c65371ffa59c7bedc5774bd27cf057465db667dfca45ebe3"}
Mar 20 15:57:02 crc kubenswrapper[4730]: I0320 15:57:02.694390    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xw6kk" event={"ID":"61755ffd-de91-4a38-a174-fe1a4c57dfd0","Type":"ContainerStarted","Data":"58a5658fce2da71a575534b0605f321b73b382661cbc5abeff444b1dc640bae0"}
Mar 20 15:57:02 crc kubenswrapper[4730]: I0320 15:57:02.694749    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mdzv5"]
Mar 20 15:57:02 crc kubenswrapper[4730]: E0320 15:57:02.696748    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {}  BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {}  BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nfw6d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-588d4d986b-nwwzc_openstack-operators(e8ad6f56-863f-473b-a4d4-d4f70d9489a4): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Mar 20 15:57:02 crc kubenswrapper[4730]: E0320 15:57:02.699064    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-nwwzc" podUID="e8ad6f56-863f-473b-a4d4-d4f70d9489a4"
Mar 20 15:57:02 crc kubenswrapper[4730]: E0320 15:57:02.700939    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:d8210bb21d4d298271a7b43f92fe58789393546e616aaaec1ce71bb2a754e777,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {}  BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {}  BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6r4ml,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-8d58dc466-wqqnd_openstack-operators(c5aaa9e9-aebc-4daa-b7ab-c6064b5a78ef): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Mar 20 15:57:02 crc kubenswrapper[4730]: E0320 15:57:02.701173    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.147:5001/openstack-k8s-operators/watcher-operator:ee00c2d330b27d46c48ac29a20680b56ca50df3c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {}  BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {}  BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7w6bk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c5858c67b-cfmtk_openstack-operators(f00b4813-358d-49c4-bf9d-486e35f5a94f): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Mar 20 15:57:02 crc kubenswrapper[4730]: E0320 15:57:02.704654    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c5858c67b-cfmtk" podUID="f00b4813-358d-49c4-bf9d-486e35f5a94f"
Mar 20 15:57:02 crc kubenswrapper[4730]: E0320 15:57:02.704697    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wqqnd" podUID="c5aaa9e9-aebc-4daa-b7ab-c6064b5a78ef"
Mar 20 15:57:02 crc kubenswrapper[4730]: I0320 15:57:02.711448    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-g4kgd" event={"ID":"cf3ded14-d81b-4384-93e4-e51cde6a31ec","Type":"ContainerStarted","Data":"bc855432d1a2fe31aa565f0f52bf4a30b404c62e959d0a58ae7bc441629b60c2"}
Mar 20 15:57:02 crc kubenswrapper[4730]: I0320 15:57:02.717539    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-t7kkm" event={"ID":"6944c865-92a4-441c-907b-27424898cb99","Type":"ContainerStarted","Data":"3034cb64846551ae2909b23f5eb9dcd7905f4b99d08ea71aba65ddfaa5696724"}
Mar 20 15:57:02 crc kubenswrapper[4730]: I0320 15:57:02.731758    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-l7v9q" event={"ID":"36dd23cb-43b2-4c25-9e24-3e2f69f93eff","Type":"ContainerStarted","Data":"72490f86974170aaec1959afaeaed04d8cf9dfbe52ed67d0a2ca0df16df2af51"}
Mar 20 15:57:02 crc kubenswrapper[4730]: E0320 15:57:02.738469    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {}  BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hz65q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-mdzv5_openstack-operators(82cae974-2029-42c3-81bf-e9bee167e991): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Mar 20 15:57:02 crc kubenswrapper[4730]: E0320 15:57:02.739556    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mdzv5" podUID="82cae974-2029-42c3-81bf-e9bee167e991"
Mar 20 15:57:02 crc kubenswrapper[4730]: I0320 15:57:02.739633    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-bqjxs" event={"ID":"87b37583-ab1d-4f9e-98e9-8cb9bdcc5165","Type":"ContainerStarted","Data":"e8d7eff02b8b76372a54743ad9d731405d54eb260e6f785ad1cadc3653509179"}
Mar 20 15:57:03 crc kubenswrapper[4730]: I0320 15:57:03.781760    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-w8x5z" event={"ID":"d7ad408f-56db-4b5b-bea9-ba821eae2b80","Type":"ContainerStarted","Data":"38b85e5df71c3d01a6abf19999f1840430fce94b3ac1abee5e699f0c0326a5d4"}
Mar 20 15:57:03 crc kubenswrapper[4730]: I0320 15:57:03.797387    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6f2w8" event={"ID":"db4a9305-eefd-4804-ac7a-4d811bd928f5","Type":"ContainerStarted","Data":"9cb30f7561968d0045bfd9720752e9e4b74aedd5bd67f8abf662c7010c4d21fd"}
Mar 20 15:57:03 crc kubenswrapper[4730]: I0320 15:57:03.798822    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-lrpjm" event={"ID":"cdbd62c8-9960-4257-87d9-d4923c7ef8dd","Type":"ContainerStarted","Data":"581a1a413334b45b9a22d9be8ff637ffea65dd102aafa5ad0261cc439f275cee"}
Mar 20 15:57:03 crc kubenswrapper[4730]: I0320 15:57:03.799830    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-rnx2d" event={"ID":"19a5ba3c-9f89-43f6-bd55-6998df2e3533","Type":"ContainerStarted","Data":"b93f7cbc0548701886f61a08ae3045c3ac97aca16082cd09f17ae6f91edbe0fd"}
Mar 20 15:57:03 crc kubenswrapper[4730]: I0320 15:57:03.802869    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-nwwzc" event={"ID":"e8ad6f56-863f-473b-a4d4-d4f70d9489a4","Type":"ContainerStarted","Data":"5b1fc862b3295795aec3d62f2d172208c4189c4a71c75481fd169af14696d7c5"}
Mar 20 15:57:03 crc kubenswrapper[4730]: E0320 15:57:03.817587    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad\\\"\"" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-nwwzc" podUID="e8ad6f56-863f-473b-a4d4-d4f70d9489a4"
Mar 20 15:57:03 crc kubenswrapper[4730]: I0320 15:57:03.842535    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wqqnd" event={"ID":"c5aaa9e9-aebc-4daa-b7ab-c6064b5a78ef","Type":"ContainerStarted","Data":"a7fea7c7594813ef3dcae93e9bcbb34de39b987c788f796f8de9678836f63fe2"}
Mar 20 15:57:03 crc kubenswrapper[4730]: E0320 15:57:03.864668    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:d8210bb21d4d298271a7b43f92fe58789393546e616aaaec1ce71bb2a754e777\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wqqnd" podUID="c5aaa9e9-aebc-4daa-b7ab-c6064b5a78ef"
Mar 20 15:57:03 crc kubenswrapper[4730]: I0320 15:57:03.868091    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-llp6b" event={"ID":"d658514c-f369-4ce2-ad50-d055fd208694","Type":"ContainerStarted","Data":"f079d7f9ca767ac45da6e292b1bcb8f259c63524f1476c80f910852835296af9"}
Mar 20 15:57:03 crc kubenswrapper[4730]: I0320 15:57:03.905109    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c5858c67b-cfmtk" event={"ID":"f00b4813-358d-49c4-bf9d-486e35f5a94f","Type":"ContainerStarted","Data":"6b95d0657ecda55b2b17bbf5242d6b63c057c1735705995ac8dacddff38955c5"}
Mar 20 15:57:03 crc kubenswrapper[4730]: E0320 15:57:03.914753    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.147:5001/openstack-k8s-operators/watcher-operator:ee00c2d330b27d46c48ac29a20680b56ca50df3c\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c5858c67b-cfmtk" podUID="f00b4813-358d-49c4-bf9d-486e35f5a94f"
Mar 20 15:57:03 crc kubenswrapper[4730]: I0320 15:57:03.954467    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mdzv5" event={"ID":"82cae974-2029-42c3-81bf-e9bee167e991","Type":"ContainerStarted","Data":"0e906f2f4a3a756fd43e424eebf93c5dceee8ef81562506c0f023281c9ffa95b"}
Mar 20 15:57:03 crc kubenswrapper[4730]: E0320 15:57:03.963396    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mdzv5" podUID="82cae974-2029-42c3-81bf-e9bee167e991"
Mar 20 15:57:03 crc kubenswrapper[4730]: I0320 15:57:03.965621    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-bm7hr" event={"ID":"92c29eff-b9ab-4420-86c6-6b388cfc87af","Type":"ContainerStarted","Data":"2360582b0caa13e57019f3e37d3c50b63cf2025f962d5b3b40517954b864fc8c"}
Mar 20 15:57:04 crc kubenswrapper[4730]: I0320 15:57:04.000920    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8b68e41-b53d-4fb3-8a86-0c604cda0e46-cert\") pod \"infra-operator-controller-manager-7b9c774f96-4pkr9\" (UID: \"d8b68e41-b53d-4fb3-8a86-0c604cda0e46\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-4pkr9"
Mar 20 15:57:04 crc kubenswrapper[4730]: E0320 15:57:04.001065    4730 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 20 15:57:04 crc kubenswrapper[4730]: E0320 15:57:04.001107    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8b68e41-b53d-4fb3-8a86-0c604cda0e46-cert podName:d8b68e41-b53d-4fb3-8a86-0c604cda0e46 nodeName:}" failed. No retries permitted until 2026-03-20 15:57:08.001093635 +0000 UTC m=+1087.214465014 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8b68e41-b53d-4fb3-8a86-0c604cda0e46-cert") pod "infra-operator-controller-manager-7b9c774f96-4pkr9" (UID: "d8b68e41-b53d-4fb3-8a86-0c604cda0e46") : secret "infra-operator-webhook-server-cert" not found
Mar 20 15:57:04 crc kubenswrapper[4730]: I0320 15:57:04.318022    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8f74be61-d309-417c-90a3-2962b57071c4-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-f8l2x\" (UID: \"8f74be61-d309-417c-90a3-2962b57071c4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-f8l2x"
Mar 20 15:57:04 crc kubenswrapper[4730]: E0320 15:57:04.318268    4730 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 20 15:57:04 crc kubenswrapper[4730]: E0320 15:57:04.318432    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f74be61-d309-417c-90a3-2962b57071c4-cert podName:8f74be61-d309-417c-90a3-2962b57071c4 nodeName:}" failed. No retries permitted until 2026-03-20 15:57:08.318416107 +0000 UTC m=+1087.531787476 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8f74be61-d309-417c-90a3-2962b57071c4-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-f8l2x" (UID: "8f74be61-d309-417c-90a3-2962b57071c4") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 20 15:57:04 crc kubenswrapper[4730]: I0320 15:57:04.736529    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-metrics-certs\") pod \"openstack-operator-controller-manager-6f58c59cbb-76ssq\" (UID: \"c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008\") " pod="openstack-operators/openstack-operator-controller-manager-6f58c59cbb-76ssq"
Mar 20 15:57:04 crc kubenswrapper[4730]: I0320 15:57:04.736571    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-webhook-certs\") pod \"openstack-operator-controller-manager-6f58c59cbb-76ssq\" (UID: \"c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008\") " pod="openstack-operators/openstack-operator-controller-manager-6f58c59cbb-76ssq"
Mar 20 15:57:04 crc kubenswrapper[4730]: E0320 15:57:04.736730    4730 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 20 15:57:04 crc kubenswrapper[4730]: E0320 15:57:04.736783    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-webhook-certs podName:c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008 nodeName:}" failed. No retries permitted until 2026-03-20 15:57:08.736769462 +0000 UTC m=+1087.950140831 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-webhook-certs") pod "openstack-operator-controller-manager-6f58c59cbb-76ssq" (UID: "c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008") : secret "webhook-server-cert" not found
Mar 20 15:57:04 crc kubenswrapper[4730]: E0320 15:57:04.737089    4730 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 20 15:57:04 crc kubenswrapper[4730]: E0320 15:57:04.737157    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-metrics-certs podName:c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008 nodeName:}" failed. No retries permitted until 2026-03-20 15:57:08.737106542 +0000 UTC m=+1087.950477911 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-metrics-certs") pod "openstack-operator-controller-manager-6f58c59cbb-76ssq" (UID: "c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008") : secret "metrics-server-cert" not found
Mar 20 15:57:04 crc kubenswrapper[4730]: E0320 15:57:04.990131    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad\\\"\"" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-nwwzc" podUID="e8ad6f56-863f-473b-a4d4-d4f70d9489a4"
Mar 20 15:57:04 crc kubenswrapper[4730]: E0320 15:57:04.990805    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.147:5001/openstack-k8s-operators/watcher-operator:ee00c2d330b27d46c48ac29a20680b56ca50df3c\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c5858c67b-cfmtk" podUID="f00b4813-358d-49c4-bf9d-486e35f5a94f"
Mar 20 15:57:04 crc kubenswrapper[4730]: E0320 15:57:04.990912    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mdzv5" podUID="82cae974-2029-42c3-81bf-e9bee167e991"
Mar 20 15:57:04 crc kubenswrapper[4730]: E0320 15:57:04.991331    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:d8210bb21d4d298271a7b43f92fe58789393546e616aaaec1ce71bb2a754e777\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wqqnd" podUID="c5aaa9e9-aebc-4daa-b7ab-c6064b5a78ef"
Mar 20 15:57:08 crc kubenswrapper[4730]: I0320 15:57:08.098597    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8b68e41-b53d-4fb3-8a86-0c604cda0e46-cert\") pod \"infra-operator-controller-manager-7b9c774f96-4pkr9\" (UID: \"d8b68e41-b53d-4fb3-8a86-0c604cda0e46\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-4pkr9"
Mar 20 15:57:08 crc kubenswrapper[4730]: E0320 15:57:08.098762    4730 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 20 15:57:08 crc kubenswrapper[4730]: E0320 15:57:08.099129    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8b68e41-b53d-4fb3-8a86-0c604cda0e46-cert podName:d8b68e41-b53d-4fb3-8a86-0c604cda0e46 nodeName:}" failed. No retries permitted until 2026-03-20 15:57:16.099088953 +0000 UTC m=+1095.312460322 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8b68e41-b53d-4fb3-8a86-0c604cda0e46-cert") pod "infra-operator-controller-manager-7b9c774f96-4pkr9" (UID: "d8b68e41-b53d-4fb3-8a86-0c604cda0e46") : secret "infra-operator-webhook-server-cert" not found
Mar 20 15:57:08 crc kubenswrapper[4730]: I0320 15:57:08.403567    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8f74be61-d309-417c-90a3-2962b57071c4-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-f8l2x\" (UID: \"8f74be61-d309-417c-90a3-2962b57071c4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-f8l2x"
Mar 20 15:57:08 crc kubenswrapper[4730]: E0320 15:57:08.403741    4730 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 20 15:57:08 crc kubenswrapper[4730]: E0320 15:57:08.403820    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f74be61-d309-417c-90a3-2962b57071c4-cert podName:8f74be61-d309-417c-90a3-2962b57071c4 nodeName:}" failed. No retries permitted until 2026-03-20 15:57:16.403801416 +0000 UTC m=+1095.617172785 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8f74be61-d309-417c-90a3-2962b57071c4-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-f8l2x" (UID: "8f74be61-d309-417c-90a3-2962b57071c4") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 20 15:57:08 crc kubenswrapper[4730]: I0320 15:57:08.807913    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-metrics-certs\") pod \"openstack-operator-controller-manager-6f58c59cbb-76ssq\" (UID: \"c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008\") " pod="openstack-operators/openstack-operator-controller-manager-6f58c59cbb-76ssq"
Mar 20 15:57:08 crc kubenswrapper[4730]: I0320 15:57:08.807956    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-webhook-certs\") pod \"openstack-operator-controller-manager-6f58c59cbb-76ssq\" (UID: \"c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008\") " pod="openstack-operators/openstack-operator-controller-manager-6f58c59cbb-76ssq"
Mar 20 15:57:08 crc kubenswrapper[4730]: E0320 15:57:08.808112    4730 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 20 15:57:08 crc kubenswrapper[4730]: E0320 15:57:08.808157    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-webhook-certs podName:c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008 nodeName:}" failed. No retries permitted until 2026-03-20 15:57:16.80814336 +0000 UTC m=+1096.021514729 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-webhook-certs") pod "openstack-operator-controller-manager-6f58c59cbb-76ssq" (UID: "c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008") : secret "webhook-server-cert" not found
Mar 20 15:57:08 crc kubenswrapper[4730]: E0320 15:57:08.808459    4730 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 20 15:57:08 crc kubenswrapper[4730]: E0320 15:57:08.808488    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-metrics-certs podName:c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008 nodeName:}" failed. No retries permitted until 2026-03-20 15:57:16.80848074 +0000 UTC m=+1096.021852109 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-metrics-certs") pod "openstack-operator-controller-manager-6f58c59cbb-76ssq" (UID: "c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008") : secret "metrics-server-cert" not found
Mar 20 15:57:12 crc kubenswrapper[4730]: I0320 15:57:12.880024    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 15:57:12 crc kubenswrapper[4730]: I0320 15:57:12.880666    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 15:57:14 crc kubenswrapper[4730]: E0320 15:57:14.748090    4730 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113"
Mar 20 15:57:14 crc kubenswrapper[4730]: E0320 15:57:14.748754    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {}  BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {}  BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2rdf7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-8464cc45fb-pf8sw_openstack-operators(f733406e-5258-4cfe-870d-4fb86152363e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 15:57:14 crc kubenswrapper[4730]: E0320 15:57:14.750062    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-pf8sw" podUID="f733406e-5258-4cfe-870d-4fb86152363e"
Mar 20 15:57:15 crc kubenswrapper[4730]: E0320 15:57:15.053295    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-pf8sw" podUID="f733406e-5258-4cfe-870d-4fb86152363e"
Mar 20 15:57:15 crc kubenswrapper[4730]: E0320 15:57:15.322043    4730 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444"
Mar 20 15:57:15 crc kubenswrapper[4730]: E0320 15:57:15.322197    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {}  BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {}  BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d5h4l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-lrpjm_openstack-operators(cdbd62c8-9960-4257-87d9-d4923c7ef8dd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 15:57:15 crc kubenswrapper[4730]: E0320 15:57:15.323370    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-lrpjm" podUID="cdbd62c8-9960-4257-87d9-d4923c7ef8dd"
Mar 20 15:57:15 crc kubenswrapper[4730]: E0320 15:57:15.844760    4730 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56"
Mar 20 15:57:15 crc kubenswrapper[4730]: E0320 15:57:15.845300    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {}  BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {}  BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vsqkt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-g4kgd_openstack-operators(cf3ded14-d81b-4384-93e4-e51cde6a31ec): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 15:57:15 crc kubenswrapper[4730]: E0320 15:57:15.846573    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-g4kgd" podUID="cf3ded14-d81b-4384-93e4-e51cde6a31ec"
Mar 20 15:57:16 crc kubenswrapper[4730]: E0320 15:57:16.059795    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-g4kgd" podUID="cf3ded14-d81b-4384-93e4-e51cde6a31ec"
Mar 20 15:57:16 crc kubenswrapper[4730]: E0320 15:57:16.059779    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-lrpjm" podUID="cdbd62c8-9960-4257-87d9-d4923c7ef8dd"
Mar 20 15:57:16 crc kubenswrapper[4730]: I0320 15:57:16.116082    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8b68e41-b53d-4fb3-8a86-0c604cda0e46-cert\") pod \"infra-operator-controller-manager-7b9c774f96-4pkr9\" (UID: \"d8b68e41-b53d-4fb3-8a86-0c604cda0e46\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-4pkr9"
Mar 20 15:57:16 crc kubenswrapper[4730]: I0320 15:57:16.131289    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8b68e41-b53d-4fb3-8a86-0c604cda0e46-cert\") pod \"infra-operator-controller-manager-7b9c774f96-4pkr9\" (UID: \"d8b68e41-b53d-4fb3-8a86-0c604cda0e46\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-4pkr9"
Mar 20 15:57:16 crc kubenswrapper[4730]: I0320 15:57:16.305741    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-hg8js"
Mar 20 15:57:16 crc kubenswrapper[4730]: I0320 15:57:16.314945    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-4pkr9"
Mar 20 15:57:16 crc kubenswrapper[4730]: E0320 15:57:16.361135    4730 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e"
Mar 20 15:57:16 crc kubenswrapper[4730]: E0320 15:57:16.361369    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {}  BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {}  BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5jrvs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-c674c5965-6f2w8_openstack-operators(db4a9305-eefd-4804-ac7a-4d811bd928f5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 15:57:16 crc kubenswrapper[4730]: E0320 15:57:16.362883    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6f2w8" podUID="db4a9305-eefd-4804-ac7a-4d811bd928f5"
Mar 20 15:57:16 crc kubenswrapper[4730]: I0320 15:57:16.421109    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8f74be61-d309-417c-90a3-2962b57071c4-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-f8l2x\" (UID: \"8f74be61-d309-417c-90a3-2962b57071c4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-f8l2x"
Mar 20 15:57:16 crc kubenswrapper[4730]: E0320 15:57:16.421441    4730 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 20 15:57:16 crc kubenswrapper[4730]: E0320 15:57:16.421532    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f74be61-d309-417c-90a3-2962b57071c4-cert podName:8f74be61-d309-417c-90a3-2962b57071c4 nodeName:}" failed. No retries permitted until 2026-03-20 15:57:32.421489865 +0000 UTC m=+1111.634861234 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8f74be61-d309-417c-90a3-2962b57071c4-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-f8l2x" (UID: "8f74be61-d309-417c-90a3-2962b57071c4") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 20 15:57:16 crc kubenswrapper[4730]: I0320 15:57:16.826745    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-metrics-certs\") pod \"openstack-operator-controller-manager-6f58c59cbb-76ssq\" (UID: \"c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008\") " pod="openstack-operators/openstack-operator-controller-manager-6f58c59cbb-76ssq"
Mar 20 15:57:16 crc kubenswrapper[4730]: I0320 15:57:16.826797    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-webhook-certs\") pod \"openstack-operator-controller-manager-6f58c59cbb-76ssq\" (UID: \"c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008\") " pod="openstack-operators/openstack-operator-controller-manager-6f58c59cbb-76ssq"
Mar 20 15:57:16 crc kubenswrapper[4730]: E0320 15:57:16.826997    4730 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 20 15:57:16 crc kubenswrapper[4730]: E0320 15:57:16.827053    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-webhook-certs podName:c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008 nodeName:}" failed. No retries permitted until 2026-03-20 15:57:32.827039835 +0000 UTC m=+1112.040411194 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-webhook-certs") pod "openstack-operator-controller-manager-6f58c59cbb-76ssq" (UID: "c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008") : secret "webhook-server-cert" not found
Mar 20 15:57:16 crc kubenswrapper[4730]: E0320 15:57:16.827069    4730 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 20 15:57:16 crc kubenswrapper[4730]: E0320 15:57:16.827180    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-metrics-certs podName:c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008 nodeName:}" failed. No retries permitted until 2026-03-20 15:57:32.827156199 +0000 UTC m=+1112.040527628 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-metrics-certs") pod "openstack-operator-controller-manager-6f58c59cbb-76ssq" (UID: "c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008") : secret "metrics-server-cert" not found
Mar 20 15:57:16 crc kubenswrapper[4730]: E0320 15:57:16.937495    4730 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a"
Mar 20 15:57:16 crc kubenswrapper[4730]: E0320 15:57:16.937731    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {}  BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {}  BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kkwmj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5b9f45d989-w8x5z_openstack-operators(d7ad408f-56db-4b5b-bea9-ba821eae2b80): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 15:57:16 crc kubenswrapper[4730]: E0320 15:57:16.939700    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-w8x5z" podUID="d7ad408f-56db-4b5b-bea9-ba821eae2b80"
Mar 20 15:57:17 crc kubenswrapper[4730]: E0320 15:57:17.068529    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6f2w8" podUID="db4a9305-eefd-4804-ac7a-4d811bd928f5"
Mar 20 15:57:17 crc kubenswrapper[4730]: E0320 15:57:17.068586    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-w8x5z" podUID="d7ad408f-56db-4b5b-bea9-ba821eae2b80"
Mar 20 15:57:17 crc kubenswrapper[4730]: E0320 15:57:17.558816    4730 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55"
Mar 20 15:57:17 crc kubenswrapper[4730]: E0320 15:57:17.559305    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {}  BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {}  BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rpdpg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-884679f54-t7kkm_openstack-operators(6944c865-92a4-441c-907b-27424898cb99): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 15:57:17 crc kubenswrapper[4730]: E0320 15:57:17.560465    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-t7kkm" podUID="6944c865-92a4-441c-907b-27424898cb99"
Mar 20 15:57:18 crc kubenswrapper[4730]: E0320 15:57:18.024776    4730 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:9dd26bc51e7757d84736528d4988a1f980ad50ccb070aef6fc252e32c5c423a8"
Mar 20 15:57:18 crc kubenswrapper[4730]: E0320 15:57:18.024983    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:9dd26bc51e7757d84736528d4988a1f980ad50ccb070aef6fc252e32c5c423a8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {}  BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {}  BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-krfh6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6f787dddc9-9k6lh_openstack-operators(24280954-941c-445f-aa52-e360ce544046): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 15:57:18 crc kubenswrapper[4730]: E0320 15:57:18.026509    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-9k6lh" podUID="24280954-941c-445f-aa52-e360ce544046"
Mar 20 15:57:18 crc kubenswrapper[4730]: E0320 15:57:18.074289    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:9dd26bc51e7757d84736528d4988a1f980ad50ccb070aef6fc252e32c5c423a8\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-9k6lh" podUID="24280954-941c-445f-aa52-e360ce544046"
Mar 20 15:57:18 crc kubenswrapper[4730]: E0320 15:57:18.075212    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-t7kkm" podUID="6944c865-92a4-441c-907b-27424898cb99"
Mar 20 15:57:18 crc kubenswrapper[4730]: E0320 15:57:18.656702    4730 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d"
Mar 20 15:57:18 crc kubenswrapper[4730]: E0320 15:57:18.657003    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {}  BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {}  BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7cgmv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-79df6bcc97-llp6b_openstack-operators(d658514c-f369-4ce2-ad50-d055fd208694): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 15:57:18 crc kubenswrapper[4730]: E0320 15:57:18.659152    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-llp6b" podUID="d658514c-f369-4ce2-ad50-d055fd208694"
Mar 20 15:57:18 crc kubenswrapper[4730]: I0320 15:57:18.951458    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-4pkr9"]
Mar 20 15:57:18 crc kubenswrapper[4730]: W0320 15:57:18.964120    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8b68e41_b53d_4fb3_8a86_0c604cda0e46.slice/crio-45848c5168243fc5bf9cef27b7a4afa483350368b8eaf99823d67cdd8005b630 WatchSource:0}: Error finding container 45848c5168243fc5bf9cef27b7a4afa483350368b8eaf99823d67cdd8005b630: Status 404 returned error can't find the container with id 45848c5168243fc5bf9cef27b7a4afa483350368b8eaf99823d67cdd8005b630
Mar 20 15:57:19 crc kubenswrapper[4730]: I0320 15:57:19.085021    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-4pkr9" event={"ID":"d8b68e41-b53d-4fb3-8a86-0c604cda0e46","Type":"ContainerStarted","Data":"45848c5168243fc5bf9cef27b7a4afa483350368b8eaf99823d67cdd8005b630"}
Mar 20 15:57:19 crc kubenswrapper[4730]: I0320 15:57:19.089340    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-l7v9q" event={"ID":"36dd23cb-43b2-4c25-9e24-3e2f69f93eff","Type":"ContainerStarted","Data":"9ff53b311f2f8ff5ec20d469511209241f556a39d369fedd4ad7b6d2d624630a"}
Mar 20 15:57:19 crc kubenswrapper[4730]: I0320 15:57:19.089699    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-l7v9q"
Mar 20 15:57:19 crc kubenswrapper[4730]: I0320 15:57:19.092108    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-v96m5" event={"ID":"acffaecc-dd6c-4819-91cf-99c5d0154143","Type":"ContainerStarted","Data":"a90c4d6afa0b750bfaee685cd5a45009fdac35d3b730420486be1bc882b46468"}
Mar 20 15:57:19 crc kubenswrapper[4730]: I0320 15:57:19.092174    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-v96m5"
Mar 20 15:57:19 crc kubenswrapper[4730]: I0320 15:57:19.097028    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-rnx2d" event={"ID":"19a5ba3c-9f89-43f6-bd55-6998df2e3533","Type":"ContainerStarted","Data":"df708a5bd69cf0b999854d5a59081b33481add9b718024ef449b1b8e6c7d877c"}
Mar 20 15:57:19 crc kubenswrapper[4730]: I0320 15:57:19.097202    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-rnx2d"
Mar 20 15:57:19 crc kubenswrapper[4730]: E0320 15:57:19.100075    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d\\\"\"" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-llp6b" podUID="d658514c-f369-4ce2-ad50-d055fd208694"
Mar 20 15:57:19 crc kubenswrapper[4730]: I0320 15:57:19.118018    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-l7v9q" podStartSLOduration=2.919874029 podStartE2EDuration="19.117994963s" podCreationTimestamp="2026-03-20 15:57:00 +0000 UTC" firstStartedPulling="2026-03-20 15:57:02.492544508 +0000 UTC m=+1081.705915877" lastFinishedPulling="2026-03-20 15:57:18.690665432 +0000 UTC m=+1097.904036811" observedRunningTime="2026-03-20 15:57:19.108532603 +0000 UTC m=+1098.321903972" watchObservedRunningTime="2026-03-20 15:57:19.117994963 +0000 UTC m=+1098.331366332"
Mar 20 15:57:19 crc kubenswrapper[4730]: I0320 15:57:19.179818    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-rnx2d" podStartSLOduration=3.108321185 podStartE2EDuration="19.179799156s" podCreationTimestamp="2026-03-20 15:57:00 +0000 UTC" firstStartedPulling="2026-03-20 15:57:02.631155062 +0000 UTC m=+1081.844526431" lastFinishedPulling="2026-03-20 15:57:18.702633033 +0000 UTC m=+1097.916004402" observedRunningTime="2026-03-20 15:57:19.17188201 +0000 UTC m=+1098.385253379" watchObservedRunningTime="2026-03-20 15:57:19.179799156 +0000 UTC m=+1098.393170525"
Mar 20 15:57:19 crc kubenswrapper[4730]: I0320 15:57:19.189868    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-v96m5" podStartSLOduration=2.762878892 podStartE2EDuration="20.189848942s" podCreationTimestamp="2026-03-20 15:56:59 +0000 UTC" firstStartedPulling="2026-03-20 15:57:01.272798761 +0000 UTC m=+1080.486170130" lastFinishedPulling="2026-03-20 15:57:18.699768811 +0000 UTC m=+1097.913140180" observedRunningTime="2026-03-20 15:57:19.188159874 +0000 UTC m=+1098.401531243" watchObservedRunningTime="2026-03-20 15:57:19.189848942 +0000 UTC m=+1098.403220311"
Mar 20 15:57:20 crc kubenswrapper[4730]: I0320 15:57:20.127011    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-bqjxs" event={"ID":"87b37583-ab1d-4f9e-98e9-8cb9bdcc5165","Type":"ContainerStarted","Data":"20f25722465c682576a32ae4c86d08f9fc1d891b4133459452e4ababb33b352c"}
Mar 20 15:57:20 crc kubenswrapper[4730]: I0320 15:57:20.127336    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-bqjxs"
Mar 20 15:57:20 crc kubenswrapper[4730]: I0320 15:57:20.136144    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-lt49w" event={"ID":"9944d85d-4f1c-4312-ac57-49ee75a8fd16","Type":"ContainerStarted","Data":"69ecde21dc19745c54d0c34f06496f5328e8e68eed2a10d90ff03d02fb5cf23a"}
Mar 20 15:57:20 crc kubenswrapper[4730]: I0320 15:57:20.136221    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-lt49w"
Mar 20 15:57:20 crc kubenswrapper[4730]: I0320 15:57:20.138761    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xw6kk" event={"ID":"61755ffd-de91-4a38-a174-fe1a4c57dfd0","Type":"ContainerStarted","Data":"59162b0311a880f7c0c7f4a4fad71dc31d8fed5e2134cde37845cf230e6a1c4e"}
Mar 20 15:57:20 crc kubenswrapper[4730]: I0320 15:57:20.138887    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xw6kk"
Mar 20 15:57:20 crc kubenswrapper[4730]: I0320 15:57:20.143669    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-bm7hr" event={"ID":"92c29eff-b9ab-4420-86c6-6b388cfc87af","Type":"ContainerStarted","Data":"448bf8cc49eec8fef296101660f5fc0c7f56530380214b089ca64e190103ced2"}
Mar 20 15:57:20 crc kubenswrapper[4730]: I0320 15:57:20.143771    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-bm7hr"
Mar 20 15:57:20 crc kubenswrapper[4730]: I0320 15:57:20.146210    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-dmd8z" event={"ID":"4fb51ed6-04e3-40db-ab21-eb0fe66442fe","Type":"ContainerStarted","Data":"ea7deb44b6fc5317f3970450d061efbbcddadb8a2b3117b80c462b39c592a24f"}
Mar 20 15:57:20 crc kubenswrapper[4730]: I0320 15:57:20.148275    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-dmd8z"
Mar 20 15:57:20 crc kubenswrapper[4730]: I0320 15:57:20.149831    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-bqjxs" podStartSLOduration=3.498451994 podStartE2EDuration="20.149812108s" podCreationTimestamp="2026-03-20 15:57:00 +0000 UTC" firstStartedPulling="2026-03-20 15:57:02.039464952 +0000 UTC m=+1081.252836321" lastFinishedPulling="2026-03-20 15:57:18.690825066 +0000 UTC m=+1097.904196435" observedRunningTime="2026-03-20 15:57:20.145070172 +0000 UTC m=+1099.358441541" watchObservedRunningTime="2026-03-20 15:57:20.149812108 +0000 UTC m=+1099.363183467"
Mar 20 15:57:20 crc kubenswrapper[4730]: I0320 15:57:20.161296    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-bm7hr" podStartSLOduration=4.139236405 podStartE2EDuration="20.161280875s" podCreationTimestamp="2026-03-20 15:57:00 +0000 UTC" firstStartedPulling="2026-03-20 15:57:02.667405387 +0000 UTC m=+1081.880776756" lastFinishedPulling="2026-03-20 15:57:18.689449867 +0000 UTC m=+1097.902821226" observedRunningTime="2026-03-20 15:57:20.157050894 +0000 UTC m=+1099.370422263" watchObservedRunningTime="2026-03-20 15:57:20.161280875 +0000 UTC m=+1099.374652244"
Mar 20 15:57:20 crc kubenswrapper[4730]: I0320 15:57:20.174317    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-lt49w" podStartSLOduration=3.980947599 podStartE2EDuration="20.174297606s" podCreationTimestamp="2026-03-20 15:57:00 +0000 UTC" firstStartedPulling="2026-03-20 15:57:02.506378083 +0000 UTC m=+1081.719749452" lastFinishedPulling="2026-03-20 15:57:18.69972809 +0000 UTC m=+1097.913099459" observedRunningTime="2026-03-20 15:57:20.172389812 +0000 UTC m=+1099.385761181" watchObservedRunningTime="2026-03-20 15:57:20.174297606 +0000 UTC m=+1099.387668975"
Mar 20 15:57:20 crc kubenswrapper[4730]: I0320 15:57:20.192001    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xw6kk" podStartSLOduration=3.928321668 podStartE2EDuration="20.191978411s" podCreationTimestamp="2026-03-20 15:57:00 +0000 UTC" firstStartedPulling="2026-03-20 15:57:02.438327881 +0000 UTC m=+1081.651699250" lastFinishedPulling="2026-03-20 15:57:18.701984624 +0000 UTC m=+1097.915355993" observedRunningTime="2026-03-20 15:57:20.189383277 +0000 UTC m=+1099.402754666" watchObservedRunningTime="2026-03-20 15:57:20.191978411 +0000 UTC m=+1099.405349790"
Mar 20 15:57:20 crc kubenswrapper[4730]: I0320 15:57:20.215018    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-dmd8z" podStartSLOduration=3.5115083289999998 podStartE2EDuration="21.215002257s" podCreationTimestamp="2026-03-20 15:56:59 +0000 UTC" firstStartedPulling="2026-03-20 15:57:00.986149704 +0000 UTC m=+1080.199521073" lastFinishedPulling="2026-03-20 15:57:18.689643612 +0000 UTC m=+1097.903015001" observedRunningTime="2026-03-20 15:57:20.208966095 +0000 UTC m=+1099.422337474" watchObservedRunningTime="2026-03-20 15:57:20.215002257 +0000 UTC m=+1099.428373626"
Mar 20 15:57:25 crc kubenswrapper[4730]: I0320 15:57:25.194690    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-4pkr9" event={"ID":"d8b68e41-b53d-4fb3-8a86-0c604cda0e46","Type":"ContainerStarted","Data":"5fb0ff525af3cf18991d023fc39f7fba44f374ac1e9aa04bdc8441f66fe1756f"}
Mar 20 15:57:25 crc kubenswrapper[4730]: I0320 15:57:25.195172    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-4pkr9"
Mar 20 15:57:25 crc kubenswrapper[4730]: I0320 15:57:25.206603    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c5858c67b-cfmtk" event={"ID":"f00b4813-358d-49c4-bf9d-486e35f5a94f","Type":"ContainerStarted","Data":"017d151d5e07cd8c9d38dcba58fdcc0910cdd019aa668a50bf0bc303aec28023"}
Mar 20 15:57:25 crc kubenswrapper[4730]: I0320 15:57:25.207560    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c5858c67b-cfmtk"
Mar 20 15:57:25 crc kubenswrapper[4730]: I0320 15:57:25.211198    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mdzv5" event={"ID":"82cae974-2029-42c3-81bf-e9bee167e991","Type":"ContainerStarted","Data":"4f271b3f9738e4f9672107879d5284ffdb334b69bf42c7a9be9d6c97c168cb77"}
Mar 20 15:57:25 crc kubenswrapper[4730]: I0320 15:57:25.214191    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-nwwzc" event={"ID":"e8ad6f56-863f-473b-a4d4-d4f70d9489a4","Type":"ContainerStarted","Data":"796b15e07a238d32353575620108c25121126d0ec7fd9b43a7d6fa102c17301a"}
Mar 20 15:57:25 crc kubenswrapper[4730]: I0320 15:57:25.214448    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-nwwzc"
Mar 20 15:57:25 crc kubenswrapper[4730]: I0320 15:57:25.216744    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wqqnd" event={"ID":"c5aaa9e9-aebc-4daa-b7ab-c6064b5a78ef","Type":"ContainerStarted","Data":"fe45ed2582e95ad4f73c51c40d883f2f26a7c1cf741e5dccb7005bb8f04578be"}
Mar 20 15:57:25 crc kubenswrapper[4730]: I0320 15:57:25.217010    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wqqnd"
Mar 20 15:57:25 crc kubenswrapper[4730]: I0320 15:57:25.232334    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-4pkr9" podStartSLOduration=20.165020031 podStartE2EDuration="25.232310572s" podCreationTimestamp="2026-03-20 15:57:00 +0000 UTC" firstStartedPulling="2026-03-20 15:57:18.988819137 +0000 UTC m=+1098.202190506" lastFinishedPulling="2026-03-20 15:57:24.056109678 +0000 UTC m=+1103.269481047" observedRunningTime="2026-03-20 15:57:25.226581159 +0000 UTC m=+1104.439952558" watchObservedRunningTime="2026-03-20 15:57:25.232310572 +0000 UTC m=+1104.445681981"
Mar 20 15:57:25 crc kubenswrapper[4730]: I0320 15:57:25.250617    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c5858c67b-cfmtk" podStartSLOduration=3.901627385 podStartE2EDuration="25.250587952s" podCreationTimestamp="2026-03-20 15:57:00 +0000 UTC" firstStartedPulling="2026-03-20 15:57:02.701031756 +0000 UTC m=+1081.914403125" lastFinishedPulling="2026-03-20 15:57:24.049992323 +0000 UTC m=+1103.263363692" observedRunningTime="2026-03-20 15:57:25.245435205 +0000 UTC m=+1104.458806584" watchObservedRunningTime="2026-03-20 15:57:25.250587952 +0000 UTC m=+1104.463959351"
Mar 20 15:57:25 crc kubenswrapper[4730]: I0320 15:57:25.276323    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-nwwzc" podStartSLOduration=4.916952709 podStartE2EDuration="26.276293052s" podCreationTimestamp="2026-03-20 15:56:59 +0000 UTC" firstStartedPulling="2026-03-20 15:57:02.69662319 +0000 UTC m=+1081.909994559" lastFinishedPulling="2026-03-20 15:57:24.055963533 +0000 UTC m=+1103.269334902" observedRunningTime="2026-03-20 15:57:25.267983906 +0000 UTC m=+1104.481355315" watchObservedRunningTime="2026-03-20 15:57:25.276293052 +0000 UTC m=+1104.489664461"
Mar 20 15:57:25 crc kubenswrapper[4730]: I0320 15:57:25.314660    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mdzv5" podStartSLOduration=3.960155757 podStartE2EDuration="25.314639471s" podCreationTimestamp="2026-03-20 15:57:00 +0000 UTC" firstStartedPulling="2026-03-20 15:57:02.738291279 +0000 UTC m=+1081.951662648" lastFinishedPulling="2026-03-20 15:57:24.092774973 +0000 UTC m=+1103.306146362" observedRunningTime="2026-03-20 15:57:25.288281382 +0000 UTC m=+1104.501652771" watchObservedRunningTime="2026-03-20 15:57:25.314639471 +0000 UTC m=+1104.528010850"
Mar 20 15:57:25 crc kubenswrapper[4730]: I0320 15:57:25.348625    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wqqnd" podStartSLOduration=4.998073374 podStartE2EDuration="26.348605276s" podCreationTimestamp="2026-03-20 15:56:59 +0000 UTC" firstStartedPulling="2026-03-20 15:57:02.700749538 +0000 UTC m=+1081.914120907" lastFinishedPulling="2026-03-20 15:57:24.05128144 +0000 UTC m=+1103.264652809" observedRunningTime="2026-03-20 15:57:25.341925677 +0000 UTC m=+1104.555297056" watchObservedRunningTime="2026-03-20 15:57:25.348605276 +0000 UTC m=+1104.561976655"
Mar 20 15:57:27 crc kubenswrapper[4730]: I0320 15:57:27.238571    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-g4kgd" event={"ID":"cf3ded14-d81b-4384-93e4-e51cde6a31ec","Type":"ContainerStarted","Data":"2ba359c5692f05c532cfa9341dede45b2c57a3bf1dc1e5d078311209dec28d5a"}
Mar 20 15:57:27 crc kubenswrapper[4730]: I0320 15:57:27.239485    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-g4kgd"
Mar 20 15:57:27 crc kubenswrapper[4730]: I0320 15:57:27.263607    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-g4kgd" podStartSLOduration=2.826154018 podStartE2EDuration="27.263581764s" podCreationTimestamp="2026-03-20 15:57:00 +0000 UTC" firstStartedPulling="2026-03-20 15:57:02.502478391 +0000 UTC m=+1081.715849770" lastFinishedPulling="2026-03-20 15:57:26.939906157 +0000 UTC m=+1106.153277516" observedRunningTime="2026-03-20 15:57:27.255783972 +0000 UTC m=+1106.469155381" watchObservedRunningTime="2026-03-20 15:57:27.263581764 +0000 UTC m=+1106.476953173"
Mar 20 15:57:28 crc kubenswrapper[4730]: I0320 15:57:28.246856    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-w8x5z" event={"ID":"d7ad408f-56db-4b5b-bea9-ba821eae2b80","Type":"ContainerStarted","Data":"b88b5c0c25cd9b6cd8aa454324911b4841e83a5be7ccb7cf16bc6eb6e133f3f2"}
Mar 20 15:57:28 crc kubenswrapper[4730]: I0320 15:57:28.247768    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-w8x5z"
Mar 20 15:57:28 crc kubenswrapper[4730]: I0320 15:57:28.267292    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-w8x5z" podStartSLOduration=2.902335698 podStartE2EDuration="28.267276181s" podCreationTimestamp="2026-03-20 15:57:00 +0000 UTC" firstStartedPulling="2026-03-20 15:57:02.638673907 +0000 UTC m=+1081.852045276" lastFinishedPulling="2026-03-20 15:57:28.00361438 +0000 UTC m=+1107.216985759" observedRunningTime="2026-03-20 15:57:28.260890709 +0000 UTC m=+1107.474262068" watchObservedRunningTime="2026-03-20 15:57:28.267276181 +0000 UTC m=+1107.480647550"
Mar 20 15:57:30 crc kubenswrapper[4730]: I0320 15:57:30.266436    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-dmd8z"
Mar 20 15:57:30 crc kubenswrapper[4730]: I0320 15:57:30.330886    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-v96m5"
Mar 20 15:57:30 crc kubenswrapper[4730]: I0320 15:57:30.490989    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-bqjxs"
Mar 20 15:57:30 crc kubenswrapper[4730]: I0320 15:57:30.525538    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-rnx2d"
Mar 20 15:57:30 crc kubenswrapper[4730]: I0320 15:57:30.542044    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-wqqnd"
Mar 20 15:57:30 crc kubenswrapper[4730]: I0320 15:57:30.578137    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-nwwzc"
Mar 20 15:57:30 crc kubenswrapper[4730]: I0320 15:57:30.733775    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xw6kk"
Mar 20 15:57:30 crc kubenswrapper[4730]: I0320 15:57:30.789586    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-l7v9q"
Mar 20 15:57:30 crc kubenswrapper[4730]: I0320 15:57:30.917305    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-lt49w"
Mar 20 15:57:31 crc kubenswrapper[4730]: I0320 15:57:31.025665    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-bm7hr"
Mar 20 15:57:31 crc kubenswrapper[4730]: I0320 15:57:31.064336    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c5858c67b-cfmtk"
Mar 20 15:57:31 crc kubenswrapper[4730]: I0320 15:57:31.271028    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6f2w8" event={"ID":"db4a9305-eefd-4804-ac7a-4d811bd928f5","Type":"ContainerStarted","Data":"2e5e47598bc490cf46a7f146d14a0101f676bf93573bd08ec5fa627f467be2bd"}
Mar 20 15:57:31 crc kubenswrapper[4730]: I0320 15:57:31.271222    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6f2w8"
Mar 20 15:57:31 crc kubenswrapper[4730]: I0320 15:57:31.286485    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6f2w8" podStartSLOduration=2.917380278 podStartE2EDuration="31.286471802s" podCreationTimestamp="2026-03-20 15:57:00 +0000 UTC" firstStartedPulling="2026-03-20 15:57:02.658790381 +0000 UTC m=+1081.872161750" lastFinishedPulling="2026-03-20 15:57:31.027881905 +0000 UTC m=+1110.241253274" observedRunningTime="2026-03-20 15:57:31.284048703 +0000 UTC m=+1110.497420082" watchObservedRunningTime="2026-03-20 15:57:31.286471802 +0000 UTC m=+1110.499843171"
Mar 20 15:57:32 crc kubenswrapper[4730]: I0320 15:57:32.285351    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-pf8sw" event={"ID":"f733406e-5258-4cfe-870d-4fb86152363e","Type":"ContainerStarted","Data":"d6839305cb3287a7265c266d8c0b765de6ece764de0fad970fcb7410eee58982"}
Mar 20 15:57:32 crc kubenswrapper[4730]: I0320 15:57:32.287062    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-pf8sw"
Mar 20 15:57:32 crc kubenswrapper[4730]: I0320 15:57:32.320115    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-pf8sw" podStartSLOduration=3.611944902 podStartE2EDuration="33.320094059s" podCreationTimestamp="2026-03-20 15:56:59 +0000 UTC" firstStartedPulling="2026-03-20 15:57:01.474407352 +0000 UTC m=+1080.687778721" lastFinishedPulling="2026-03-20 15:57:31.182556509 +0000 UTC m=+1110.395927878" observedRunningTime="2026-03-20 15:57:32.310198918 +0000 UTC m=+1111.523570297" watchObservedRunningTime="2026-03-20 15:57:32.320094059 +0000 UTC m=+1111.533465448"
Mar 20 15:57:32 crc kubenswrapper[4730]: I0320 15:57:32.489931    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8f74be61-d309-417c-90a3-2962b57071c4-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-f8l2x\" (UID: \"8f74be61-d309-417c-90a3-2962b57071c4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-f8l2x"
Mar 20 15:57:32 crc kubenswrapper[4730]: I0320 15:57:32.499019    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8f74be61-d309-417c-90a3-2962b57071c4-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-f8l2x\" (UID: \"8f74be61-d309-417c-90a3-2962b57071c4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-f8l2x"
Mar 20 15:57:32 crc kubenswrapper[4730]: I0320 15:57:32.645018    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-57nkr"
Mar 20 15:57:32 crc kubenswrapper[4730]: I0320 15:57:32.654093    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-f8l2x"
Mar 20 15:57:32 crc kubenswrapper[4730]: I0320 15:57:32.896989    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-webhook-certs\") pod \"openstack-operator-controller-manager-6f58c59cbb-76ssq\" (UID: \"c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008\") " pod="openstack-operators/openstack-operator-controller-manager-6f58c59cbb-76ssq"
Mar 20 15:57:32 crc kubenswrapper[4730]: I0320 15:57:32.897434    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-metrics-certs\") pod \"openstack-operator-controller-manager-6f58c59cbb-76ssq\" (UID: \"c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008\") " pod="openstack-operators/openstack-operator-controller-manager-6f58c59cbb-76ssq"
Mar 20 15:57:32 crc kubenswrapper[4730]: I0320 15:57:32.902152    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-webhook-certs\") pod \"openstack-operator-controller-manager-6f58c59cbb-76ssq\" (UID: \"c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008\") " pod="openstack-operators/openstack-operator-controller-manager-6f58c59cbb-76ssq"
Mar 20 15:57:32 crc kubenswrapper[4730]: I0320 15:57:32.904439    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008-metrics-certs\") pod \"openstack-operator-controller-manager-6f58c59cbb-76ssq\" (UID: \"c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008\") " pod="openstack-operators/openstack-operator-controller-manager-6f58c59cbb-76ssq"
Mar 20 15:57:33 crc kubenswrapper[4730]: I0320 15:57:33.082000    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-f8l2x"]
Mar 20 15:57:33 crc kubenswrapper[4730]: I0320 15:57:33.084619    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-nctfr"
Mar 20 15:57:33 crc kubenswrapper[4730]: I0320 15:57:33.090528    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6f58c59cbb-76ssq"
Mar 20 15:57:33 crc kubenswrapper[4730]: W0320 15:57:33.091301    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f74be61_d309_417c_90a3_2962b57071c4.slice/crio-078707346a29bd1255e679676c6e90f14c0c76fae721b0b15b5642fc1d59b4ee WatchSource:0}: Error finding container 078707346a29bd1255e679676c6e90f14c0c76fae721b0b15b5642fc1d59b4ee: Status 404 returned error can't find the container with id 078707346a29bd1255e679676c6e90f14c0c76fae721b0b15b5642fc1d59b4ee
Mar 20 15:57:33 crc kubenswrapper[4730]: I0320 15:57:33.308218    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-f8l2x" event={"ID":"8f74be61-d309-417c-90a3-2962b57071c4","Type":"ContainerStarted","Data":"078707346a29bd1255e679676c6e90f14c0c76fae721b0b15b5642fc1d59b4ee"}
Mar 20 15:57:33 crc kubenswrapper[4730]: I0320 15:57:33.337784    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6f58c59cbb-76ssq"]
Mar 20 15:57:33 crc kubenswrapper[4730]: W0320 15:57:33.347161    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc66e7fcc_f4ab_4d70_ad2b_b9186a4a2008.slice/crio-7ce315dfb8fa9c5b992fbe6ce1bc45ec6438bde4e26c9d2aa087c65f50979de1 WatchSource:0}: Error finding container 7ce315dfb8fa9c5b992fbe6ce1bc45ec6438bde4e26c9d2aa087c65f50979de1: Status 404 returned error can't find the container with id 7ce315dfb8fa9c5b992fbe6ce1bc45ec6438bde4e26c9d2aa087c65f50979de1
Mar 20 15:57:34 crc kubenswrapper[4730]: I0320 15:57:34.316693    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6f58c59cbb-76ssq" event={"ID":"c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008","Type":"ContainerStarted","Data":"c8298b88b014ea89cabc3e6b0ec9d07375465b7f4747f88c5c164a9a6ea1f076"}
Mar 20 15:57:34 crc kubenswrapper[4730]: I0320 15:57:34.317326    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6f58c59cbb-76ssq" event={"ID":"c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008","Type":"ContainerStarted","Data":"7ce315dfb8fa9c5b992fbe6ce1bc45ec6438bde4e26c9d2aa087c65f50979de1"}
Mar 20 15:57:34 crc kubenswrapper[4730]: I0320 15:57:34.318499    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6f58c59cbb-76ssq"
Mar 20 15:57:34 crc kubenswrapper[4730]: I0320 15:57:34.332564    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-t7kkm" event={"ID":"6944c865-92a4-441c-907b-27424898cb99","Type":"ContainerStarted","Data":"75f68910e535049be37918a27f66338e1a4493a21f3138874b0a97cef82a892c"}
Mar 20 15:57:34 crc kubenswrapper[4730]: I0320 15:57:34.332896    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-t7kkm"
Mar 20 15:57:34 crc kubenswrapper[4730]: I0320 15:57:34.334013    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-lrpjm" event={"ID":"cdbd62c8-9960-4257-87d9-d4923c7ef8dd","Type":"ContainerStarted","Data":"ed974fa89b66e95c6f70abb9bbef5505331291b3c46c8815467ac7c4644ed4bf"}
Mar 20 15:57:34 crc kubenswrapper[4730]: I0320 15:57:34.334544    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-lrpjm"
Mar 20 15:57:34 crc kubenswrapper[4730]: I0320 15:57:34.335910    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-9k6lh" event={"ID":"24280954-941c-445f-aa52-e360ce544046","Type":"ContainerStarted","Data":"3b2de06b0db9205a91cd21cef240e85947cdf44a30605b8850a783e857ce58a5"}
Mar 20 15:57:34 crc kubenswrapper[4730]: I0320 15:57:34.336301    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-9k6lh"
Mar 20 15:57:34 crc kubenswrapper[4730]: I0320 15:57:34.368836    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6f58c59cbb-76ssq" podStartSLOduration=34.368820067 podStartE2EDuration="34.368820067s" podCreationTimestamp="2026-03-20 15:57:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:57:34.35590974 +0000 UTC m=+1113.569281119" watchObservedRunningTime="2026-03-20 15:57:34.368820067 +0000 UTC m=+1113.582191436"
Mar 20 15:57:34 crc kubenswrapper[4730]: I0320 15:57:34.370926    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-lrpjm" podStartSLOduration=3.585917131 podStartE2EDuration="34.370919556s" podCreationTimestamp="2026-03-20 15:57:00 +0000 UTC" firstStartedPulling="2026-03-20 15:57:02.660948192 +0000 UTC m=+1081.874319561" lastFinishedPulling="2026-03-20 15:57:33.445950617 +0000 UTC m=+1112.659321986" observedRunningTime="2026-03-20 15:57:34.36823167 +0000 UTC m=+1113.581603049" watchObservedRunningTime="2026-03-20 15:57:34.370919556 +0000 UTC m=+1113.584290925"
Mar 20 15:57:34 crc kubenswrapper[4730]: I0320 15:57:34.383899    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-9k6lh" podStartSLOduration=2.9038323200000002 podStartE2EDuration="34.383884855s" podCreationTimestamp="2026-03-20 15:57:00 +0000 UTC" firstStartedPulling="2026-03-20 15:57:02.039667678 +0000 UTC m=+1081.253039047" lastFinishedPulling="2026-03-20 15:57:33.519720213 +0000 UTC m=+1112.733091582" observedRunningTime="2026-03-20 15:57:34.382522386 +0000 UTC m=+1113.595893755" watchObservedRunningTime="2026-03-20 15:57:34.383884855 +0000 UTC m=+1113.597256224"
Mar 20 15:57:34 crc kubenswrapper[4730]: I0320 15:57:34.396667    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-t7kkm" podStartSLOduration=3.472215752 podStartE2EDuration="34.396648657s" podCreationTimestamp="2026-03-20 15:57:00 +0000 UTC" firstStartedPulling="2026-03-20 15:57:02.411403223 +0000 UTC m=+1081.624774592" lastFinishedPulling="2026-03-20 15:57:33.335836128 +0000 UTC m=+1112.549207497" observedRunningTime="2026-03-20 15:57:34.395903026 +0000 UTC m=+1113.609274395" watchObservedRunningTime="2026-03-20 15:57:34.396648657 +0000 UTC m=+1113.610020026"
Mar 20 15:57:35 crc kubenswrapper[4730]: I0320 15:57:35.344306    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-f8l2x" event={"ID":"8f74be61-d309-417c-90a3-2962b57071c4","Type":"ContainerStarted","Data":"70c8fd815396c9bece6283008ba5280cf90bc831678f763c27a770c0bd8ce113"}
Mar 20 15:57:35 crc kubenswrapper[4730]: I0320 15:57:35.344849    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-f8l2x"
Mar 20 15:57:35 crc kubenswrapper[4730]: I0320 15:57:35.346536    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-llp6b" event={"ID":"d658514c-f369-4ce2-ad50-d055fd208694","Type":"ContainerStarted","Data":"a2959034776d964bb1bbe003c5623b055223b6e55f21e17f7e03a3cb64a6bb33"}
Mar 20 15:57:35 crc kubenswrapper[4730]: I0320 15:57:35.395370    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-llp6b" podStartSLOduration=3.938011231 podStartE2EDuration="36.395350193s" podCreationTimestamp="2026-03-20 15:56:59 +0000 UTC" firstStartedPulling="2026-03-20 15:57:02.634309242 +0000 UTC m=+1081.847680601" lastFinishedPulling="2026-03-20 15:57:35.091648194 +0000 UTC m=+1114.305019563" observedRunningTime="2026-03-20 15:57:35.391842883 +0000 UTC m=+1114.605214242" watchObservedRunningTime="2026-03-20 15:57:35.395350193 +0000 UTC m=+1114.608721582"
Mar 20 15:57:35 crc kubenswrapper[4730]: I0320 15:57:35.395978    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-f8l2x" podStartSLOduration=33.399455806 podStartE2EDuration="35.39596892s" podCreationTimestamp="2026-03-20 15:57:00 +0000 UTC" firstStartedPulling="2026-03-20 15:57:33.099050341 +0000 UTC m=+1112.312421750" lastFinishedPulling="2026-03-20 15:57:35.095563495 +0000 UTC m=+1114.308934864" observedRunningTime="2026-03-20 15:57:35.379149562 +0000 UTC m=+1114.592520931" watchObservedRunningTime="2026-03-20 15:57:35.39596892 +0000 UTC m=+1114.609340299"
Mar 20 15:57:36 crc kubenswrapper[4730]: I0320 15:57:36.321687    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-4pkr9"
Mar 20 15:57:40 crc kubenswrapper[4730]: I0320 15:57:40.382721    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-pf8sw"
Mar 20 15:57:40 crc kubenswrapper[4730]: I0320 15:57:40.427906    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-9k6lh"
Mar 20 15:57:40 crc kubenswrapper[4730]: I0320 15:57:40.584001    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-llp6b"
Mar 20 15:57:40 crc kubenswrapper[4730]: I0320 15:57:40.586733    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-llp6b"
Mar 20 15:57:40 crc kubenswrapper[4730]: I0320 15:57:40.757654    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-g4kgd"
Mar 20 15:57:40 crc kubenswrapper[4730]: I0320 15:57:40.873687    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-w8x5z"
Mar 20 15:57:40 crc kubenswrapper[4730]: I0320 15:57:40.881010    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-t7kkm"
Mar 20 15:57:40 crc kubenswrapper[4730]: I0320 15:57:40.976417    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6f2w8"
Mar 20 15:57:40 crc kubenswrapper[4730]: I0320 15:57:40.997940    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-lrpjm"
Mar 20 15:57:42 crc kubenswrapper[4730]: I0320 15:57:42.662057    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-f8l2x"
Mar 20 15:57:42 crc kubenswrapper[4730]: I0320 15:57:42.880761    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 15:57:42 crc kubenswrapper[4730]: I0320 15:57:42.880848    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 15:57:43 crc kubenswrapper[4730]: I0320 15:57:43.099273    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6f58c59cbb-76ssq"
Mar 20 15:58:00 crc kubenswrapper[4730]: I0320 15:58:00.141371    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567038-qvgqb"]
Mar 20 15:58:00 crc kubenswrapper[4730]: I0320 15:58:00.143467    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567038-qvgqb"
Mar 20 15:58:00 crc kubenswrapper[4730]: I0320 15:58:00.146861    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 15:58:00 crc kubenswrapper[4730]: I0320 15:58:00.148158    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 15:58:00 crc kubenswrapper[4730]: I0320 15:58:00.150189    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 15:58:00 crc kubenswrapper[4730]: I0320 15:58:00.155925    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567038-qvgqb"]
Mar 20 15:58:00 crc kubenswrapper[4730]: I0320 15:58:00.232070    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvhrt\" (UniqueName: \"kubernetes.io/projected/67854402-4e0e-4ebe-b9d4-700669827780-kube-api-access-bvhrt\") pod \"auto-csr-approver-29567038-qvgqb\" (UID: \"67854402-4e0e-4ebe-b9d4-700669827780\") " pod="openshift-infra/auto-csr-approver-29567038-qvgqb"
Mar 20 15:58:00 crc kubenswrapper[4730]: I0320 15:58:00.333586    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvhrt\" (UniqueName: \"kubernetes.io/projected/67854402-4e0e-4ebe-b9d4-700669827780-kube-api-access-bvhrt\") pod \"auto-csr-approver-29567038-qvgqb\" (UID: \"67854402-4e0e-4ebe-b9d4-700669827780\") " pod="openshift-infra/auto-csr-approver-29567038-qvgqb"
Mar 20 15:58:00 crc kubenswrapper[4730]: I0320 15:58:00.359013    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvhrt\" (UniqueName: \"kubernetes.io/projected/67854402-4e0e-4ebe-b9d4-700669827780-kube-api-access-bvhrt\") pod \"auto-csr-approver-29567038-qvgqb\" (UID: \"67854402-4e0e-4ebe-b9d4-700669827780\") " pod="openshift-infra/auto-csr-approver-29567038-qvgqb"
Mar 20 15:58:00 crc kubenswrapper[4730]: I0320 15:58:00.467374    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567038-qvgqb"
Mar 20 15:58:00 crc kubenswrapper[4730]: I0320 15:58:00.918778    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567038-qvgqb"]
Mar 20 15:58:01 crc kubenswrapper[4730]: I0320 15:58:01.606782    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567038-qvgqb" event={"ID":"67854402-4e0e-4ebe-b9d4-700669827780","Type":"ContainerStarted","Data":"9440066ffdb65a07be2e47ff1767e0405bb8e3440041b92f4bf330c3197708ed"}
Mar 20 15:58:02 crc kubenswrapper[4730]: I0320 15:58:02.614130    4730 generic.go:334] "Generic (PLEG): container finished" podID="67854402-4e0e-4ebe-b9d4-700669827780" containerID="caec51e4f1b5d91020f11b5970f403cd0356b8c6fa1f260cecf4ea6e449980f1" exitCode=0
Mar 20 15:58:02 crc kubenswrapper[4730]: I0320 15:58:02.614412    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567038-qvgqb" event={"ID":"67854402-4e0e-4ebe-b9d4-700669827780","Type":"ContainerDied","Data":"caec51e4f1b5d91020f11b5970f403cd0356b8c6fa1f260cecf4ea6e449980f1"}
Mar 20 15:58:03 crc kubenswrapper[4730]: I0320 15:58:03.869548    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-784b55c5d9-5wk8n"]
Mar 20 15:58:03 crc kubenswrapper[4730]: I0320 15:58:03.870964    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-784b55c5d9-5wk8n"
Mar 20 15:58:03 crc kubenswrapper[4730]: I0320 15:58:03.873533    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Mar 20 15:58:03 crc kubenswrapper[4730]: I0320 15:58:03.873930    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-6gvmp"
Mar 20 15:58:03 crc kubenswrapper[4730]: I0320 15:58:03.874100    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Mar 20 15:58:03 crc kubenswrapper[4730]: I0320 15:58:03.874357    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Mar 20 15:58:03 crc kubenswrapper[4730]: I0320 15:58:03.892568    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-784b55c5d9-5wk8n"]
Mar 20 15:58:03 crc kubenswrapper[4730]: I0320 15:58:03.916447    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567038-qvgqb"
Mar 20 15:58:03 crc kubenswrapper[4730]: I0320 15:58:03.931413    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bf56b5889-hwkdn"]
Mar 20 15:58:03 crc kubenswrapper[4730]: E0320 15:58:03.932546    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67854402-4e0e-4ebe-b9d4-700669827780" containerName="oc"
Mar 20 15:58:03 crc kubenswrapper[4730]: I0320 15:58:03.932565    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="67854402-4e0e-4ebe-b9d4-700669827780" containerName="oc"
Mar 20 15:58:03 crc kubenswrapper[4730]: I0320 15:58:03.932693    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="67854402-4e0e-4ebe-b9d4-700669827780" containerName="oc"
Mar 20 15:58:03 crc kubenswrapper[4730]: I0320 15:58:03.933370    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bf56b5889-hwkdn"
Mar 20 15:58:03 crc kubenswrapper[4730]: I0320 15:58:03.935881    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Mar 20 15:58:03 crc kubenswrapper[4730]: I0320 15:58:03.955609    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bf56b5889-hwkdn"]
Mar 20 15:58:03 crc kubenswrapper[4730]: I0320 15:58:03.989230    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcd9cd10-c13e-446b-9dad-8b30f04de37e-config\") pod \"dnsmasq-dns-784b55c5d9-5wk8n\" (UID: \"dcd9cd10-c13e-446b-9dad-8b30f04de37e\") " pod="openstack/dnsmasq-dns-784b55c5d9-5wk8n"
Mar 20 15:58:03 crc kubenswrapper[4730]: I0320 15:58:03.989427    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqsxv\" (UniqueName: \"kubernetes.io/projected/dcd9cd10-c13e-446b-9dad-8b30f04de37e-kube-api-access-rqsxv\") pod \"dnsmasq-dns-784b55c5d9-5wk8n\" (UID: \"dcd9cd10-c13e-446b-9dad-8b30f04de37e\") " pod="openstack/dnsmasq-dns-784b55c5d9-5wk8n"
Mar 20 15:58:04 crc kubenswrapper[4730]: I0320 15:58:04.090652    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvhrt\" (UniqueName: \"kubernetes.io/projected/67854402-4e0e-4ebe-b9d4-700669827780-kube-api-access-bvhrt\") pod \"67854402-4e0e-4ebe-b9d4-700669827780\" (UID: \"67854402-4e0e-4ebe-b9d4-700669827780\") "
Mar 20 15:58:04 crc kubenswrapper[4730]: I0320 15:58:04.091089    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxh4d\" (UniqueName: \"kubernetes.io/projected/ae05e358-cb9f-4772-a644-8ec5131415eb-kube-api-access-hxh4d\") pod \"dnsmasq-dns-bf56b5889-hwkdn\" (UID: \"ae05e358-cb9f-4772-a644-8ec5131415eb\") " pod="openstack/dnsmasq-dns-bf56b5889-hwkdn"
Mar 20 15:58:04 crc kubenswrapper[4730]: I0320 15:58:04.091157    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae05e358-cb9f-4772-a644-8ec5131415eb-dns-svc\") pod \"dnsmasq-dns-bf56b5889-hwkdn\" (UID: \"ae05e358-cb9f-4772-a644-8ec5131415eb\") " pod="openstack/dnsmasq-dns-bf56b5889-hwkdn"
Mar 20 15:58:04 crc kubenswrapper[4730]: I0320 15:58:04.091294    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcd9cd10-c13e-446b-9dad-8b30f04de37e-config\") pod \"dnsmasq-dns-784b55c5d9-5wk8n\" (UID: \"dcd9cd10-c13e-446b-9dad-8b30f04de37e\") " pod="openstack/dnsmasq-dns-784b55c5d9-5wk8n"
Mar 20 15:58:04 crc kubenswrapper[4730]: I0320 15:58:04.091348    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqsxv\" (UniqueName: \"kubernetes.io/projected/dcd9cd10-c13e-446b-9dad-8b30f04de37e-kube-api-access-rqsxv\") pod \"dnsmasq-dns-784b55c5d9-5wk8n\" (UID: \"dcd9cd10-c13e-446b-9dad-8b30f04de37e\") " pod="openstack/dnsmasq-dns-784b55c5d9-5wk8n"
Mar 20 15:58:04 crc kubenswrapper[4730]: I0320 15:58:04.091400    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae05e358-cb9f-4772-a644-8ec5131415eb-config\") pod \"dnsmasq-dns-bf56b5889-hwkdn\" (UID: \"ae05e358-cb9f-4772-a644-8ec5131415eb\") " pod="openstack/dnsmasq-dns-bf56b5889-hwkdn"
Mar 20 15:58:04 crc kubenswrapper[4730]: I0320 15:58:04.092103    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcd9cd10-c13e-446b-9dad-8b30f04de37e-config\") pod \"dnsmasq-dns-784b55c5d9-5wk8n\" (UID: \"dcd9cd10-c13e-446b-9dad-8b30f04de37e\") " pod="openstack/dnsmasq-dns-784b55c5d9-5wk8n"
Mar 20 15:58:04 crc kubenswrapper[4730]: I0320 15:58:04.099420    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67854402-4e0e-4ebe-b9d4-700669827780-kube-api-access-bvhrt" (OuterVolumeSpecName: "kube-api-access-bvhrt") pod "67854402-4e0e-4ebe-b9d4-700669827780" (UID: "67854402-4e0e-4ebe-b9d4-700669827780"). InnerVolumeSpecName "kube-api-access-bvhrt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:58:04 crc kubenswrapper[4730]: I0320 15:58:04.111833    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqsxv\" (UniqueName: \"kubernetes.io/projected/dcd9cd10-c13e-446b-9dad-8b30f04de37e-kube-api-access-rqsxv\") pod \"dnsmasq-dns-784b55c5d9-5wk8n\" (UID: \"dcd9cd10-c13e-446b-9dad-8b30f04de37e\") " pod="openstack/dnsmasq-dns-784b55c5d9-5wk8n"
Mar 20 15:58:04 crc kubenswrapper[4730]: I0320 15:58:04.192926    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxh4d\" (UniqueName: \"kubernetes.io/projected/ae05e358-cb9f-4772-a644-8ec5131415eb-kube-api-access-hxh4d\") pod \"dnsmasq-dns-bf56b5889-hwkdn\" (UID: \"ae05e358-cb9f-4772-a644-8ec5131415eb\") " pod="openstack/dnsmasq-dns-bf56b5889-hwkdn"
Mar 20 15:58:04 crc kubenswrapper[4730]: I0320 15:58:04.193219    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae05e358-cb9f-4772-a644-8ec5131415eb-dns-svc\") pod \"dnsmasq-dns-bf56b5889-hwkdn\" (UID: \"ae05e358-cb9f-4772-a644-8ec5131415eb\") " pod="openstack/dnsmasq-dns-bf56b5889-hwkdn"
Mar 20 15:58:04 crc kubenswrapper[4730]: I0320 15:58:04.193365    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae05e358-cb9f-4772-a644-8ec5131415eb-config\") pod \"dnsmasq-dns-bf56b5889-hwkdn\" (UID: \"ae05e358-cb9f-4772-a644-8ec5131415eb\") " pod="openstack/dnsmasq-dns-bf56b5889-hwkdn"
Mar 20 15:58:04 crc kubenswrapper[4730]: I0320 15:58:04.193476    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvhrt\" (UniqueName: \"kubernetes.io/projected/67854402-4e0e-4ebe-b9d4-700669827780-kube-api-access-bvhrt\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:04 crc kubenswrapper[4730]: I0320 15:58:04.194224    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae05e358-cb9f-4772-a644-8ec5131415eb-config\") pod \"dnsmasq-dns-bf56b5889-hwkdn\" (UID: \"ae05e358-cb9f-4772-a644-8ec5131415eb\") " pod="openstack/dnsmasq-dns-bf56b5889-hwkdn"
Mar 20 15:58:04 crc kubenswrapper[4730]: I0320 15:58:04.194780    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae05e358-cb9f-4772-a644-8ec5131415eb-dns-svc\") pod \"dnsmasq-dns-bf56b5889-hwkdn\" (UID: \"ae05e358-cb9f-4772-a644-8ec5131415eb\") " pod="openstack/dnsmasq-dns-bf56b5889-hwkdn"
Mar 20 15:58:04 crc kubenswrapper[4730]: I0320 15:58:04.227732    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxh4d\" (UniqueName: \"kubernetes.io/projected/ae05e358-cb9f-4772-a644-8ec5131415eb-kube-api-access-hxh4d\") pod \"dnsmasq-dns-bf56b5889-hwkdn\" (UID: \"ae05e358-cb9f-4772-a644-8ec5131415eb\") " pod="openstack/dnsmasq-dns-bf56b5889-hwkdn"
Mar 20 15:58:04 crc kubenswrapper[4730]: I0320 15:58:04.232157    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-784b55c5d9-5wk8n"
Mar 20 15:58:04 crc kubenswrapper[4730]: I0320 15:58:04.253894    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bf56b5889-hwkdn"
Mar 20 15:58:04 crc kubenswrapper[4730]: I0320 15:58:04.631112    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567038-qvgqb" event={"ID":"67854402-4e0e-4ebe-b9d4-700669827780","Type":"ContainerDied","Data":"9440066ffdb65a07be2e47ff1767e0405bb8e3440041b92f4bf330c3197708ed"}
Mar 20 15:58:04 crc kubenswrapper[4730]: I0320 15:58:04.631157    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9440066ffdb65a07be2e47ff1767e0405bb8e3440041b92f4bf330c3197708ed"
Mar 20 15:58:04 crc kubenswrapper[4730]: I0320 15:58:04.631137    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567038-qvgqb"
Mar 20 15:58:04 crc kubenswrapper[4730]: I0320 15:58:04.705355    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-784b55c5d9-5wk8n"]
Mar 20 15:58:04 crc kubenswrapper[4730]: I0320 15:58:04.741930    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bf56b5889-hwkdn"]
Mar 20 15:58:04 crc kubenswrapper[4730]: W0320 15:58:04.752956    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae05e358_cb9f_4772_a644_8ec5131415eb.slice/crio-0f15db04a290faf8bce634d61170874353004570e3569dc452521a6a7ac01ec4 WatchSource:0}: Error finding container 0f15db04a290faf8bce634d61170874353004570e3569dc452521a6a7ac01ec4: Status 404 returned error can't find the container with id 0f15db04a290faf8bce634d61170874353004570e3569dc452521a6a7ac01ec4
Mar 20 15:58:04 crc kubenswrapper[4730]: I0320 15:58:04.992097    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567032-mfvhl"]
Mar 20 15:58:05 crc kubenswrapper[4730]: I0320 15:58:05.000050    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567032-mfvhl"]
Mar 20 15:58:05 crc kubenswrapper[4730]: I0320 15:58:05.542644    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76deb34d-7c3d-4510-9b0a-ac56dcca047a" path="/var/lib/kubelet/pods/76deb34d-7c3d-4510-9b0a-ac56dcca047a/volumes"
Mar 20 15:58:05 crc kubenswrapper[4730]: I0320 15:58:05.640945    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784b55c5d9-5wk8n" event={"ID":"dcd9cd10-c13e-446b-9dad-8b30f04de37e","Type":"ContainerStarted","Data":"cc74e55810f8f5b64f9744e711295d16d00bc26b41f45c20f60b7913df187ae8"}
Mar 20 15:58:05 crc kubenswrapper[4730]: I0320 15:58:05.643087    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf56b5889-hwkdn" event={"ID":"ae05e358-cb9f-4772-a644-8ec5131415eb","Type":"ContainerStarted","Data":"0f15db04a290faf8bce634d61170874353004570e3569dc452521a6a7ac01ec4"}
Mar 20 15:58:07 crc kubenswrapper[4730]: I0320 15:58:07.671234    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-784b55c5d9-5wk8n"]
Mar 20 15:58:07 crc kubenswrapper[4730]: I0320 15:58:07.690800    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68d64f5f8f-ncpth"]
Mar 20 15:58:07 crc kubenswrapper[4730]: I0320 15:58:07.692044    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68d64f5f8f-ncpth"
Mar 20 15:58:07 crc kubenswrapper[4730]: I0320 15:58:07.703272    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68d64f5f8f-ncpth"]
Mar 20 15:58:07 crc kubenswrapper[4730]: I0320 15:58:07.843728    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrdjk\" (UniqueName: \"kubernetes.io/projected/7a5c1062-c366-4407-a395-cc3ad80ed296-kube-api-access-mrdjk\") pod \"dnsmasq-dns-68d64f5f8f-ncpth\" (UID: \"7a5c1062-c366-4407-a395-cc3ad80ed296\") " pod="openstack/dnsmasq-dns-68d64f5f8f-ncpth"
Mar 20 15:58:07 crc kubenswrapper[4730]: I0320 15:58:07.843821    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a5c1062-c366-4407-a395-cc3ad80ed296-config\") pod \"dnsmasq-dns-68d64f5f8f-ncpth\" (UID: \"7a5c1062-c366-4407-a395-cc3ad80ed296\") " pod="openstack/dnsmasq-dns-68d64f5f8f-ncpth"
Mar 20 15:58:07 crc kubenswrapper[4730]: I0320 15:58:07.843943    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a5c1062-c366-4407-a395-cc3ad80ed296-dns-svc\") pod \"dnsmasq-dns-68d64f5f8f-ncpth\" (UID: \"7a5c1062-c366-4407-a395-cc3ad80ed296\") " pod="openstack/dnsmasq-dns-68d64f5f8f-ncpth"
Mar 20 15:58:07 crc kubenswrapper[4730]: I0320 15:58:07.935694    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bf56b5889-hwkdn"]
Mar 20 15:58:07 crc kubenswrapper[4730]: I0320 15:58:07.945654    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a5c1062-c366-4407-a395-cc3ad80ed296-dns-svc\") pod \"dnsmasq-dns-68d64f5f8f-ncpth\" (UID: \"7a5c1062-c366-4407-a395-cc3ad80ed296\") " pod="openstack/dnsmasq-dns-68d64f5f8f-ncpth"
Mar 20 15:58:07 crc kubenswrapper[4730]: I0320 15:58:07.945742    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrdjk\" (UniqueName: \"kubernetes.io/projected/7a5c1062-c366-4407-a395-cc3ad80ed296-kube-api-access-mrdjk\") pod \"dnsmasq-dns-68d64f5f8f-ncpth\" (UID: \"7a5c1062-c366-4407-a395-cc3ad80ed296\") " pod="openstack/dnsmasq-dns-68d64f5f8f-ncpth"
Mar 20 15:58:07 crc kubenswrapper[4730]: I0320 15:58:07.945807    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a5c1062-c366-4407-a395-cc3ad80ed296-config\") pod \"dnsmasq-dns-68d64f5f8f-ncpth\" (UID: \"7a5c1062-c366-4407-a395-cc3ad80ed296\") " pod="openstack/dnsmasq-dns-68d64f5f8f-ncpth"
Mar 20 15:58:07 crc kubenswrapper[4730]: I0320 15:58:07.946744    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a5c1062-c366-4407-a395-cc3ad80ed296-dns-svc\") pod \"dnsmasq-dns-68d64f5f8f-ncpth\" (UID: \"7a5c1062-c366-4407-a395-cc3ad80ed296\") " pod="openstack/dnsmasq-dns-68d64f5f8f-ncpth"
Mar 20 15:58:07 crc kubenswrapper[4730]: I0320 15:58:07.946996    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a5c1062-c366-4407-a395-cc3ad80ed296-config\") pod \"dnsmasq-dns-68d64f5f8f-ncpth\" (UID: \"7a5c1062-c366-4407-a395-cc3ad80ed296\") " pod="openstack/dnsmasq-dns-68d64f5f8f-ncpth"
Mar 20 15:58:07 crc kubenswrapper[4730]: I0320 15:58:07.972615    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrdjk\" (UniqueName: \"kubernetes.io/projected/7a5c1062-c366-4407-a395-cc3ad80ed296-kube-api-access-mrdjk\") pod \"dnsmasq-dns-68d64f5f8f-ncpth\" (UID: \"7a5c1062-c366-4407-a395-cc3ad80ed296\") " pod="openstack/dnsmasq-dns-68d64f5f8f-ncpth"
Mar 20 15:58:07 crc kubenswrapper[4730]: I0320 15:58:07.972684    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7847d45595-fnchx"]
Mar 20 15:58:07 crc kubenswrapper[4730]: I0320 15:58:07.974214    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7847d45595-fnchx"
Mar 20 15:58:07 crc kubenswrapper[4730]: I0320 15:58:07.979617    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7847d45595-fnchx"]
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.014747    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68d64f5f8f-ncpth"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.046809    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27dbbb52-2bd1-4e24-b621-128e7c880a2b-config\") pod \"dnsmasq-dns-7847d45595-fnchx\" (UID: \"27dbbb52-2bd1-4e24-b621-128e7c880a2b\") " pod="openstack/dnsmasq-dns-7847d45595-fnchx"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.046910    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgkz6\" (UniqueName: \"kubernetes.io/projected/27dbbb52-2bd1-4e24-b621-128e7c880a2b-kube-api-access-dgkz6\") pod \"dnsmasq-dns-7847d45595-fnchx\" (UID: \"27dbbb52-2bd1-4e24-b621-128e7c880a2b\") " pod="openstack/dnsmasq-dns-7847d45595-fnchx"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.046969    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27dbbb52-2bd1-4e24-b621-128e7c880a2b-dns-svc\") pod \"dnsmasq-dns-7847d45595-fnchx\" (UID: \"27dbbb52-2bd1-4e24-b621-128e7c880a2b\") " pod="openstack/dnsmasq-dns-7847d45595-fnchx"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.148007    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27dbbb52-2bd1-4e24-b621-128e7c880a2b-config\") pod \"dnsmasq-dns-7847d45595-fnchx\" (UID: \"27dbbb52-2bd1-4e24-b621-128e7c880a2b\") " pod="openstack/dnsmasq-dns-7847d45595-fnchx"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.148112    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgkz6\" (UniqueName: \"kubernetes.io/projected/27dbbb52-2bd1-4e24-b621-128e7c880a2b-kube-api-access-dgkz6\") pod \"dnsmasq-dns-7847d45595-fnchx\" (UID: \"27dbbb52-2bd1-4e24-b621-128e7c880a2b\") " pod="openstack/dnsmasq-dns-7847d45595-fnchx"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.148208    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27dbbb52-2bd1-4e24-b621-128e7c880a2b-dns-svc\") pod \"dnsmasq-dns-7847d45595-fnchx\" (UID: \"27dbbb52-2bd1-4e24-b621-128e7c880a2b\") " pod="openstack/dnsmasq-dns-7847d45595-fnchx"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.148945    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27dbbb52-2bd1-4e24-b621-128e7c880a2b-dns-svc\") pod \"dnsmasq-dns-7847d45595-fnchx\" (UID: \"27dbbb52-2bd1-4e24-b621-128e7c880a2b\") " pod="openstack/dnsmasq-dns-7847d45595-fnchx"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.149017    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27dbbb52-2bd1-4e24-b621-128e7c880a2b-config\") pod \"dnsmasq-dns-7847d45595-fnchx\" (UID: \"27dbbb52-2bd1-4e24-b621-128e7c880a2b\") " pod="openstack/dnsmasq-dns-7847d45595-fnchx"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.176218    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgkz6\" (UniqueName: \"kubernetes.io/projected/27dbbb52-2bd1-4e24-b621-128e7c880a2b-kube-api-access-dgkz6\") pod \"dnsmasq-dns-7847d45595-fnchx\" (UID: \"27dbbb52-2bd1-4e24-b621-128e7c880a2b\") " pod="openstack/dnsmasq-dns-7847d45595-fnchx"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.254351    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68d64f5f8f-ncpth"]
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.274618    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74bcc47849-2r2xb"]
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.275710    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74bcc47849-2r2xb"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.293066    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74bcc47849-2r2xb"]
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.334350    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7847d45595-fnchx"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.350544    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d94g\" (UniqueName: \"kubernetes.io/projected/a5e88dae-c3fd-456c-92c6-3bc143b5a399-kube-api-access-7d94g\") pod \"dnsmasq-dns-74bcc47849-2r2xb\" (UID: \"a5e88dae-c3fd-456c-92c6-3bc143b5a399\") " pod="openstack/dnsmasq-dns-74bcc47849-2r2xb"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.350589    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5e88dae-c3fd-456c-92c6-3bc143b5a399-config\") pod \"dnsmasq-dns-74bcc47849-2r2xb\" (UID: \"a5e88dae-c3fd-456c-92c6-3bc143b5a399\") " pod="openstack/dnsmasq-dns-74bcc47849-2r2xb"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.350644    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5e88dae-c3fd-456c-92c6-3bc143b5a399-dns-svc\") pod \"dnsmasq-dns-74bcc47849-2r2xb\" (UID: \"a5e88dae-c3fd-456c-92c6-3bc143b5a399\") " pod="openstack/dnsmasq-dns-74bcc47849-2r2xb"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.452297    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d94g\" (UniqueName: \"kubernetes.io/projected/a5e88dae-c3fd-456c-92c6-3bc143b5a399-kube-api-access-7d94g\") pod \"dnsmasq-dns-74bcc47849-2r2xb\" (UID: \"a5e88dae-c3fd-456c-92c6-3bc143b5a399\") " pod="openstack/dnsmasq-dns-74bcc47849-2r2xb"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.452617    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5e88dae-c3fd-456c-92c6-3bc143b5a399-config\") pod \"dnsmasq-dns-74bcc47849-2r2xb\" (UID: \"a5e88dae-c3fd-456c-92c6-3bc143b5a399\") " pod="openstack/dnsmasq-dns-74bcc47849-2r2xb"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.452738    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5e88dae-c3fd-456c-92c6-3bc143b5a399-dns-svc\") pod \"dnsmasq-dns-74bcc47849-2r2xb\" (UID: \"a5e88dae-c3fd-456c-92c6-3bc143b5a399\") " pod="openstack/dnsmasq-dns-74bcc47849-2r2xb"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.453864    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5e88dae-c3fd-456c-92c6-3bc143b5a399-dns-svc\") pod \"dnsmasq-dns-74bcc47849-2r2xb\" (UID: \"a5e88dae-c3fd-456c-92c6-3bc143b5a399\") " pod="openstack/dnsmasq-dns-74bcc47849-2r2xb"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.453928    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5e88dae-c3fd-456c-92c6-3bc143b5a399-config\") pod \"dnsmasq-dns-74bcc47849-2r2xb\" (UID: \"a5e88dae-c3fd-456c-92c6-3bc143b5a399\") " pod="openstack/dnsmasq-dns-74bcc47849-2r2xb"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.470786    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d94g\" (UniqueName: \"kubernetes.io/projected/a5e88dae-c3fd-456c-92c6-3bc143b5a399-kube-api-access-7d94g\") pod \"dnsmasq-dns-74bcc47849-2r2xb\" (UID: \"a5e88dae-c3fd-456c-92c6-3bc143b5a399\") " pod="openstack/dnsmasq-dns-74bcc47849-2r2xb"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.592697    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74bcc47849-2r2xb"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.832722    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.838569    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.841638    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.841977    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.843959    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-rwcvj"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.844493    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.845617    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.846091    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.846308    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.853688    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.963083    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.963151    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.963265    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4jnf\" (UniqueName: \"kubernetes.io/projected/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-kube-api-access-c4jnf\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.963491    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.963563    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-config-data\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.963602    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.963632    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.963752    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.963814    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.963851    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:08 crc kubenswrapper[4730]: I0320 15:58:08.963898    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.065203    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4jnf\" (UniqueName: \"kubernetes.io/projected/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-kube-api-access-c4jnf\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.065281    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.065313    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-config-data\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.065344    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.065367    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.065401    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.065428    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.065448    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.065476    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.065510    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.065544    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.066389    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.066470    4730 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.067087    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.067367    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-config-data\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.067406    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.068530    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.071925    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.072226    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.072437    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.074888    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.087152    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4jnf\" (UniqueName: \"kubernetes.io/projected/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-kube-api-access-c4jnf\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.089484    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.119806    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/notifications-rabbitmq-server-0"]
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.121227    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.126726    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"notifications-rabbitmq-config-data"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.127740    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-notifications-rabbitmq-svc"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.127906    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"notifications-rabbitmq-server-dockercfg-sl7kk"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.170327    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"notifications-rabbitmq-erlang-cookie"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.170655    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"notifications-rabbitmq-default-user"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.170759    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"notifications-rabbitmq-plugins-conf"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.170803    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"notifications-rabbitmq-server-conf"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.171069    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.179955    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/notifications-rabbitmq-server-0"]
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.273151    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/df9ca02d-e20f-4f55-ba14-92b91812afb6-plugins-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.273199    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/df9ca02d-e20f-4f55-ba14-92b91812afb6-rabbitmq-confd\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.273229    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/df9ca02d-e20f-4f55-ba14-92b91812afb6-server-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.273283    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/df9ca02d-e20f-4f55-ba14-92b91812afb6-pod-info\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.273320    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/df9ca02d-e20f-4f55-ba14-92b91812afb6-erlang-cookie-secret\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.273574    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/df9ca02d-e20f-4f55-ba14-92b91812afb6-config-data\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.273658    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj8vn\" (UniqueName: \"kubernetes.io/projected/df9ca02d-e20f-4f55-ba14-92b91812afb6-kube-api-access-rj8vn\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.273821    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.273913    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/df9ca02d-e20f-4f55-ba14-92b91812afb6-rabbitmq-plugins\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.273969    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/df9ca02d-e20f-4f55-ba14-92b91812afb6-rabbitmq-erlang-cookie\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.274053    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/df9ca02d-e20f-4f55-ba14-92b91812afb6-rabbitmq-tls\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.375576    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/df9ca02d-e20f-4f55-ba14-92b91812afb6-config-data\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.375650    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj8vn\" (UniqueName: \"kubernetes.io/projected/df9ca02d-e20f-4f55-ba14-92b91812afb6-kube-api-access-rj8vn\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.376310    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.376508    4730 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.377570    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/df9ca02d-e20f-4f55-ba14-92b91812afb6-rabbitmq-plugins\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.377661    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/df9ca02d-e20f-4f55-ba14-92b91812afb6-rabbitmq-erlang-cookie\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.377806    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/df9ca02d-e20f-4f55-ba14-92b91812afb6-rabbitmq-tls\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.377854    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/df9ca02d-e20f-4f55-ba14-92b91812afb6-plugins-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.377901    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/df9ca02d-e20f-4f55-ba14-92b91812afb6-rabbitmq-confd\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.378071    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/df9ca02d-e20f-4f55-ba14-92b91812afb6-server-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.378118    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/df9ca02d-e20f-4f55-ba14-92b91812afb6-rabbitmq-plugins\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.378135    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/df9ca02d-e20f-4f55-ba14-92b91812afb6-pod-info\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.379071    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/df9ca02d-e20f-4f55-ba14-92b91812afb6-rabbitmq-erlang-cookie\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.379810    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/df9ca02d-e20f-4f55-ba14-92b91812afb6-plugins-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.379863    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/df9ca02d-e20f-4f55-ba14-92b91812afb6-config-data\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.380201    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/df9ca02d-e20f-4f55-ba14-92b91812afb6-server-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.380536    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/df9ca02d-e20f-4f55-ba14-92b91812afb6-erlang-cookie-secret\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.382390    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/df9ca02d-e20f-4f55-ba14-92b91812afb6-rabbitmq-confd\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.383060    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/df9ca02d-e20f-4f55-ba14-92b91812afb6-rabbitmq-tls\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.390140    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/df9ca02d-e20f-4f55-ba14-92b91812afb6-pod-info\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.390790    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/df9ca02d-e20f-4f55-ba14-92b91812afb6-erlang-cookie-secret\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.392163    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj8vn\" (UniqueName: \"kubernetes.io/projected/df9ca02d-e20f-4f55-ba14-92b91812afb6-kube-api-access-rj8vn\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.417559    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"notifications-rabbitmq-server-0\" (UID: \"df9ca02d-e20f-4f55-ba14-92b91812afb6\") " pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.434004    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.438255    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.444501    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.444781    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.444920    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.445271    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.445493    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.445627    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-dlhcb"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.446344    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.469939    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.485671    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.583584    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8043f69c-832c-4afa-a9b9-211507664805-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.583838    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8043f69c-832c-4afa-a9b9-211507664805-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.583921    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.584011    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8043f69c-832c-4afa-a9b9-211507664805-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.584118    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8043f69c-832c-4afa-a9b9-211507664805-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.584203    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8043f69c-832c-4afa-a9b9-211507664805-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.584298    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8043f69c-832c-4afa-a9b9-211507664805-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.584372    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8043f69c-832c-4afa-a9b9-211507664805-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.584483    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8043f69c-832c-4afa-a9b9-211507664805-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.584599    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8043f69c-832c-4afa-a9b9-211507664805-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.584688    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vth2k\" (UniqueName: \"kubernetes.io/projected/8043f69c-832c-4afa-a9b9-211507664805-kube-api-access-vth2k\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.686909    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vth2k\" (UniqueName: \"kubernetes.io/projected/8043f69c-832c-4afa-a9b9-211507664805-kube-api-access-vth2k\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.686967    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8043f69c-832c-4afa-a9b9-211507664805-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.686991    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8043f69c-832c-4afa-a9b9-211507664805-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.687042    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.687072    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8043f69c-832c-4afa-a9b9-211507664805-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.687089    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8043f69c-832c-4afa-a9b9-211507664805-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.687112    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8043f69c-832c-4afa-a9b9-211507664805-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.687135    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8043f69c-832c-4afa-a9b9-211507664805-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.687152    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8043f69c-832c-4afa-a9b9-211507664805-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.687189    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8043f69c-832c-4afa-a9b9-211507664805-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.687223    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8043f69c-832c-4afa-a9b9-211507664805-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.688172    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8043f69c-832c-4afa-a9b9-211507664805-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.688654    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8043f69c-832c-4afa-a9b9-211507664805-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.688919    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8043f69c-832c-4afa-a9b9-211507664805-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.689551    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8043f69c-832c-4afa-a9b9-211507664805-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.689662    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8043f69c-832c-4afa-a9b9-211507664805-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.690219    4730 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.694198    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8043f69c-832c-4afa-a9b9-211507664805-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.696405    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8043f69c-832c-4afa-a9b9-211507664805-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.705803    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8043f69c-832c-4afa-a9b9-211507664805-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.712945    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vth2k\" (UniqueName: \"kubernetes.io/projected/8043f69c-832c-4afa-a9b9-211507664805-kube-api-access-vth2k\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.724879    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8043f69c-832c-4afa-a9b9-211507664805-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.729457    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:09 crc kubenswrapper[4730]: I0320 15:58:09.782450    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:58:10 crc kubenswrapper[4730]: I0320 15:58:10.802782    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Mar 20 15:58:10 crc kubenswrapper[4730]: I0320 15:58:10.806998    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 20 15:58:10 crc kubenswrapper[4730]: I0320 15:58:10.811440    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Mar 20 15:58:10 crc kubenswrapper[4730]: I0320 15:58:10.812257    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-8d9gs"
Mar 20 15:58:10 crc kubenswrapper[4730]: I0320 15:58:10.812402    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Mar 20 15:58:10 crc kubenswrapper[4730]: I0320 15:58:10.812742    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Mar 20 15:58:10 crc kubenswrapper[4730]: I0320 15:58:10.818808    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Mar 20 15:58:10 crc kubenswrapper[4730]: I0320 15:58:10.824134    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 20 15:58:10 crc kubenswrapper[4730]: I0320 15:58:10.910521    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9glhm\" (UniqueName: \"kubernetes.io/projected/6abf778f-200f-4d48-97b6-08a638b4efa2-kube-api-access-9glhm\") pod \"openstack-galera-0\" (UID: \"6abf778f-200f-4d48-97b6-08a638b4efa2\") " pod="openstack/openstack-galera-0"
Mar 20 15:58:10 crc kubenswrapper[4730]: I0320 15:58:10.910631    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6abf778f-200f-4d48-97b6-08a638b4efa2-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6abf778f-200f-4d48-97b6-08a638b4efa2\") " pod="openstack/openstack-galera-0"
Mar 20 15:58:10 crc kubenswrapper[4730]: I0320 15:58:10.910680    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6abf778f-200f-4d48-97b6-08a638b4efa2-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6abf778f-200f-4d48-97b6-08a638b4efa2\") " pod="openstack/openstack-galera-0"
Mar 20 15:58:10 crc kubenswrapper[4730]: I0320 15:58:10.910708    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6abf778f-200f-4d48-97b6-08a638b4efa2-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6abf778f-200f-4d48-97b6-08a638b4efa2\") " pod="openstack/openstack-galera-0"
Mar 20 15:58:10 crc kubenswrapper[4730]: I0320 15:58:10.910761    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6abf778f-200f-4d48-97b6-08a638b4efa2-config-data-default\") pod \"openstack-galera-0\" (UID: \"6abf778f-200f-4d48-97b6-08a638b4efa2\") " pod="openstack/openstack-galera-0"
Mar 20 15:58:10 crc kubenswrapper[4730]: I0320 15:58:10.910786    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6abf778f-200f-4d48-97b6-08a638b4efa2-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6abf778f-200f-4d48-97b6-08a638b4efa2\") " pod="openstack/openstack-galera-0"
Mar 20 15:58:10 crc kubenswrapper[4730]: I0320 15:58:10.910824    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6abf778f-200f-4d48-97b6-08a638b4efa2-kolla-config\") pod \"openstack-galera-0\" (UID: \"6abf778f-200f-4d48-97b6-08a638b4efa2\") " pod="openstack/openstack-galera-0"
Mar 20 15:58:10 crc kubenswrapper[4730]: I0320 15:58:10.910851    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"6abf778f-200f-4d48-97b6-08a638b4efa2\") " pod="openstack/openstack-galera-0"
Mar 20 15:58:11 crc kubenswrapper[4730]: I0320 15:58:11.012050    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6abf778f-200f-4d48-97b6-08a638b4efa2-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6abf778f-200f-4d48-97b6-08a638b4efa2\") " pod="openstack/openstack-galera-0"
Mar 20 15:58:11 crc kubenswrapper[4730]: I0320 15:58:11.012097    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6abf778f-200f-4d48-97b6-08a638b4efa2-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6abf778f-200f-4d48-97b6-08a638b4efa2\") " pod="openstack/openstack-galera-0"
Mar 20 15:58:11 crc kubenswrapper[4730]: I0320 15:58:11.012130    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6abf778f-200f-4d48-97b6-08a638b4efa2-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6abf778f-200f-4d48-97b6-08a638b4efa2\") " pod="openstack/openstack-galera-0"
Mar 20 15:58:11 crc kubenswrapper[4730]: I0320 15:58:11.012161    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6abf778f-200f-4d48-97b6-08a638b4efa2-config-data-default\") pod \"openstack-galera-0\" (UID: \"6abf778f-200f-4d48-97b6-08a638b4efa2\") " pod="openstack/openstack-galera-0"
Mar 20 15:58:11 crc kubenswrapper[4730]: I0320 15:58:11.012181    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6abf778f-200f-4d48-97b6-08a638b4efa2-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6abf778f-200f-4d48-97b6-08a638b4efa2\") " pod="openstack/openstack-galera-0"
Mar 20 15:58:11 crc kubenswrapper[4730]: I0320 15:58:11.012199    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6abf778f-200f-4d48-97b6-08a638b4efa2-kolla-config\") pod \"openstack-galera-0\" (UID: \"6abf778f-200f-4d48-97b6-08a638b4efa2\") " pod="openstack/openstack-galera-0"
Mar 20 15:58:11 crc kubenswrapper[4730]: I0320 15:58:11.012221    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"6abf778f-200f-4d48-97b6-08a638b4efa2\") " pod="openstack/openstack-galera-0"
Mar 20 15:58:11 crc kubenswrapper[4730]: I0320 15:58:11.012286    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9glhm\" (UniqueName: \"kubernetes.io/projected/6abf778f-200f-4d48-97b6-08a638b4efa2-kube-api-access-9glhm\") pod \"openstack-galera-0\" (UID: \"6abf778f-200f-4d48-97b6-08a638b4efa2\") " pod="openstack/openstack-galera-0"
Mar 20 15:58:11 crc kubenswrapper[4730]: I0320 15:58:11.012867    4730 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"6abf778f-200f-4d48-97b6-08a638b4efa2\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-galera-0"
Mar 20 15:58:11 crc kubenswrapper[4730]: I0320 15:58:11.013315    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6abf778f-200f-4d48-97b6-08a638b4efa2-kolla-config\") pod \"openstack-galera-0\" (UID: \"6abf778f-200f-4d48-97b6-08a638b4efa2\") " pod="openstack/openstack-galera-0"
Mar 20 15:58:11 crc kubenswrapper[4730]: I0320 15:58:11.013510    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6abf778f-200f-4d48-97b6-08a638b4efa2-config-data-default\") pod \"openstack-galera-0\" (UID: \"6abf778f-200f-4d48-97b6-08a638b4efa2\") " pod="openstack/openstack-galera-0"
Mar 20 15:58:11 crc kubenswrapper[4730]: I0320 15:58:11.013958    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6abf778f-200f-4d48-97b6-08a638b4efa2-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6abf778f-200f-4d48-97b6-08a638b4efa2\") " pod="openstack/openstack-galera-0"
Mar 20 15:58:11 crc kubenswrapper[4730]: I0320 15:58:11.014014    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6abf778f-200f-4d48-97b6-08a638b4efa2-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6abf778f-200f-4d48-97b6-08a638b4efa2\") " pod="openstack/openstack-galera-0"
Mar 20 15:58:11 crc kubenswrapper[4730]: I0320 15:58:11.018154    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6abf778f-200f-4d48-97b6-08a638b4efa2-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6abf778f-200f-4d48-97b6-08a638b4efa2\") " pod="openstack/openstack-galera-0"
Mar 20 15:58:11 crc kubenswrapper[4730]: I0320 15:58:11.030963    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6abf778f-200f-4d48-97b6-08a638b4efa2-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6abf778f-200f-4d48-97b6-08a638b4efa2\") " pod="openstack/openstack-galera-0"
Mar 20 15:58:11 crc kubenswrapper[4730]: I0320 15:58:11.035873    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9glhm\" (UniqueName: \"kubernetes.io/projected/6abf778f-200f-4d48-97b6-08a638b4efa2-kube-api-access-9glhm\") pod \"openstack-galera-0\" (UID: \"6abf778f-200f-4d48-97b6-08a638b4efa2\") " pod="openstack/openstack-galera-0"
Mar 20 15:58:11 crc kubenswrapper[4730]: I0320 15:58:11.040825    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"6abf778f-200f-4d48-97b6-08a638b4efa2\") " pod="openstack/openstack-galera-0"
Mar 20 15:58:11 crc kubenswrapper[4730]: I0320 15:58:11.129944    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.232598    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.234161    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.237547    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-ghtpc"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.238175    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.238205    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.238337    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.258718    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.332354    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/899bd9ae-9354-4e70-ad37-b438a5a33a24-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"899bd9ae-9354-4e70-ad37-b438a5a33a24\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.332439    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"899bd9ae-9354-4e70-ad37-b438a5a33a24\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.332474    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/899bd9ae-9354-4e70-ad37-b438a5a33a24-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"899bd9ae-9354-4e70-ad37-b438a5a33a24\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.332542    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/899bd9ae-9354-4e70-ad37-b438a5a33a24-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"899bd9ae-9354-4e70-ad37-b438a5a33a24\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.332571    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9x25\" (UniqueName: \"kubernetes.io/projected/899bd9ae-9354-4e70-ad37-b438a5a33a24-kube-api-access-f9x25\") pod \"openstack-cell1-galera-0\" (UID: \"899bd9ae-9354-4e70-ad37-b438a5a33a24\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.332598    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/899bd9ae-9354-4e70-ad37-b438a5a33a24-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"899bd9ae-9354-4e70-ad37-b438a5a33a24\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.332631    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/899bd9ae-9354-4e70-ad37-b438a5a33a24-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"899bd9ae-9354-4e70-ad37-b438a5a33a24\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.332670    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/899bd9ae-9354-4e70-ad37-b438a5a33a24-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"899bd9ae-9354-4e70-ad37-b438a5a33a24\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.349282    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.350439    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.352468    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.352667    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-8d8cg"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.352843    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.354205    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.433933    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/899bd9ae-9354-4e70-ad37-b438a5a33a24-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"899bd9ae-9354-4e70-ad37-b438a5a33a24\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.434017    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/899bd9ae-9354-4e70-ad37-b438a5a33a24-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"899bd9ae-9354-4e70-ad37-b438a5a33a24\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.434067    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84bbdebb-43de-41d6-82d4-71b0948c25f8-combined-ca-bundle\") pod \"memcached-0\" (UID: \"84bbdebb-43de-41d6-82d4-71b0948c25f8\") " pod="openstack/memcached-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.434093    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"899bd9ae-9354-4e70-ad37-b438a5a33a24\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.434122    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/899bd9ae-9354-4e70-ad37-b438a5a33a24-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"899bd9ae-9354-4e70-ad37-b438a5a33a24\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.434143    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vggm6\" (UniqueName: \"kubernetes.io/projected/84bbdebb-43de-41d6-82d4-71b0948c25f8-kube-api-access-vggm6\") pod \"memcached-0\" (UID: \"84bbdebb-43de-41d6-82d4-71b0948c25f8\") " pod="openstack/memcached-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.434177    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/84bbdebb-43de-41d6-82d4-71b0948c25f8-kolla-config\") pod \"memcached-0\" (UID: \"84bbdebb-43de-41d6-82d4-71b0948c25f8\") " pod="openstack/memcached-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.434201    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/899bd9ae-9354-4e70-ad37-b438a5a33a24-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"899bd9ae-9354-4e70-ad37-b438a5a33a24\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.434226    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9x25\" (UniqueName: \"kubernetes.io/projected/899bd9ae-9354-4e70-ad37-b438a5a33a24-kube-api-access-f9x25\") pod \"openstack-cell1-galera-0\" (UID: \"899bd9ae-9354-4e70-ad37-b438a5a33a24\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.434346    4730 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"899bd9ae-9354-4e70-ad37-b438a5a33a24\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-cell1-galera-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.435186    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/899bd9ae-9354-4e70-ad37-b438a5a33a24-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"899bd9ae-9354-4e70-ad37-b438a5a33a24\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.435304    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/899bd9ae-9354-4e70-ad37-b438a5a33a24-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"899bd9ae-9354-4e70-ad37-b438a5a33a24\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.435533    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/899bd9ae-9354-4e70-ad37-b438a5a33a24-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"899bd9ae-9354-4e70-ad37-b438a5a33a24\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.436576    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/899bd9ae-9354-4e70-ad37-b438a5a33a24-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"899bd9ae-9354-4e70-ad37-b438a5a33a24\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.436656    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/899bd9ae-9354-4e70-ad37-b438a5a33a24-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"899bd9ae-9354-4e70-ad37-b438a5a33a24\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.436988    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/899bd9ae-9354-4e70-ad37-b438a5a33a24-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"899bd9ae-9354-4e70-ad37-b438a5a33a24\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.437605    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/84bbdebb-43de-41d6-82d4-71b0948c25f8-memcached-tls-certs\") pod \"memcached-0\" (UID: \"84bbdebb-43de-41d6-82d4-71b0948c25f8\") " pod="openstack/memcached-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.437654    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/84bbdebb-43de-41d6-82d4-71b0948c25f8-config-data\") pod \"memcached-0\" (UID: \"84bbdebb-43de-41d6-82d4-71b0948c25f8\") " pod="openstack/memcached-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.440292    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/899bd9ae-9354-4e70-ad37-b438a5a33a24-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"899bd9ae-9354-4e70-ad37-b438a5a33a24\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.450516    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/899bd9ae-9354-4e70-ad37-b438a5a33a24-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"899bd9ae-9354-4e70-ad37-b438a5a33a24\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.454153    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9x25\" (UniqueName: \"kubernetes.io/projected/899bd9ae-9354-4e70-ad37-b438a5a33a24-kube-api-access-f9x25\") pod \"openstack-cell1-galera-0\" (UID: \"899bd9ae-9354-4e70-ad37-b438a5a33a24\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.468904    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"899bd9ae-9354-4e70-ad37-b438a5a33a24\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.539008    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vggm6\" (UniqueName: \"kubernetes.io/projected/84bbdebb-43de-41d6-82d4-71b0948c25f8-kube-api-access-vggm6\") pod \"memcached-0\" (UID: \"84bbdebb-43de-41d6-82d4-71b0948c25f8\") " pod="openstack/memcached-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.539056    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/84bbdebb-43de-41d6-82d4-71b0948c25f8-kolla-config\") pod \"memcached-0\" (UID: \"84bbdebb-43de-41d6-82d4-71b0948c25f8\") " pod="openstack/memcached-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.539099    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/84bbdebb-43de-41d6-82d4-71b0948c25f8-memcached-tls-certs\") pod \"memcached-0\" (UID: \"84bbdebb-43de-41d6-82d4-71b0948c25f8\") " pod="openstack/memcached-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.539124    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/84bbdebb-43de-41d6-82d4-71b0948c25f8-config-data\") pod \"memcached-0\" (UID: \"84bbdebb-43de-41d6-82d4-71b0948c25f8\") " pod="openstack/memcached-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.539179    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84bbdebb-43de-41d6-82d4-71b0948c25f8-combined-ca-bundle\") pod \"memcached-0\" (UID: \"84bbdebb-43de-41d6-82d4-71b0948c25f8\") " pod="openstack/memcached-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.540073    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/84bbdebb-43de-41d6-82d4-71b0948c25f8-config-data\") pod \"memcached-0\" (UID: \"84bbdebb-43de-41d6-82d4-71b0948c25f8\") " pod="openstack/memcached-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.540888    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/84bbdebb-43de-41d6-82d4-71b0948c25f8-kolla-config\") pod \"memcached-0\" (UID: \"84bbdebb-43de-41d6-82d4-71b0948c25f8\") " pod="openstack/memcached-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.542326    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84bbdebb-43de-41d6-82d4-71b0948c25f8-combined-ca-bundle\") pod \"memcached-0\" (UID: \"84bbdebb-43de-41d6-82d4-71b0948c25f8\") " pod="openstack/memcached-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.542923    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/84bbdebb-43de-41d6-82d4-71b0948c25f8-memcached-tls-certs\") pod \"memcached-0\" (UID: \"84bbdebb-43de-41d6-82d4-71b0948c25f8\") " pod="openstack/memcached-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.566015    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vggm6\" (UniqueName: \"kubernetes.io/projected/84bbdebb-43de-41d6-82d4-71b0948c25f8-kube-api-access-vggm6\") pod \"memcached-0\" (UID: \"84bbdebb-43de-41d6-82d4-71b0948c25f8\") " pod="openstack/memcached-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.569083    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.666263    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.880450    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.880496    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.880531    4730 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.881557    4730 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5a28eadd1ac2eb334876364a020c16296d471cf45645c126a154825ac93c80d5"} pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 15:58:12 crc kubenswrapper[4730]: I0320 15:58:12.881612    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" containerID="cri-o://5a28eadd1ac2eb334876364a020c16296d471cf45645c126a154825ac93c80d5" gracePeriod=600
Mar 20 15:58:13 crc kubenswrapper[4730]: I0320 15:58:13.733237    4730 generic.go:334] "Generic (PLEG): container finished" podID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerID="5a28eadd1ac2eb334876364a020c16296d471cf45645c126a154825ac93c80d5" exitCode=0
Mar 20 15:58:13 crc kubenswrapper[4730]: I0320 15:58:13.733282    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerDied","Data":"5a28eadd1ac2eb334876364a020c16296d471cf45645c126a154825ac93c80d5"}
Mar 20 15:58:13 crc kubenswrapper[4730]: I0320 15:58:13.733685    4730 scope.go:117] "RemoveContainer" containerID="4969adb306e949f48cbf48ac9e1452830c3458afd1750aa781060e2cc0952393"
Mar 20 15:58:14 crc kubenswrapper[4730]: I0320 15:58:14.891657    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 20 15:58:14 crc kubenswrapper[4730]: I0320 15:58:14.893158    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 20 15:58:14 crc kubenswrapper[4730]: I0320 15:58:14.896986    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-wwqfb"
Mar 20 15:58:14 crc kubenswrapper[4730]: I0320 15:58:14.914002    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 20 15:58:14 crc kubenswrapper[4730]: I0320 15:58:14.994755    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6bgz\" (UniqueName: \"kubernetes.io/projected/4938ac0e-1226-4f20-8f23-763b62b863c4-kube-api-access-d6bgz\") pod \"kube-state-metrics-0\" (UID: \"4938ac0e-1226-4f20-8f23-763b62b863c4\") " pod="openstack/kube-state-metrics-0"
Mar 20 15:58:15 crc kubenswrapper[4730]: I0320 15:58:15.095849    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6bgz\" (UniqueName: \"kubernetes.io/projected/4938ac0e-1226-4f20-8f23-763b62b863c4-kube-api-access-d6bgz\") pod \"kube-state-metrics-0\" (UID: \"4938ac0e-1226-4f20-8f23-763b62b863c4\") " pod="openstack/kube-state-metrics-0"
Mar 20 15:58:15 crc kubenswrapper[4730]: I0320 15:58:15.124428    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6bgz\" (UniqueName: \"kubernetes.io/projected/4938ac0e-1226-4f20-8f23-763b62b863c4-kube-api-access-d6bgz\") pod \"kube-state-metrics-0\" (UID: \"4938ac0e-1226-4f20-8f23-763b62b863c4\") " pod="openstack/kube-state-metrics-0"
Mar 20 15:58:15 crc kubenswrapper[4730]: I0320 15:58:15.212575    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.361040    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.363565    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.365610    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-2q5k6"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.365876    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.366008    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.366208    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.365890    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.365931    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.372579    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.373926    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.387044    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.514695    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fdd3845a-3723-438f-aa58-606451baed6c-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.514739    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fdd3845a-3723-438f-aa58-606451baed6c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.514924    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/fdd3845a-3723-438f-aa58-606451baed6c-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.514999    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\") pod \"prometheus-metric-storage-0\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.515131    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fdd3845a-3723-438f-aa58-606451baed6c-config\") pod \"prometheus-metric-storage-0\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.515160    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/fdd3845a-3723-438f-aa58-606451baed6c-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.515334    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxrks\" (UniqueName: \"kubernetes.io/projected/fdd3845a-3723-438f-aa58-606451baed6c-kube-api-access-pxrks\") pod \"prometheus-metric-storage-0\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.515496    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fdd3845a-3723-438f-aa58-606451baed6c-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.515612    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fdd3845a-3723-438f-aa58-606451baed6c-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.515696    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fdd3845a-3723-438f-aa58-606451baed6c-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.617293    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fdd3845a-3723-438f-aa58-606451baed6c-config\") pod \"prometheus-metric-storage-0\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.617332    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/fdd3845a-3723-438f-aa58-606451baed6c-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.617364    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxrks\" (UniqueName: \"kubernetes.io/projected/fdd3845a-3723-438f-aa58-606451baed6c-kube-api-access-pxrks\") pod \"prometheus-metric-storage-0\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.619013    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/fdd3845a-3723-438f-aa58-606451baed6c-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.622743    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fdd3845a-3723-438f-aa58-606451baed6c-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.623723    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fdd3845a-3723-438f-aa58-606451baed6c-config\") pod \"prometheus-metric-storage-0\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.625911    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fdd3845a-3723-438f-aa58-606451baed6c-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.630411    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fdd3845a-3723-438f-aa58-606451baed6c-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.630493    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fdd3845a-3723-438f-aa58-606451baed6c-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.630560    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fdd3845a-3723-438f-aa58-606451baed6c-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.630598    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fdd3845a-3723-438f-aa58-606451baed6c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.630645    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/fdd3845a-3723-438f-aa58-606451baed6c-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.630692    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\") pod \"prometheus-metric-storage-0\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.631362    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/fdd3845a-3723-438f-aa58-606451baed6c-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.632231    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fdd3845a-3723-438f-aa58-606451baed6c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.635212    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fdd3845a-3723-438f-aa58-606451baed6c-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.636072    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fdd3845a-3723-438f-aa58-606451baed6c-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.637664    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxrks\" (UniqueName: \"kubernetes.io/projected/fdd3845a-3723-438f-aa58-606451baed6c-kube-api-access-pxrks\") pod \"prometheus-metric-storage-0\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.639617    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fdd3845a-3723-438f-aa58-606451baed6c-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.640404    4730 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.640431    4730 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\") pod \"prometheus-metric-storage-0\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6c50c5c57c27fdb24da1fcbf3a7504c7bda45f4dc15a5678e0deb708aa433733/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.686637    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\") pod \"prometheus-metric-storage-0\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:16 crc kubenswrapper[4730]: I0320 15:58:16.723383    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.476716    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.478210    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.483569    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.483701    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.483751    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-klswc"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.483701    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.485043    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.501002    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.563122    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/caa1db28-afc0-4abc-aa80-84cccb3d8412-config\") pod \"ovsdbserver-nb-0\" (UID: \"caa1db28-afc0-4abc-aa80-84cccb3d8412\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.563187    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t79w\" (UniqueName: \"kubernetes.io/projected/caa1db28-afc0-4abc-aa80-84cccb3d8412-kube-api-access-6t79w\") pod \"ovsdbserver-nb-0\" (UID: \"caa1db28-afc0-4abc-aa80-84cccb3d8412\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.563220    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"caa1db28-afc0-4abc-aa80-84cccb3d8412\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.563240    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/caa1db28-afc0-4abc-aa80-84cccb3d8412-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"caa1db28-afc0-4abc-aa80-84cccb3d8412\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.563352    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/caa1db28-afc0-4abc-aa80-84cccb3d8412-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"caa1db28-afc0-4abc-aa80-84cccb3d8412\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.563458    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caa1db28-afc0-4abc-aa80-84cccb3d8412-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"caa1db28-afc0-4abc-aa80-84cccb3d8412\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.564632    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/caa1db28-afc0-4abc-aa80-84cccb3d8412-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"caa1db28-afc0-4abc-aa80-84cccb3d8412\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.564710    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/caa1db28-afc0-4abc-aa80-84cccb3d8412-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"caa1db28-afc0-4abc-aa80-84cccb3d8412\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.666171    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/caa1db28-afc0-4abc-aa80-84cccb3d8412-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"caa1db28-afc0-4abc-aa80-84cccb3d8412\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.666282    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/caa1db28-afc0-4abc-aa80-84cccb3d8412-config\") pod \"ovsdbserver-nb-0\" (UID: \"caa1db28-afc0-4abc-aa80-84cccb3d8412\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.666353    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t79w\" (UniqueName: \"kubernetes.io/projected/caa1db28-afc0-4abc-aa80-84cccb3d8412-kube-api-access-6t79w\") pod \"ovsdbserver-nb-0\" (UID: \"caa1db28-afc0-4abc-aa80-84cccb3d8412\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.666395    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"caa1db28-afc0-4abc-aa80-84cccb3d8412\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.666415    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/caa1db28-afc0-4abc-aa80-84cccb3d8412-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"caa1db28-afc0-4abc-aa80-84cccb3d8412\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.666516    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/caa1db28-afc0-4abc-aa80-84cccb3d8412-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"caa1db28-afc0-4abc-aa80-84cccb3d8412\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.666635    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caa1db28-afc0-4abc-aa80-84cccb3d8412-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"caa1db28-afc0-4abc-aa80-84cccb3d8412\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.666702    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/caa1db28-afc0-4abc-aa80-84cccb3d8412-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"caa1db28-afc0-4abc-aa80-84cccb3d8412\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.667181    4730 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"caa1db28-afc0-4abc-aa80-84cccb3d8412\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-nb-0"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.667509    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/caa1db28-afc0-4abc-aa80-84cccb3d8412-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"caa1db28-afc0-4abc-aa80-84cccb3d8412\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.668056    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/caa1db28-afc0-4abc-aa80-84cccb3d8412-config\") pod \"ovsdbserver-nb-0\" (UID: \"caa1db28-afc0-4abc-aa80-84cccb3d8412\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.668667    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/caa1db28-afc0-4abc-aa80-84cccb3d8412-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"caa1db28-afc0-4abc-aa80-84cccb3d8412\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.672103    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caa1db28-afc0-4abc-aa80-84cccb3d8412-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"caa1db28-afc0-4abc-aa80-84cccb3d8412\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.674831    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/caa1db28-afc0-4abc-aa80-84cccb3d8412-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"caa1db28-afc0-4abc-aa80-84cccb3d8412\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.679843    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/caa1db28-afc0-4abc-aa80-84cccb3d8412-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"caa1db28-afc0-4abc-aa80-84cccb3d8412\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.684155    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t79w\" (UniqueName: \"kubernetes.io/projected/caa1db28-afc0-4abc-aa80-84cccb3d8412-kube-api-access-6t79w\") pod \"ovsdbserver-nb-0\" (UID: \"caa1db28-afc0-4abc-aa80-84cccb3d8412\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.694039    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"caa1db28-afc0-4abc-aa80-84cccb3d8412\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.724977    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-gtrnp"]
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.726986    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gtrnp"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.729709    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-dbx4g"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.729792    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.730099    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.747611    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-cdd7f"]
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.749261    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-cdd7f"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.767237    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-gtrnp"]
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.784366    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-cdd7f"]
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.800378    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.870159    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/35efb2c2-6521-4f6f-a350-a4dc537ecaf8-scripts\") pod \"ovn-controller-ovs-cdd7f\" (UID: \"35efb2c2-6521-4f6f-a350-a4dc537ecaf8\") " pod="openstack/ovn-controller-ovs-cdd7f"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.870233    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/31651551-edb9-4793-a752-39fa60a85ee3-ovn-controller-tls-certs\") pod \"ovn-controller-gtrnp\" (UID: \"31651551-edb9-4793-a752-39fa60a85ee3\") " pod="openstack/ovn-controller-gtrnp"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.870291    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/35efb2c2-6521-4f6f-a350-a4dc537ecaf8-var-log\") pod \"ovn-controller-ovs-cdd7f\" (UID: \"35efb2c2-6521-4f6f-a350-a4dc537ecaf8\") " pod="openstack/ovn-controller-ovs-cdd7f"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.870316    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/31651551-edb9-4793-a752-39fa60a85ee3-var-log-ovn\") pod \"ovn-controller-gtrnp\" (UID: \"31651551-edb9-4793-a752-39fa60a85ee3\") " pod="openstack/ovn-controller-gtrnp"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.870346    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbsk7\" (UniqueName: \"kubernetes.io/projected/35efb2c2-6521-4f6f-a350-a4dc537ecaf8-kube-api-access-jbsk7\") pod \"ovn-controller-ovs-cdd7f\" (UID: \"35efb2c2-6521-4f6f-a350-a4dc537ecaf8\") " pod="openstack/ovn-controller-ovs-cdd7f"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.870576    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/35efb2c2-6521-4f6f-a350-a4dc537ecaf8-var-run\") pod \"ovn-controller-ovs-cdd7f\" (UID: \"35efb2c2-6521-4f6f-a350-a4dc537ecaf8\") " pod="openstack/ovn-controller-ovs-cdd7f"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.870632    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/35efb2c2-6521-4f6f-a350-a4dc537ecaf8-etc-ovs\") pod \"ovn-controller-ovs-cdd7f\" (UID: \"35efb2c2-6521-4f6f-a350-a4dc537ecaf8\") " pod="openstack/ovn-controller-ovs-cdd7f"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.870650    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31651551-edb9-4793-a752-39fa60a85ee3-combined-ca-bundle\") pod \"ovn-controller-gtrnp\" (UID: \"31651551-edb9-4793-a752-39fa60a85ee3\") " pod="openstack/ovn-controller-gtrnp"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.870689    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/35efb2c2-6521-4f6f-a350-a4dc537ecaf8-var-lib\") pod \"ovn-controller-ovs-cdd7f\" (UID: \"35efb2c2-6521-4f6f-a350-a4dc537ecaf8\") " pod="openstack/ovn-controller-ovs-cdd7f"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.870788    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/31651551-edb9-4793-a752-39fa60a85ee3-var-run\") pod \"ovn-controller-gtrnp\" (UID: \"31651551-edb9-4793-a752-39fa60a85ee3\") " pod="openstack/ovn-controller-gtrnp"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.870868    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/31651551-edb9-4793-a752-39fa60a85ee3-scripts\") pod \"ovn-controller-gtrnp\" (UID: \"31651551-edb9-4793-a752-39fa60a85ee3\") " pod="openstack/ovn-controller-gtrnp"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.871019    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfv4x\" (UniqueName: \"kubernetes.io/projected/31651551-edb9-4793-a752-39fa60a85ee3-kube-api-access-lfv4x\") pod \"ovn-controller-gtrnp\" (UID: \"31651551-edb9-4793-a752-39fa60a85ee3\") " pod="openstack/ovn-controller-gtrnp"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.871063    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/31651551-edb9-4793-a752-39fa60a85ee3-var-run-ovn\") pod \"ovn-controller-gtrnp\" (UID: \"31651551-edb9-4793-a752-39fa60a85ee3\") " pod="openstack/ovn-controller-gtrnp"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.972716    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfv4x\" (UniqueName: \"kubernetes.io/projected/31651551-edb9-4793-a752-39fa60a85ee3-kube-api-access-lfv4x\") pod \"ovn-controller-gtrnp\" (UID: \"31651551-edb9-4793-a752-39fa60a85ee3\") " pod="openstack/ovn-controller-gtrnp"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.972801    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/31651551-edb9-4793-a752-39fa60a85ee3-var-run-ovn\") pod \"ovn-controller-gtrnp\" (UID: \"31651551-edb9-4793-a752-39fa60a85ee3\") " pod="openstack/ovn-controller-gtrnp"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.972888    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/35efb2c2-6521-4f6f-a350-a4dc537ecaf8-scripts\") pod \"ovn-controller-ovs-cdd7f\" (UID: \"35efb2c2-6521-4f6f-a350-a4dc537ecaf8\") " pod="openstack/ovn-controller-ovs-cdd7f"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.972955    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/31651551-edb9-4793-a752-39fa60a85ee3-ovn-controller-tls-certs\") pod \"ovn-controller-gtrnp\" (UID: \"31651551-edb9-4793-a752-39fa60a85ee3\") " pod="openstack/ovn-controller-gtrnp"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.973031    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/35efb2c2-6521-4f6f-a350-a4dc537ecaf8-var-log\") pod \"ovn-controller-ovs-cdd7f\" (UID: \"35efb2c2-6521-4f6f-a350-a4dc537ecaf8\") " pod="openstack/ovn-controller-ovs-cdd7f"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.973060    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/31651551-edb9-4793-a752-39fa60a85ee3-var-log-ovn\") pod \"ovn-controller-gtrnp\" (UID: \"31651551-edb9-4793-a752-39fa60a85ee3\") " pod="openstack/ovn-controller-gtrnp"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.973134    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbsk7\" (UniqueName: \"kubernetes.io/projected/35efb2c2-6521-4f6f-a350-a4dc537ecaf8-kube-api-access-jbsk7\") pod \"ovn-controller-ovs-cdd7f\" (UID: \"35efb2c2-6521-4f6f-a350-a4dc537ecaf8\") " pod="openstack/ovn-controller-ovs-cdd7f"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.973233    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/35efb2c2-6521-4f6f-a350-a4dc537ecaf8-var-run\") pod \"ovn-controller-ovs-cdd7f\" (UID: \"35efb2c2-6521-4f6f-a350-a4dc537ecaf8\") " pod="openstack/ovn-controller-ovs-cdd7f"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.973311    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/35efb2c2-6521-4f6f-a350-a4dc537ecaf8-etc-ovs\") pod \"ovn-controller-ovs-cdd7f\" (UID: \"35efb2c2-6521-4f6f-a350-a4dc537ecaf8\") " pod="openstack/ovn-controller-ovs-cdd7f"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.973369    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31651551-edb9-4793-a752-39fa60a85ee3-combined-ca-bundle\") pod \"ovn-controller-gtrnp\" (UID: \"31651551-edb9-4793-a752-39fa60a85ee3\") " pod="openstack/ovn-controller-gtrnp"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.973409    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/35efb2c2-6521-4f6f-a350-a4dc537ecaf8-var-lib\") pod \"ovn-controller-ovs-cdd7f\" (UID: \"35efb2c2-6521-4f6f-a350-a4dc537ecaf8\") " pod="openstack/ovn-controller-ovs-cdd7f"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.973506    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/31651551-edb9-4793-a752-39fa60a85ee3-var-run\") pod \"ovn-controller-gtrnp\" (UID: \"31651551-edb9-4793-a752-39fa60a85ee3\") " pod="openstack/ovn-controller-gtrnp"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.973594    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/31651551-edb9-4793-a752-39fa60a85ee3-scripts\") pod \"ovn-controller-gtrnp\" (UID: \"31651551-edb9-4793-a752-39fa60a85ee3\") " pod="openstack/ovn-controller-gtrnp"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.974241    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/35efb2c2-6521-4f6f-a350-a4dc537ecaf8-var-log\") pod \"ovn-controller-ovs-cdd7f\" (UID: \"35efb2c2-6521-4f6f-a350-a4dc537ecaf8\") " pod="openstack/ovn-controller-ovs-cdd7f"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.974371    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/31651551-edb9-4793-a752-39fa60a85ee3-var-run\") pod \"ovn-controller-gtrnp\" (UID: \"31651551-edb9-4793-a752-39fa60a85ee3\") " pod="openstack/ovn-controller-gtrnp"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.974391    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/35efb2c2-6521-4f6f-a350-a4dc537ecaf8-var-run\") pod \"ovn-controller-ovs-cdd7f\" (UID: \"35efb2c2-6521-4f6f-a350-a4dc537ecaf8\") " pod="openstack/ovn-controller-ovs-cdd7f"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.974420    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/31651551-edb9-4793-a752-39fa60a85ee3-var-log-ovn\") pod \"ovn-controller-gtrnp\" (UID: \"31651551-edb9-4793-a752-39fa60a85ee3\") " pod="openstack/ovn-controller-gtrnp"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.974482    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/35efb2c2-6521-4f6f-a350-a4dc537ecaf8-var-lib\") pod \"ovn-controller-ovs-cdd7f\" (UID: \"35efb2c2-6521-4f6f-a350-a4dc537ecaf8\") " pod="openstack/ovn-controller-ovs-cdd7f"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.974484    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/35efb2c2-6521-4f6f-a350-a4dc537ecaf8-etc-ovs\") pod \"ovn-controller-ovs-cdd7f\" (UID: \"35efb2c2-6521-4f6f-a350-a4dc537ecaf8\") " pod="openstack/ovn-controller-ovs-cdd7f"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.974578    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/31651551-edb9-4793-a752-39fa60a85ee3-var-run-ovn\") pod \"ovn-controller-gtrnp\" (UID: \"31651551-edb9-4793-a752-39fa60a85ee3\") " pod="openstack/ovn-controller-gtrnp"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.977151    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31651551-edb9-4793-a752-39fa60a85ee3-combined-ca-bundle\") pod \"ovn-controller-gtrnp\" (UID: \"31651551-edb9-4793-a752-39fa60a85ee3\") " pod="openstack/ovn-controller-gtrnp"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.977572    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/35efb2c2-6521-4f6f-a350-a4dc537ecaf8-scripts\") pod \"ovn-controller-ovs-cdd7f\" (UID: \"35efb2c2-6521-4f6f-a350-a4dc537ecaf8\") " pod="openstack/ovn-controller-ovs-cdd7f"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.978194    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/31651551-edb9-4793-a752-39fa60a85ee3-ovn-controller-tls-certs\") pod \"ovn-controller-gtrnp\" (UID: \"31651551-edb9-4793-a752-39fa60a85ee3\") " pod="openstack/ovn-controller-gtrnp"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.979431    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/31651551-edb9-4793-a752-39fa60a85ee3-scripts\") pod \"ovn-controller-gtrnp\" (UID: \"31651551-edb9-4793-a752-39fa60a85ee3\") " pod="openstack/ovn-controller-gtrnp"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.989108    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbsk7\" (UniqueName: \"kubernetes.io/projected/35efb2c2-6521-4f6f-a350-a4dc537ecaf8-kube-api-access-jbsk7\") pod \"ovn-controller-ovs-cdd7f\" (UID: \"35efb2c2-6521-4f6f-a350-a4dc537ecaf8\") " pod="openstack/ovn-controller-ovs-cdd7f"
Mar 20 15:58:18 crc kubenswrapper[4730]: I0320 15:58:18.989860    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfv4x\" (UniqueName: \"kubernetes.io/projected/31651551-edb9-4793-a752-39fa60a85ee3-kube-api-access-lfv4x\") pod \"ovn-controller-gtrnp\" (UID: \"31651551-edb9-4793-a752-39fa60a85ee3\") " pod="openstack/ovn-controller-gtrnp"
Mar 20 15:58:19 crc kubenswrapper[4730]: I0320 15:58:19.099660    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gtrnp"
Mar 20 15:58:19 crc kubenswrapper[4730]: I0320 15:58:19.113096    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-cdd7f"
Mar 20 15:58:21 crc kubenswrapper[4730]: E0320 15:58:21.466754    4730 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.147:5001/podified-master-centos10/openstack-neutron-server:watcher_latest"
Mar 20 15:58:21 crc kubenswrapper[4730]: E0320 15:58:21.467063    4730 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.147:5001/podified-master-centos10/openstack-neutron-server:watcher_latest"
Mar 20 15:58:21 crc kubenswrapper[4730]: E0320 15:58:21.467185    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.147:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hxh4d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-bf56b5889-hwkdn_openstack(ae05e358-cb9f-4772-a644-8ec5131415eb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 15:58:21 crc kubenswrapper[4730]: E0320 15:58:21.469064    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-bf56b5889-hwkdn" podUID="ae05e358-cb9f-4772-a644-8ec5131415eb"
Mar 20 15:58:21 crc kubenswrapper[4730]: E0320 15:58:21.489858    4730 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.147:5001/podified-master-centos10/openstack-neutron-server:watcher_latest"
Mar 20 15:58:21 crc kubenswrapper[4730]: E0320 15:58:21.489916    4730 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.147:5001/podified-master-centos10/openstack-neutron-server:watcher_latest"
Mar 20 15:58:21 crc kubenswrapper[4730]: E0320 15:58:21.490043    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.147:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rqsxv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-784b55c5d9-5wk8n_openstack(dcd9cd10-c13e-446b-9dad-8b30f04de37e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 15:58:21 crc kubenswrapper[4730]: E0320 15:58:21.491206    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-784b55c5d9-5wk8n" podUID="dcd9cd10-c13e-446b-9dad-8b30f04de37e"
Mar 20 15:58:21 crc kubenswrapper[4730]: I0320 15:58:21.875784    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 20 15:58:21 crc kubenswrapper[4730]: I0320 15:58:21.877882    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Mar 20 15:58:21 crc kubenswrapper[4730]: I0320 15:58:21.881059    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-dm82v"
Mar 20 15:58:21 crc kubenswrapper[4730]: I0320 15:58:21.881652    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Mar 20 15:58:21 crc kubenswrapper[4730]: I0320 15:58:21.881759    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Mar 20 15:58:21 crc kubenswrapper[4730]: I0320 15:58:21.881848    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Mar 20 15:58:21 crc kubenswrapper[4730]: I0320 15:58:21.882808    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.026059    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ba8c36f-1882-4bb3-bcb5-b3518ce35553-config\") pod \"ovsdbserver-sb-0\" (UID: \"1ba8c36f-1882-4bb3-bcb5-b3518ce35553\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.026115    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ba8c36f-1882-4bb3-bcb5-b3518ce35553-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1ba8c36f-1882-4bb3-bcb5-b3518ce35553\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.026214    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1ba8c36f-1882-4bb3-bcb5-b3518ce35553-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1ba8c36f-1882-4bb3-bcb5-b3518ce35553\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.026717    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ba8c36f-1882-4bb3-bcb5-b3518ce35553-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1ba8c36f-1882-4bb3-bcb5-b3518ce35553\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.026770    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjq4h\" (UniqueName: \"kubernetes.io/projected/1ba8c36f-1882-4bb3-bcb5-b3518ce35553-kube-api-access-rjq4h\") pod \"ovsdbserver-sb-0\" (UID: \"1ba8c36f-1882-4bb3-bcb5-b3518ce35553\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.026802    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ba8c36f-1882-4bb3-bcb5-b3518ce35553-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1ba8c36f-1882-4bb3-bcb5-b3518ce35553\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.026833    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ba8c36f-1882-4bb3-bcb5-b3518ce35553-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1ba8c36f-1882-4bb3-bcb5-b3518ce35553\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.026891    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1ba8c36f-1882-4bb3-bcb5-b3518ce35553\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.128301    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ba8c36f-1882-4bb3-bcb5-b3518ce35553-config\") pod \"ovsdbserver-sb-0\" (UID: \"1ba8c36f-1882-4bb3-bcb5-b3518ce35553\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.128374    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ba8c36f-1882-4bb3-bcb5-b3518ce35553-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1ba8c36f-1882-4bb3-bcb5-b3518ce35553\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.128423    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1ba8c36f-1882-4bb3-bcb5-b3518ce35553-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1ba8c36f-1882-4bb3-bcb5-b3518ce35553\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.128445    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ba8c36f-1882-4bb3-bcb5-b3518ce35553-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1ba8c36f-1882-4bb3-bcb5-b3518ce35553\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.128494    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjq4h\" (UniqueName: \"kubernetes.io/projected/1ba8c36f-1882-4bb3-bcb5-b3518ce35553-kube-api-access-rjq4h\") pod \"ovsdbserver-sb-0\" (UID: \"1ba8c36f-1882-4bb3-bcb5-b3518ce35553\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.128532    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ba8c36f-1882-4bb3-bcb5-b3518ce35553-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1ba8c36f-1882-4bb3-bcb5-b3518ce35553\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.128569    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ba8c36f-1882-4bb3-bcb5-b3518ce35553-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1ba8c36f-1882-4bb3-bcb5-b3518ce35553\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.128600    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1ba8c36f-1882-4bb3-bcb5-b3518ce35553\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.128943    4730 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1ba8c36f-1882-4bb3-bcb5-b3518ce35553\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-sb-0"
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.131363    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ba8c36f-1882-4bb3-bcb5-b3518ce35553-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1ba8c36f-1882-4bb3-bcb5-b3518ce35553\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.132341    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ba8c36f-1882-4bb3-bcb5-b3518ce35553-config\") pod \"ovsdbserver-sb-0\" (UID: \"1ba8c36f-1882-4bb3-bcb5-b3518ce35553\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.133267    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1ba8c36f-1882-4bb3-bcb5-b3518ce35553-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1ba8c36f-1882-4bb3-bcb5-b3518ce35553\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.152963    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ba8c36f-1882-4bb3-bcb5-b3518ce35553-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1ba8c36f-1882-4bb3-bcb5-b3518ce35553\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.153559    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ba8c36f-1882-4bb3-bcb5-b3518ce35553-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1ba8c36f-1882-4bb3-bcb5-b3518ce35553\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.154092    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ba8c36f-1882-4bb3-bcb5-b3518ce35553-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1ba8c36f-1882-4bb3-bcb5-b3518ce35553\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.157196    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjq4h\" (UniqueName: \"kubernetes.io/projected/1ba8c36f-1882-4bb3-bcb5-b3518ce35553-kube-api-access-rjq4h\") pod \"ovsdbserver-sb-0\" (UID: \"1ba8c36f-1882-4bb3-bcb5-b3518ce35553\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.163428    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1ba8c36f-1882-4bb3-bcb5-b3518ce35553\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.190518    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74bcc47849-2r2xb"]
Mar 20 15:58:22 crc kubenswrapper[4730]: W0320 15:58:22.197088    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6abf778f_200f_4d48_97b6_08a638b4efa2.slice/crio-2fc583ef7f9e5c815bdce66332204108577039d4d2c809ecc6f6da695232596d WatchSource:0}: Error finding container 2fc583ef7f9e5c815bdce66332204108577039d4d2c809ecc6f6da695232596d: Status 404 returned error can't find the container with id 2fc583ef7f9e5c815bdce66332204108577039d4d2c809ecc6f6da695232596d
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.198406    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 20 15:58:22 crc kubenswrapper[4730]: W0320 15:58:22.202561    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8043f69c_832c_4afa_a9b9_211507664805.slice/crio-8412f9f53f9d92b76fea1c48228bf1b9d922d18161acc37ea8504ee75f4ce219 WatchSource:0}: Error finding container 8412f9f53f9d92b76fea1c48228bf1b9d922d18161acc37ea8504ee75f4ce219: Status 404 returned error can't find the container with id 8412f9f53f9d92b76fea1c48228bf1b9d922d18161acc37ea8504ee75f4ce219
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.205127    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.210096    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.367690    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bf56b5889-hwkdn"
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.373657    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-784b55c5d9-5wk8n"
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.435052    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae05e358-cb9f-4772-a644-8ec5131415eb-dns-svc\") pod \"ae05e358-cb9f-4772-a644-8ec5131415eb\" (UID: \"ae05e358-cb9f-4772-a644-8ec5131415eb\") "
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.435448    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxh4d\" (UniqueName: \"kubernetes.io/projected/ae05e358-cb9f-4772-a644-8ec5131415eb-kube-api-access-hxh4d\") pod \"ae05e358-cb9f-4772-a644-8ec5131415eb\" (UID: \"ae05e358-cb9f-4772-a644-8ec5131415eb\") "
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.435508    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae05e358-cb9f-4772-a644-8ec5131415eb-config\") pod \"ae05e358-cb9f-4772-a644-8ec5131415eb\" (UID: \"ae05e358-cb9f-4772-a644-8ec5131415eb\") "
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.435655    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae05e358-cb9f-4772-a644-8ec5131415eb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ae05e358-cb9f-4772-a644-8ec5131415eb" (UID: "ae05e358-cb9f-4772-a644-8ec5131415eb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.435866    4730 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae05e358-cb9f-4772-a644-8ec5131415eb-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.436179    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae05e358-cb9f-4772-a644-8ec5131415eb-config" (OuterVolumeSpecName: "config") pod "ae05e358-cb9f-4772-a644-8ec5131415eb" (UID: "ae05e358-cb9f-4772-a644-8ec5131415eb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.441630    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae05e358-cb9f-4772-a644-8ec5131415eb-kube-api-access-hxh4d" (OuterVolumeSpecName: "kube-api-access-hxh4d") pod "ae05e358-cb9f-4772-a644-8ec5131415eb" (UID: "ae05e358-cb9f-4772-a644-8ec5131415eb"). InnerVolumeSpecName "kube-api-access-hxh4d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.537367    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqsxv\" (UniqueName: \"kubernetes.io/projected/dcd9cd10-c13e-446b-9dad-8b30f04de37e-kube-api-access-rqsxv\") pod \"dcd9cd10-c13e-446b-9dad-8b30f04de37e\" (UID: \"dcd9cd10-c13e-446b-9dad-8b30f04de37e\") "
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.537498    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcd9cd10-c13e-446b-9dad-8b30f04de37e-config\") pod \"dcd9cd10-c13e-446b-9dad-8b30f04de37e\" (UID: \"dcd9cd10-c13e-446b-9dad-8b30f04de37e\") "
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.537920    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxh4d\" (UniqueName: \"kubernetes.io/projected/ae05e358-cb9f-4772-a644-8ec5131415eb-kube-api-access-hxh4d\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.537933    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae05e358-cb9f-4772-a644-8ec5131415eb-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.541395    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcd9cd10-c13e-446b-9dad-8b30f04de37e-config" (OuterVolumeSpecName: "config") pod "dcd9cd10-c13e-446b-9dad-8b30f04de37e" (UID: "dcd9cd10-c13e-446b-9dad-8b30f04de37e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.546545    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcd9cd10-c13e-446b-9dad-8b30f04de37e-kube-api-access-rqsxv" (OuterVolumeSpecName: "kube-api-access-rqsxv") pod "dcd9cd10-c13e-446b-9dad-8b30f04de37e" (UID: "dcd9cd10-c13e-446b-9dad-8b30f04de37e"). InnerVolumeSpecName "kube-api-access-rqsxv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.563194    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68d64f5f8f-ncpth"]
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.651483    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqsxv\" (UniqueName: \"kubernetes.io/projected/dcd9cd10-c13e-446b-9dad-8b30f04de37e-kube-api-access-rqsxv\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.651516    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcd9cd10-c13e-446b-9dad-8b30f04de37e-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.726505    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/notifications-rabbitmq-server-0"]
Mar 20 15:58:22 crc kubenswrapper[4730]: W0320 15:58:22.744850    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf9ca02d_e20f_4f55_ba14_92b91812afb6.slice/crio-e318018f5b6c6fc0655140eb9d79b013847e5addc578d01fc6f2fbfd40ddada2 WatchSource:0}: Error finding container e318018f5b6c6fc0655140eb9d79b013847e5addc578d01fc6f2fbfd40ddada2: Status 404 returned error can't find the container with id e318018f5b6c6fc0655140eb9d79b013847e5addc578d01fc6f2fbfd40ddada2
Mar 20 15:58:22 crc kubenswrapper[4730]: W0320 15:58:22.749728    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27dbbb52_2bd1_4e24_b621_128e7c880a2b.slice/crio-a15e1598ce7b71e13d66b76d6a152b790714647ed13bd032c186f428dbc12d19 WatchSource:0}: Error finding container a15e1598ce7b71e13d66b76d6a152b790714647ed13bd032c186f428dbc12d19: Status 404 returned error can't find the container with id a15e1598ce7b71e13d66b76d6a152b790714647ed13bd032c186f428dbc12d19
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.751341    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7847d45595-fnchx"]
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.766884    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.779821    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.868158    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/notifications-rabbitmq-server-0" event={"ID":"df9ca02d-e20f-4f55-ba14-92b91812afb6","Type":"ContainerStarted","Data":"e318018f5b6c6fc0655140eb9d79b013847e5addc578d01fc6f2fbfd40ddada2"}
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.869543    4730 generic.go:334] "Generic (PLEG): container finished" podID="a5e88dae-c3fd-456c-92c6-3bc143b5a399" containerID="9eec9013a24b2740a1eda4c33bad2a0fe15b131b43355b27b55b42be661249e4" exitCode=0
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.869584    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74bcc47849-2r2xb" event={"ID":"a5e88dae-c3fd-456c-92c6-3bc143b5a399","Type":"ContainerDied","Data":"9eec9013a24b2740a1eda4c33bad2a0fe15b131b43355b27b55b42be661249e4"}
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.869599    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74bcc47849-2r2xb" event={"ID":"a5e88dae-c3fd-456c-92c6-3bc143b5a399","Type":"ContainerStarted","Data":"179fd0dd8f56e799ace1a1c345f43d1901fddb9fcf8b79a809a5d214fa5edf4a"}
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.872978    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784b55c5d9-5wk8n" event={"ID":"dcd9cd10-c13e-446b-9dad-8b30f04de37e","Type":"ContainerDied","Data":"cc74e55810f8f5b64f9744e711295d16d00bc26b41f45c20f60b7913df187ae8"}
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.873022    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-784b55c5d9-5wk8n"
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.874008    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3","Type":"ContainerStarted","Data":"ddda948591892d2a138c1c22bc7cf5e93ad382615cc9ff618b810cf784bacaf9"}
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.875013    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"84bbdebb-43de-41d6-82d4-71b0948c25f8","Type":"ContainerStarted","Data":"54a8e4515bfc47a9d24eedccdb8072950e20a0d4b713e2481491bcc7d18538fc"}
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.875854    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8043f69c-832c-4afa-a9b9-211507664805","Type":"ContainerStarted","Data":"8412f9f53f9d92b76fea1c48228bf1b9d922d18161acc37ea8504ee75f4ce219"}
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.876934    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6abf778f-200f-4d48-97b6-08a638b4efa2","Type":"ContainerStarted","Data":"2fc583ef7f9e5c815bdce66332204108577039d4d2c809ecc6f6da695232596d"}
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.880021    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerStarted","Data":"2aab75ddb2e10e731a7d582f69fae06a40e7e5a6270ff47496bdac5fb9c6ebfd"}
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.881981    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf56b5889-hwkdn" event={"ID":"ae05e358-cb9f-4772-a644-8ec5131415eb","Type":"ContainerDied","Data":"0f15db04a290faf8bce634d61170874353004570e3569dc452521a6a7ac01ec4"}
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.882083    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bf56b5889-hwkdn"
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.895158    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7847d45595-fnchx" event={"ID":"27dbbb52-2bd1-4e24-b621-128e7c880a2b","Type":"ContainerStarted","Data":"a15e1598ce7b71e13d66b76d6a152b790714647ed13bd032c186f428dbc12d19"}
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.896821    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d64f5f8f-ncpth" event={"ID":"7a5c1062-c366-4407-a395-cc3ad80ed296","Type":"ContainerStarted","Data":"180b2223e3765d491bb6fc4c8ff950162b5ee560c2cb4a3e29faf9b39fae29c7"}
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.896858    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d64f5f8f-ncpth" event={"ID":"7a5c1062-c366-4407-a395-cc3ad80ed296","Type":"ContainerStarted","Data":"8379bbe1dd70d1272baf32264af24e932d0f78e8da002ecb096ba75a065d1af3"}
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.976178    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-784b55c5d9-5wk8n"]
Mar 20 15:58:22 crc kubenswrapper[4730]: I0320 15:58:22.984956    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-784b55c5d9-5wk8n"]
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.003109    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bf56b5889-hwkdn"]
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.029124    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bf56b5889-hwkdn"]
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.031067    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.037040    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.046989    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-gtrnp"]
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.058670    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.150327    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-cdd7f"]
Mar 20 15:58:23 crc kubenswrapper[4730]: W0320 15:58:23.188868    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35efb2c2_6521_4f6f_a350_a4dc537ecaf8.slice/crio-fd022285e838760cd95fe62d2e673a3f95b956772711184e1cf9b46740dcff5b WatchSource:0}: Error finding container fd022285e838760cd95fe62d2e673a3f95b956772711184e1cf9b46740dcff5b: Status 404 returned error can't find the container with id fd022285e838760cd95fe62d2e673a3f95b956772711184e1cf9b46740dcff5b
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.245635    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.300197    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68d64f5f8f-ncpth"
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.464187    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrdjk\" (UniqueName: \"kubernetes.io/projected/7a5c1062-c366-4407-a395-cc3ad80ed296-kube-api-access-mrdjk\") pod \"7a5c1062-c366-4407-a395-cc3ad80ed296\" (UID: \"7a5c1062-c366-4407-a395-cc3ad80ed296\") "
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.464311    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a5c1062-c366-4407-a395-cc3ad80ed296-dns-svc\") pod \"7a5c1062-c366-4407-a395-cc3ad80ed296\" (UID: \"7a5c1062-c366-4407-a395-cc3ad80ed296\") "
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.464366    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a5c1062-c366-4407-a395-cc3ad80ed296-config\") pod \"7a5c1062-c366-4407-a395-cc3ad80ed296\" (UID: \"7a5c1062-c366-4407-a395-cc3ad80ed296\") "
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.470810    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a5c1062-c366-4407-a395-cc3ad80ed296-kube-api-access-mrdjk" (OuterVolumeSpecName: "kube-api-access-mrdjk") pod "7a5c1062-c366-4407-a395-cc3ad80ed296" (UID: "7a5c1062-c366-4407-a395-cc3ad80ed296"). InnerVolumeSpecName "kube-api-access-mrdjk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.486080    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a5c1062-c366-4407-a395-cc3ad80ed296-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7a5c1062-c366-4407-a395-cc3ad80ed296" (UID: "7a5c1062-c366-4407-a395-cc3ad80ed296"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.495319    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a5c1062-c366-4407-a395-cc3ad80ed296-config" (OuterVolumeSpecName: "config") pod "7a5c1062-c366-4407-a395-cc3ad80ed296" (UID: "7a5c1062-c366-4407-a395-cc3ad80ed296"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.548343    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae05e358-cb9f-4772-a644-8ec5131415eb" path="/var/lib/kubelet/pods/ae05e358-cb9f-4772-a644-8ec5131415eb/volumes"
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.548799    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcd9cd10-c13e-446b-9dad-8b30f04de37e" path="/var/lib/kubelet/pods/dcd9cd10-c13e-446b-9dad-8b30f04de37e/volumes"
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.567572    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrdjk\" (UniqueName: \"kubernetes.io/projected/7a5c1062-c366-4407-a395-cc3ad80ed296-kube-api-access-mrdjk\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.567600    4730 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a5c1062-c366-4407-a395-cc3ad80ed296-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.567609    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a5c1062-c366-4407-a395-cc3ad80ed296-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.908964    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gtrnp" event={"ID":"31651551-edb9-4793-a752-39fa60a85ee3","Type":"ContainerStarted","Data":"1d157ee12a3f603de7bc92a403912ba9effcb82755fea58f5d2d2033bc139fb0"}
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.910435    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"899bd9ae-9354-4e70-ad37-b438a5a33a24","Type":"ContainerStarted","Data":"88944d155607449a1db05424d7fa831eefbbe52c1129e2a15b4531a9f20134db"}
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.912962    4730 generic.go:334] "Generic (PLEG): container finished" podID="27dbbb52-2bd1-4e24-b621-128e7c880a2b" containerID="809194a2c64396f3bfaa32147cdb80cee474a4897b6ae0caac1fba9c383996f4" exitCode=0
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.913173    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7847d45595-fnchx" event={"ID":"27dbbb52-2bd1-4e24-b621-128e7c880a2b","Type":"ContainerDied","Data":"809194a2c64396f3bfaa32147cdb80cee474a4897b6ae0caac1fba9c383996f4"}
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.916102    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4938ac0e-1226-4f20-8f23-763b62b863c4","Type":"ContainerStarted","Data":"7d7a0da71b781876c622a4a6bd339b425ce64e7ebb40d9b345bbc6e39de452eb"}
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.919616    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74bcc47849-2r2xb" event={"ID":"a5e88dae-c3fd-456c-92c6-3bc143b5a399","Type":"ContainerStarted","Data":"15b7107144e36c5c008efc65f73b4c3e5a1b51a4b8a473402aba592b6c7329e1"}
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.920001    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74bcc47849-2r2xb"
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.923188    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1ba8c36f-1882-4bb3-bcb5-b3518ce35553","Type":"ContainerStarted","Data":"806def77f051b27a1eb53a83708824c91f1e36524e01a36d9fa7f6e9f8168f4e"}
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.924905    4730 generic.go:334] "Generic (PLEG): container finished" podID="7a5c1062-c366-4407-a395-cc3ad80ed296" containerID="180b2223e3765d491bb6fc4c8ff950162b5ee560c2cb4a3e29faf9b39fae29c7" exitCode=0
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.925051    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d64f5f8f-ncpth" event={"ID":"7a5c1062-c366-4407-a395-cc3ad80ed296","Type":"ContainerDied","Data":"180b2223e3765d491bb6fc4c8ff950162b5ee560c2cb4a3e29faf9b39fae29c7"}
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.925086    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d64f5f8f-ncpth" event={"ID":"7a5c1062-c366-4407-a395-cc3ad80ed296","Type":"ContainerDied","Data":"8379bbe1dd70d1272baf32264af24e932d0f78e8da002ecb096ba75a065d1af3"}
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.925151    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68d64f5f8f-ncpth"
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.925165    4730 scope.go:117] "RemoveContainer" containerID="180b2223e3765d491bb6fc4c8ff950162b5ee560c2cb4a3e29faf9b39fae29c7"
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.926981    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cdd7f" event={"ID":"35efb2c2-6521-4f6f-a350-a4dc537ecaf8","Type":"ContainerStarted","Data":"fd022285e838760cd95fe62d2e673a3f95b956772711184e1cf9b46740dcff5b"}
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.937086    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fdd3845a-3723-438f-aa58-606451baed6c","Type":"ContainerStarted","Data":"1e180be81f67edf4314b8f53e9215c19609d113f31f1dc11bb8e1b622d9ae961"}
Mar 20 15:58:23 crc kubenswrapper[4730]: I0320 15:58:23.963718    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74bcc47849-2r2xb" podStartSLOduration=15.844895156 podStartE2EDuration="15.963696031s" podCreationTimestamp="2026-03-20 15:58:08 +0000 UTC" firstStartedPulling="2026-03-20 15:58:22.194025212 +0000 UTC m=+1161.407396581" lastFinishedPulling="2026-03-20 15:58:22.312826087 +0000 UTC m=+1161.526197456" observedRunningTime="2026-03-20 15:58:23.957601628 +0000 UTC m=+1163.170972997" watchObservedRunningTime="2026-03-20 15:58:23.963696031 +0000 UTC m=+1163.177067400"
Mar 20 15:58:24 crc kubenswrapper[4730]: I0320 15:58:24.032620    4730 scope.go:117] "RemoveContainer" containerID="180b2223e3765d491bb6fc4c8ff950162b5ee560c2cb4a3e29faf9b39fae29c7"
Mar 20 15:58:24 crc kubenswrapper[4730]: E0320 15:58:24.037060    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"180b2223e3765d491bb6fc4c8ff950162b5ee560c2cb4a3e29faf9b39fae29c7\": container with ID starting with 180b2223e3765d491bb6fc4c8ff950162b5ee560c2cb4a3e29faf9b39fae29c7 not found: ID does not exist" containerID="180b2223e3765d491bb6fc4c8ff950162b5ee560c2cb4a3e29faf9b39fae29c7"
Mar 20 15:58:24 crc kubenswrapper[4730]: I0320 15:58:24.037100    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"180b2223e3765d491bb6fc4c8ff950162b5ee560c2cb4a3e29faf9b39fae29c7"} err="failed to get container status \"180b2223e3765d491bb6fc4c8ff950162b5ee560c2cb4a3e29faf9b39fae29c7\": rpc error: code = NotFound desc = could not find container \"180b2223e3765d491bb6fc4c8ff950162b5ee560c2cb4a3e29faf9b39fae29c7\": container with ID starting with 180b2223e3765d491bb6fc4c8ff950162b5ee560c2cb4a3e29faf9b39fae29c7 not found: ID does not exist"
Mar 20 15:58:24 crc kubenswrapper[4730]: I0320 15:58:24.042383    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68d64f5f8f-ncpth"]
Mar 20 15:58:24 crc kubenswrapper[4730]: I0320 15:58:24.049563    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68d64f5f8f-ncpth"]
Mar 20 15:58:24 crc kubenswrapper[4730]: E0320 15:58:24.140374    4730 log.go:32] "CreateContainer in sandbox from runtime service failed" err=<
Mar 20 15:58:24 crc kubenswrapper[4730]:         rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/27dbbb52-2bd1-4e24-b621-128e7c880a2b/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory
Mar 20 15:58:24 crc kubenswrapper[4730]:  > podSandboxID="a15e1598ce7b71e13d66b76d6a152b790714647ed13bd032c186f428dbc12d19"
Mar 20 15:58:24 crc kubenswrapper[4730]: E0320 15:58:24.140645    4730 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 15:58:24 crc kubenswrapper[4730]:         container &Container{Name:dnsmasq-dns,Image:38.102.83.147:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n658h5c5h88h68dhb6h57dhd4h697hb8h8fh74hb7h54fh54dh548h7h55dhb8h9fh55dh688h5bbh5d5h675h669hb7h67hbbhffh668h5c7hc5q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dgkz6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7847d45595-fnchx_openstack(27dbbb52-2bd1-4e24-b621-128e7c880a2b): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/27dbbb52-2bd1-4e24-b621-128e7c880a2b/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory
Mar 20 15:58:24 crc kubenswrapper[4730]:  > logger="UnhandledError"
Mar 20 15:58:24 crc kubenswrapper[4730]: E0320 15:58:24.141903    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/27dbbb52-2bd1-4e24-b621-128e7c880a2b/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-7847d45595-fnchx" podUID="27dbbb52-2bd1-4e24-b621-128e7c880a2b"
Mar 20 15:58:24 crc kubenswrapper[4730]: I0320 15:58:24.189606    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 20 15:58:24 crc kubenswrapper[4730]: W0320 15:58:24.194279    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcaa1db28_afc0_4abc_aa80_84cccb3d8412.slice/crio-6feb9c0bdd6c4d26ab57540b5fa6559ab43da600a84557c93e7717682e410ad7 WatchSource:0}: Error finding container 6feb9c0bdd6c4d26ab57540b5fa6559ab43da600a84557c93e7717682e410ad7: Status 404 returned error can't find the container with id 6feb9c0bdd6c4d26ab57540b5fa6559ab43da600a84557c93e7717682e410ad7
Mar 20 15:58:24 crc kubenswrapper[4730]: I0320 15:58:24.945270    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"caa1db28-afc0-4abc-aa80-84cccb3d8412","Type":"ContainerStarted","Data":"6feb9c0bdd6c4d26ab57540b5fa6559ab43da600a84557c93e7717682e410ad7"}
Mar 20 15:58:25 crc kubenswrapper[4730]: I0320 15:58:25.545046    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a5c1062-c366-4407-a395-cc3ad80ed296" path="/var/lib/kubelet/pods/7a5c1062-c366-4407-a395-cc3ad80ed296/volumes"
Mar 20 15:58:25 crc kubenswrapper[4730]: I0320 15:58:25.953058    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7847d45595-fnchx" event={"ID":"27dbbb52-2bd1-4e24-b621-128e7c880a2b","Type":"ContainerStarted","Data":"10de99a85cf21dfd026eabd639047a86231ef84f5643c16ab2ab663cf70834f5"}
Mar 20 15:58:25 crc kubenswrapper[4730]: I0320 15:58:25.953585    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7847d45595-fnchx"
Mar 20 15:58:25 crc kubenswrapper[4730]: I0320 15:58:25.972448    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7847d45595-fnchx" podStartSLOduration=18.972428983 podStartE2EDuration="18.972428983s" podCreationTimestamp="2026-03-20 15:58:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:58:25.968412789 +0000 UTC m=+1165.181784168" watchObservedRunningTime="2026-03-20 15:58:25.972428983 +0000 UTC m=+1165.185800352"
Mar 20 15:58:28 crc kubenswrapper[4730]: I0320 15:58:28.594291    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74bcc47849-2r2xb"
Mar 20 15:58:28 crc kubenswrapper[4730]: I0320 15:58:28.655640    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7847d45595-fnchx"]
Mar 20 15:58:28 crc kubenswrapper[4730]: I0320 15:58:28.656185    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7847d45595-fnchx" podUID="27dbbb52-2bd1-4e24-b621-128e7c880a2b" containerName="dnsmasq-dns" containerID="cri-o://10de99a85cf21dfd026eabd639047a86231ef84f5643c16ab2ab663cf70834f5" gracePeriod=10
Mar 20 15:58:28 crc kubenswrapper[4730]: I0320 15:58:28.973569    4730 generic.go:334] "Generic (PLEG): container finished" podID="27dbbb52-2bd1-4e24-b621-128e7c880a2b" containerID="10de99a85cf21dfd026eabd639047a86231ef84f5643c16ab2ab663cf70834f5" exitCode=0
Mar 20 15:58:28 crc kubenswrapper[4730]: I0320 15:58:28.973639    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7847d45595-fnchx" event={"ID":"27dbbb52-2bd1-4e24-b621-128e7c880a2b","Type":"ContainerDied","Data":"10de99a85cf21dfd026eabd639047a86231ef84f5643c16ab2ab663cf70834f5"}
Mar 20 15:58:33 crc kubenswrapper[4730]: I0320 15:58:33.338718    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7847d45595-fnchx" podUID="27dbbb52-2bd1-4e24-b621-128e7c880a2b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.107:5353: connect: connection refused"
Mar 20 15:58:34 crc kubenswrapper[4730]: E0320 15:58:34.986185    4730 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.147:5001/podified-master-centos10/openstack-memcached:watcher_latest"
Mar 20 15:58:34 crc kubenswrapper[4730]: E0320 15:58:34.986245    4730 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.147:5001/podified-master-centos10/openstack-memcached:watcher_latest"
Mar 20 15:58:34 crc kubenswrapper[4730]: E0320 15:58:34.986439    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:38.102.83.147:5001/podified-master-centos10/openstack-memcached:watcher_latest,Command:[/usr/bin/dumb-init -- /usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n687h645h66ch5bch5f7hf6h54fh689h67bhcchd5h688h5d9h5c4h5fh6bhd9h545hb8hd4h545hcfh5f4h649hd5h644h57dh78h5fch695h684h684q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vggm6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(84bbdebb-43de-41d6-82d4-71b0948c25f8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 15:58:34 crc kubenswrapper[4730]: E0320 15:58:34.987682    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="84bbdebb-43de-41d6-82d4-71b0948c25f8"
Mar 20 15:58:35 crc kubenswrapper[4730]: E0320 15:58:35.030685    4730 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.147:5001/podified-master-centos10/openstack-ovn-base:watcher_latest"
Mar 20 15:58:35 crc kubenswrapper[4730]: E0320 15:58:35.030798    4730 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.147:5001/podified-master-centos10/openstack-ovn-base:watcher_latest"
Mar 20 15:58:35 crc kubenswrapper[4730]: E0320 15:58:35.031025    4730 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:ovsdb-server-init,Image:38.102.83.147:5001/podified-master-centos10/openstack-ovn-base:watcher_latest,Command:[/usr/local/bin/container-scripts/init-ovsdb-server.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n546h568h5dch64fh59dh64h594h59fh85h578hb7h599h66fhd5h5d4h5fh57fhbh5bch68bh5b8h5ffhcch9fh548h594h578h65h594h558hf4h5c8q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-ovs,ReadOnly:false,MountPath:/etc/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log,ReadOnly:false,MountPath:/var/log/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-lib,ReadOnly:false,MountPath:/var/lib/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jbsk7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-ovs-cdd7f_openstack(35efb2c2-6521-4f6f-a350-a4dc537ecaf8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 15:58:35 crc kubenswrapper[4730]: E0320 15:58:35.032595    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-ovs-cdd7f" podUID="35efb2c2-6521-4f6f-a350-a4dc537ecaf8"
Mar 20 15:58:35 crc kubenswrapper[4730]: E0320 15:58:35.065289    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.147:5001/podified-master-centos10/openstack-memcached:watcher_latest\\\"\"" pod="openstack/memcached-0" podUID="84bbdebb-43de-41d6-82d4-71b0948c25f8"
Mar 20 15:58:36 crc kubenswrapper[4730]: I0320 15:58:36.033992    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7847d45595-fnchx"
Mar 20 15:58:36 crc kubenswrapper[4730]: I0320 15:58:36.077303    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7847d45595-fnchx"
Mar 20 15:58:36 crc kubenswrapper[4730]: I0320 15:58:36.077438    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7847d45595-fnchx" event={"ID":"27dbbb52-2bd1-4e24-b621-128e7c880a2b","Type":"ContainerDied","Data":"a15e1598ce7b71e13d66b76d6a152b790714647ed13bd032c186f428dbc12d19"}
Mar 20 15:58:36 crc kubenswrapper[4730]: I0320 15:58:36.077477    4730 scope.go:117] "RemoveContainer" containerID="10de99a85cf21dfd026eabd639047a86231ef84f5643c16ab2ab663cf70834f5"
Mar 20 15:58:36 crc kubenswrapper[4730]: E0320 15:58:36.078692    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.147:5001/podified-master-centos10/openstack-ovn-base:watcher_latest\\\"\"" pod="openstack/ovn-controller-ovs-cdd7f" podUID="35efb2c2-6521-4f6f-a350-a4dc537ecaf8"
Mar 20 15:58:36 crc kubenswrapper[4730]: I0320 15:58:36.178772    4730 scope.go:117] "RemoveContainer" containerID="809194a2c64396f3bfaa32147cdb80cee474a4897b6ae0caac1fba9c383996f4"
Mar 20 15:58:36 crc kubenswrapper[4730]: I0320 15:58:36.215391    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27dbbb52-2bd1-4e24-b621-128e7c880a2b-config\") pod \"27dbbb52-2bd1-4e24-b621-128e7c880a2b\" (UID: \"27dbbb52-2bd1-4e24-b621-128e7c880a2b\") "
Mar 20 15:58:36 crc kubenswrapper[4730]: I0320 15:58:36.215554    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27dbbb52-2bd1-4e24-b621-128e7c880a2b-dns-svc\") pod \"27dbbb52-2bd1-4e24-b621-128e7c880a2b\" (UID: \"27dbbb52-2bd1-4e24-b621-128e7c880a2b\") "
Mar 20 15:58:36 crc kubenswrapper[4730]: I0320 15:58:36.215612    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgkz6\" (UniqueName: \"kubernetes.io/projected/27dbbb52-2bd1-4e24-b621-128e7c880a2b-kube-api-access-dgkz6\") pod \"27dbbb52-2bd1-4e24-b621-128e7c880a2b\" (UID: \"27dbbb52-2bd1-4e24-b621-128e7c880a2b\") "
Mar 20 15:58:36 crc kubenswrapper[4730]: I0320 15:58:36.219331    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27dbbb52-2bd1-4e24-b621-128e7c880a2b-kube-api-access-dgkz6" (OuterVolumeSpecName: "kube-api-access-dgkz6") pod "27dbbb52-2bd1-4e24-b621-128e7c880a2b" (UID: "27dbbb52-2bd1-4e24-b621-128e7c880a2b"). InnerVolumeSpecName "kube-api-access-dgkz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:58:36 crc kubenswrapper[4730]: I0320 15:58:36.253502    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27dbbb52-2bd1-4e24-b621-128e7c880a2b-config" (OuterVolumeSpecName: "config") pod "27dbbb52-2bd1-4e24-b621-128e7c880a2b" (UID: "27dbbb52-2bd1-4e24-b621-128e7c880a2b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:58:36 crc kubenswrapper[4730]: I0320 15:58:36.254075    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27dbbb52-2bd1-4e24-b621-128e7c880a2b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "27dbbb52-2bd1-4e24-b621-128e7c880a2b" (UID: "27dbbb52-2bd1-4e24-b621-128e7c880a2b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:58:36 crc kubenswrapper[4730]: I0320 15:58:36.318430    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgkz6\" (UniqueName: \"kubernetes.io/projected/27dbbb52-2bd1-4e24-b621-128e7c880a2b-kube-api-access-dgkz6\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:36 crc kubenswrapper[4730]: I0320 15:58:36.318459    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27dbbb52-2bd1-4e24-b621-128e7c880a2b-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:36 crc kubenswrapper[4730]: I0320 15:58:36.318468    4730 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27dbbb52-2bd1-4e24-b621-128e7c880a2b-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:36 crc kubenswrapper[4730]: I0320 15:58:36.413857    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7847d45595-fnchx"]
Mar 20 15:58:36 crc kubenswrapper[4730]: I0320 15:58:36.421622    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7847d45595-fnchx"]
Mar 20 15:58:37 crc kubenswrapper[4730]: I0320 15:58:37.561727    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27dbbb52-2bd1-4e24-b621-128e7c880a2b" path="/var/lib/kubelet/pods/27dbbb52-2bd1-4e24-b621-128e7c880a2b/volumes"
Mar 20 15:58:38 crc kubenswrapper[4730]: I0320 15:58:38.097238    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6abf778f-200f-4d48-97b6-08a638b4efa2","Type":"ContainerStarted","Data":"dc0fdc1377be29d3b722fe6d6bb77fb846f03d0f1df47f3a8d806d50aa0f28ac"}
Mar 20 15:58:38 crc kubenswrapper[4730]: I0320 15:58:38.101536    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gtrnp" event={"ID":"31651551-edb9-4793-a752-39fa60a85ee3","Type":"ContainerStarted","Data":"673a614f86cc9d53b20490455a209d43255459235454710313658c86d4b4fa1d"}
Mar 20 15:58:38 crc kubenswrapper[4730]: I0320 15:58:38.101896    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-gtrnp"
Mar 20 15:58:38 crc kubenswrapper[4730]: I0320 15:58:38.104413    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"899bd9ae-9354-4e70-ad37-b438a5a33a24","Type":"ContainerStarted","Data":"e083a94b7b0bad04baccc30cbbd4595d9bfb295fd7de2e8fc839c48d6d9ed2c7"}
Mar 20 15:58:38 crc kubenswrapper[4730]: I0320 15:58:38.181161    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-gtrnp" podStartSLOduration=7.423269801 podStartE2EDuration="20.18058131s" podCreationTimestamp="2026-03-20 15:58:18 +0000 UTC" firstStartedPulling="2026-03-20 15:58:23.161587562 +0000 UTC m=+1162.374958931" lastFinishedPulling="2026-03-20 15:58:35.918899071 +0000 UTC m=+1175.132270440" observedRunningTime="2026-03-20 15:58:38.168072894 +0000 UTC m=+1177.381444273" watchObservedRunningTime="2026-03-20 15:58:38.18058131 +0000 UTC m=+1177.393952679"
Mar 20 15:58:38 crc kubenswrapper[4730]: I0320 15:58:38.205399    4730 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 15:58:39 crc kubenswrapper[4730]: I0320 15:58:39.115017    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1ba8c36f-1882-4bb3-bcb5-b3518ce35553","Type":"ContainerStarted","Data":"d603b5b2ea36b221ffaacb3a93618a8977298365b92770e11ecdfad53d65ccc9"}
Mar 20 15:58:39 crc kubenswrapper[4730]: I0320 15:58:39.116861    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4938ac0e-1226-4f20-8f23-763b62b863c4","Type":"ContainerStarted","Data":"49b97730a0d8925f4b80deb7db0aeea30943ec9b5135c0345bbaf48837573e5f"}
Mar 20 15:58:39 crc kubenswrapper[4730]: I0320 15:58:39.117889    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 20 15:58:39 crc kubenswrapper[4730]: I0320 15:58:39.120630    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3","Type":"ContainerStarted","Data":"13985a1e2e3d58d396be0af6437cdcdb0bbdea54308502442707c077b36e9713"}
Mar 20 15:58:39 crc kubenswrapper[4730]: I0320 15:58:39.134095    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/notifications-rabbitmq-server-0" event={"ID":"df9ca02d-e20f-4f55-ba14-92b91812afb6","Type":"ContainerStarted","Data":"3d7e7a0cabaf1b38c1891734e06c9106a1e8c0af0454fdefaf773422d1dcf747"}
Mar 20 15:58:39 crc kubenswrapper[4730]: I0320 15:58:39.137588    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"caa1db28-afc0-4abc-aa80-84cccb3d8412","Type":"ContainerStarted","Data":"eacc6920f5467c8cf478da7a75cba21c23e90e10fceb1cccd444a405fb378d06"}
Mar 20 15:58:39 crc kubenswrapper[4730]: I0320 15:58:39.146805    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=10.32157212 podStartE2EDuration="25.146747851s" podCreationTimestamp="2026-03-20 15:58:14 +0000 UTC" firstStartedPulling="2026-03-20 15:58:23.035177151 +0000 UTC m=+1162.248548520" lastFinishedPulling="2026-03-20 15:58:37.860352882 +0000 UTC m=+1177.073724251" observedRunningTime="2026-03-20 15:58:39.139218617 +0000 UTC m=+1178.352589986" watchObservedRunningTime="2026-03-20 15:58:39.146747851 +0000 UTC m=+1178.360119220"
Mar 20 15:58:40 crc kubenswrapper[4730]: I0320 15:58:40.146739    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8043f69c-832c-4afa-a9b9-211507664805","Type":"ContainerStarted","Data":"5873082b81a5b9253ac47bf2bf3866502e40b3ccab836a111c3bd8134e015ee5"}
Mar 20 15:58:41 crc kubenswrapper[4730]: I0320 15:58:41.161618    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fdd3845a-3723-438f-aa58-606451baed6c","Type":"ContainerStarted","Data":"e4820a88fffd97776afd3e8f20ce1473d0c4e99acb7cc9e56c9e53eaef07563a"}
Mar 20 15:58:41 crc kubenswrapper[4730]: I0320 15:58:41.865386    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-ktnvd"]
Mar 20 15:58:41 crc kubenswrapper[4730]: E0320 15:58:41.865899    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a5c1062-c366-4407-a395-cc3ad80ed296" containerName="init"
Mar 20 15:58:41 crc kubenswrapper[4730]: I0320 15:58:41.865946    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a5c1062-c366-4407-a395-cc3ad80ed296" containerName="init"
Mar 20 15:58:41 crc kubenswrapper[4730]: E0320 15:58:41.866018    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27dbbb52-2bd1-4e24-b621-128e7c880a2b" containerName="dnsmasq-dns"
Mar 20 15:58:41 crc kubenswrapper[4730]: I0320 15:58:41.866026    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="27dbbb52-2bd1-4e24-b621-128e7c880a2b" containerName="dnsmasq-dns"
Mar 20 15:58:41 crc kubenswrapper[4730]: E0320 15:58:41.866057    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27dbbb52-2bd1-4e24-b621-128e7c880a2b" containerName="init"
Mar 20 15:58:41 crc kubenswrapper[4730]: I0320 15:58:41.866066    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="27dbbb52-2bd1-4e24-b621-128e7c880a2b" containerName="init"
Mar 20 15:58:41 crc kubenswrapper[4730]: I0320 15:58:41.868859    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="27dbbb52-2bd1-4e24-b621-128e7c880a2b" containerName="dnsmasq-dns"
Mar 20 15:58:41 crc kubenswrapper[4730]: I0320 15:58:41.868918    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a5c1062-c366-4407-a395-cc3ad80ed296" containerName="init"
Mar 20 15:58:41 crc kubenswrapper[4730]: I0320 15:58:41.872782    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-ktnvd"
Mar 20 15:58:41 crc kubenswrapper[4730]: I0320 15:58:41.875526    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Mar 20 15:58:41 crc kubenswrapper[4730]: I0320 15:58:41.890087    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-ktnvd"]
Mar 20 15:58:41 crc kubenswrapper[4730]: I0320 15:58:41.949913    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c615e3c6-d705-46e8-a1e7-c1c86df055f5-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ktnvd\" (UID: \"c615e3c6-d705-46e8-a1e7-c1c86df055f5\") " pod="openstack/ovn-controller-metrics-ktnvd"
Mar 20 15:58:41 crc kubenswrapper[4730]: I0320 15:58:41.949994    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/c615e3c6-d705-46e8-a1e7-c1c86df055f5-ovn-rundir\") pod \"ovn-controller-metrics-ktnvd\" (UID: \"c615e3c6-d705-46e8-a1e7-c1c86df055f5\") " pod="openstack/ovn-controller-metrics-ktnvd"
Mar 20 15:58:41 crc kubenswrapper[4730]: I0320 15:58:41.950016    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c615e3c6-d705-46e8-a1e7-c1c86df055f5-combined-ca-bundle\") pod \"ovn-controller-metrics-ktnvd\" (UID: \"c615e3c6-d705-46e8-a1e7-c1c86df055f5\") " pod="openstack/ovn-controller-metrics-ktnvd"
Mar 20 15:58:41 crc kubenswrapper[4730]: I0320 15:58:41.950123    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts9bc\" (UniqueName: \"kubernetes.io/projected/c615e3c6-d705-46e8-a1e7-c1c86df055f5-kube-api-access-ts9bc\") pod \"ovn-controller-metrics-ktnvd\" (UID: \"c615e3c6-d705-46e8-a1e7-c1c86df055f5\") " pod="openstack/ovn-controller-metrics-ktnvd"
Mar 20 15:58:41 crc kubenswrapper[4730]: I0320 15:58:41.950365    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c615e3c6-d705-46e8-a1e7-c1c86df055f5-config\") pod \"ovn-controller-metrics-ktnvd\" (UID: \"c615e3c6-d705-46e8-a1e7-c1c86df055f5\") " pod="openstack/ovn-controller-metrics-ktnvd"
Mar 20 15:58:41 crc kubenswrapper[4730]: I0320 15:58:41.950540    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c615e3c6-d705-46e8-a1e7-c1c86df055f5-ovs-rundir\") pod \"ovn-controller-metrics-ktnvd\" (UID: \"c615e3c6-d705-46e8-a1e7-c1c86df055f5\") " pod="openstack/ovn-controller-metrics-ktnvd"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.014293    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-88c65768c-zz9lq"]
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.015703    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-88c65768c-zz9lq"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.017472    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.029476    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-88c65768c-zz9lq"]
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.052492    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c615e3c6-d705-46e8-a1e7-c1c86df055f5-ovs-rundir\") pod \"ovn-controller-metrics-ktnvd\" (UID: \"c615e3c6-d705-46e8-a1e7-c1c86df055f5\") " pod="openstack/ovn-controller-metrics-ktnvd"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.052761    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c615e3c6-d705-46e8-a1e7-c1c86df055f5-ovs-rundir\") pod \"ovn-controller-metrics-ktnvd\" (UID: \"c615e3c6-d705-46e8-a1e7-c1c86df055f5\") " pod="openstack/ovn-controller-metrics-ktnvd"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.052838    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c615e3c6-d705-46e8-a1e7-c1c86df055f5-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ktnvd\" (UID: \"c615e3c6-d705-46e8-a1e7-c1c86df055f5\") " pod="openstack/ovn-controller-metrics-ktnvd"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.053600    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/c615e3c6-d705-46e8-a1e7-c1c86df055f5-ovn-rundir\") pod \"ovn-controller-metrics-ktnvd\" (UID: \"c615e3c6-d705-46e8-a1e7-c1c86df055f5\") " pod="openstack/ovn-controller-metrics-ktnvd"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.053628    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c615e3c6-d705-46e8-a1e7-c1c86df055f5-combined-ca-bundle\") pod \"ovn-controller-metrics-ktnvd\" (UID: \"c615e3c6-d705-46e8-a1e7-c1c86df055f5\") " pod="openstack/ovn-controller-metrics-ktnvd"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.053648    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts9bc\" (UniqueName: \"kubernetes.io/projected/c615e3c6-d705-46e8-a1e7-c1c86df055f5-kube-api-access-ts9bc\") pod \"ovn-controller-metrics-ktnvd\" (UID: \"c615e3c6-d705-46e8-a1e7-c1c86df055f5\") " pod="openstack/ovn-controller-metrics-ktnvd"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.053696    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c615e3c6-d705-46e8-a1e7-c1c86df055f5-config\") pod \"ovn-controller-metrics-ktnvd\" (UID: \"c615e3c6-d705-46e8-a1e7-c1c86df055f5\") " pod="openstack/ovn-controller-metrics-ktnvd"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.053774    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/c615e3c6-d705-46e8-a1e7-c1c86df055f5-ovn-rundir\") pod \"ovn-controller-metrics-ktnvd\" (UID: \"c615e3c6-d705-46e8-a1e7-c1c86df055f5\") " pod="openstack/ovn-controller-metrics-ktnvd"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.054238    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c615e3c6-d705-46e8-a1e7-c1c86df055f5-config\") pod \"ovn-controller-metrics-ktnvd\" (UID: \"c615e3c6-d705-46e8-a1e7-c1c86df055f5\") " pod="openstack/ovn-controller-metrics-ktnvd"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.068687    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c615e3c6-d705-46e8-a1e7-c1c86df055f5-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ktnvd\" (UID: \"c615e3c6-d705-46e8-a1e7-c1c86df055f5\") " pod="openstack/ovn-controller-metrics-ktnvd"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.069739    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c615e3c6-d705-46e8-a1e7-c1c86df055f5-combined-ca-bundle\") pod \"ovn-controller-metrics-ktnvd\" (UID: \"c615e3c6-d705-46e8-a1e7-c1c86df055f5\") " pod="openstack/ovn-controller-metrics-ktnvd"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.085630    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts9bc\" (UniqueName: \"kubernetes.io/projected/c615e3c6-d705-46e8-a1e7-c1c86df055f5-kube-api-access-ts9bc\") pod \"ovn-controller-metrics-ktnvd\" (UID: \"c615e3c6-d705-46e8-a1e7-c1c86df055f5\") " pod="openstack/ovn-controller-metrics-ktnvd"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.155517    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae8a0d34-0de8-464b-b533-a3eb56c320f6-ovsdbserver-nb\") pod \"dnsmasq-dns-88c65768c-zz9lq\" (UID: \"ae8a0d34-0de8-464b-b533-a3eb56c320f6\") " pod="openstack/dnsmasq-dns-88c65768c-zz9lq"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.155707    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae8a0d34-0de8-464b-b533-a3eb56c320f6-config\") pod \"dnsmasq-dns-88c65768c-zz9lq\" (UID: \"ae8a0d34-0de8-464b-b533-a3eb56c320f6\") " pod="openstack/dnsmasq-dns-88c65768c-zz9lq"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.155771    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87jfq\" (UniqueName: \"kubernetes.io/projected/ae8a0d34-0de8-464b-b533-a3eb56c320f6-kube-api-access-87jfq\") pod \"dnsmasq-dns-88c65768c-zz9lq\" (UID: \"ae8a0d34-0de8-464b-b533-a3eb56c320f6\") " pod="openstack/dnsmasq-dns-88c65768c-zz9lq"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.156011    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae8a0d34-0de8-464b-b533-a3eb56c320f6-dns-svc\") pod \"dnsmasq-dns-88c65768c-zz9lq\" (UID: \"ae8a0d34-0de8-464b-b533-a3eb56c320f6\") " pod="openstack/dnsmasq-dns-88c65768c-zz9lq"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.201508    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-ktnvd"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.208850    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-88c65768c-zz9lq"]
Mar 20 15:58:42 crc kubenswrapper[4730]: E0320 15:58:42.209363    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-87jfq ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-88c65768c-zz9lq" podUID="ae8a0d34-0de8-464b-b533-a3eb56c320f6"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.258114    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae8a0d34-0de8-464b-b533-a3eb56c320f6-config\") pod \"dnsmasq-dns-88c65768c-zz9lq\" (UID: \"ae8a0d34-0de8-464b-b533-a3eb56c320f6\") " pod="openstack/dnsmasq-dns-88c65768c-zz9lq"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.258166    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87jfq\" (UniqueName: \"kubernetes.io/projected/ae8a0d34-0de8-464b-b533-a3eb56c320f6-kube-api-access-87jfq\") pod \"dnsmasq-dns-88c65768c-zz9lq\" (UID: \"ae8a0d34-0de8-464b-b533-a3eb56c320f6\") " pod="openstack/dnsmasq-dns-88c65768c-zz9lq"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.258234    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae8a0d34-0de8-464b-b533-a3eb56c320f6-dns-svc\") pod \"dnsmasq-dns-88c65768c-zz9lq\" (UID: \"ae8a0d34-0de8-464b-b533-a3eb56c320f6\") " pod="openstack/dnsmasq-dns-88c65768c-zz9lq"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.258375    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae8a0d34-0de8-464b-b533-a3eb56c320f6-ovsdbserver-nb\") pod \"dnsmasq-dns-88c65768c-zz9lq\" (UID: \"ae8a0d34-0de8-464b-b533-a3eb56c320f6\") " pod="openstack/dnsmasq-dns-88c65768c-zz9lq"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.259875    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae8a0d34-0de8-464b-b533-a3eb56c320f6-dns-svc\") pod \"dnsmasq-dns-88c65768c-zz9lq\" (UID: \"ae8a0d34-0de8-464b-b533-a3eb56c320f6\") " pod="openstack/dnsmasq-dns-88c65768c-zz9lq"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.259945    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae8a0d34-0de8-464b-b533-a3eb56c320f6-config\") pod \"dnsmasq-dns-88c65768c-zz9lq\" (UID: \"ae8a0d34-0de8-464b-b533-a3eb56c320f6\") " pod="openstack/dnsmasq-dns-88c65768c-zz9lq"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.261861    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae8a0d34-0de8-464b-b533-a3eb56c320f6-ovsdbserver-nb\") pod \"dnsmasq-dns-88c65768c-zz9lq\" (UID: \"ae8a0d34-0de8-464b-b533-a3eb56c320f6\") " pod="openstack/dnsmasq-dns-88c65768c-zz9lq"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.268744    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-648686d659-c5gtt"]
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.270103    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-648686d659-c5gtt"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.271772    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.284113    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-648686d659-c5gtt"]
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.303720    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87jfq\" (UniqueName: \"kubernetes.io/projected/ae8a0d34-0de8-464b-b533-a3eb56c320f6-kube-api-access-87jfq\") pod \"dnsmasq-dns-88c65768c-zz9lq\" (UID: \"ae8a0d34-0de8-464b-b533-a3eb56c320f6\") " pod="openstack/dnsmasq-dns-88c65768c-zz9lq"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.359681    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68e4e5c3-825d-477e-a403-0cae45a86806-ovsdbserver-sb\") pod \"dnsmasq-dns-648686d659-c5gtt\" (UID: \"68e4e5c3-825d-477e-a403-0cae45a86806\") " pod="openstack/dnsmasq-dns-648686d659-c5gtt"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.359975    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68e4e5c3-825d-477e-a403-0cae45a86806-ovsdbserver-nb\") pod \"dnsmasq-dns-648686d659-c5gtt\" (UID: \"68e4e5c3-825d-477e-a403-0cae45a86806\") " pod="openstack/dnsmasq-dns-648686d659-c5gtt"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.360036    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shdlr\" (UniqueName: \"kubernetes.io/projected/68e4e5c3-825d-477e-a403-0cae45a86806-kube-api-access-shdlr\") pod \"dnsmasq-dns-648686d659-c5gtt\" (UID: \"68e4e5c3-825d-477e-a403-0cae45a86806\") " pod="openstack/dnsmasq-dns-648686d659-c5gtt"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.360080    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68e4e5c3-825d-477e-a403-0cae45a86806-config\") pod \"dnsmasq-dns-648686d659-c5gtt\" (UID: \"68e4e5c3-825d-477e-a403-0cae45a86806\") " pod="openstack/dnsmasq-dns-648686d659-c5gtt"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.360101    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68e4e5c3-825d-477e-a403-0cae45a86806-dns-svc\") pod \"dnsmasq-dns-648686d659-c5gtt\" (UID: \"68e4e5c3-825d-477e-a403-0cae45a86806\") " pod="openstack/dnsmasq-dns-648686d659-c5gtt"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.463536    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68e4e5c3-825d-477e-a403-0cae45a86806-ovsdbserver-nb\") pod \"dnsmasq-dns-648686d659-c5gtt\" (UID: \"68e4e5c3-825d-477e-a403-0cae45a86806\") " pod="openstack/dnsmasq-dns-648686d659-c5gtt"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.463645    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shdlr\" (UniqueName: \"kubernetes.io/projected/68e4e5c3-825d-477e-a403-0cae45a86806-kube-api-access-shdlr\") pod \"dnsmasq-dns-648686d659-c5gtt\" (UID: \"68e4e5c3-825d-477e-a403-0cae45a86806\") " pod="openstack/dnsmasq-dns-648686d659-c5gtt"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.463720    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68e4e5c3-825d-477e-a403-0cae45a86806-config\") pod \"dnsmasq-dns-648686d659-c5gtt\" (UID: \"68e4e5c3-825d-477e-a403-0cae45a86806\") " pod="openstack/dnsmasq-dns-648686d659-c5gtt"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.463746    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68e4e5c3-825d-477e-a403-0cae45a86806-dns-svc\") pod \"dnsmasq-dns-648686d659-c5gtt\" (UID: \"68e4e5c3-825d-477e-a403-0cae45a86806\") " pod="openstack/dnsmasq-dns-648686d659-c5gtt"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.463813    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68e4e5c3-825d-477e-a403-0cae45a86806-ovsdbserver-sb\") pod \"dnsmasq-dns-648686d659-c5gtt\" (UID: \"68e4e5c3-825d-477e-a403-0cae45a86806\") " pod="openstack/dnsmasq-dns-648686d659-c5gtt"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.464831    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68e4e5c3-825d-477e-a403-0cae45a86806-ovsdbserver-sb\") pod \"dnsmasq-dns-648686d659-c5gtt\" (UID: \"68e4e5c3-825d-477e-a403-0cae45a86806\") " pod="openstack/dnsmasq-dns-648686d659-c5gtt"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.465464    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68e4e5c3-825d-477e-a403-0cae45a86806-ovsdbserver-nb\") pod \"dnsmasq-dns-648686d659-c5gtt\" (UID: \"68e4e5c3-825d-477e-a403-0cae45a86806\") " pod="openstack/dnsmasq-dns-648686d659-c5gtt"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.466594    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68e4e5c3-825d-477e-a403-0cae45a86806-config\") pod \"dnsmasq-dns-648686d659-c5gtt\" (UID: \"68e4e5c3-825d-477e-a403-0cae45a86806\") " pod="openstack/dnsmasq-dns-648686d659-c5gtt"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.467375    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68e4e5c3-825d-477e-a403-0cae45a86806-dns-svc\") pod \"dnsmasq-dns-648686d659-c5gtt\" (UID: \"68e4e5c3-825d-477e-a403-0cae45a86806\") " pod="openstack/dnsmasq-dns-648686d659-c5gtt"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.482613    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shdlr\" (UniqueName: \"kubernetes.io/projected/68e4e5c3-825d-477e-a403-0cae45a86806-kube-api-access-shdlr\") pod \"dnsmasq-dns-648686d659-c5gtt\" (UID: \"68e4e5c3-825d-477e-a403-0cae45a86806\") " pod="openstack/dnsmasq-dns-648686d659-c5gtt"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.695734    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-648686d659-c5gtt"
Mar 20 15:58:42 crc kubenswrapper[4730]: I0320 15:58:42.713436    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-ktnvd"]
Mar 20 15:58:42 crc kubenswrapper[4730]: W0320 15:58:42.749211    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc615e3c6_d705_46e8_a1e7_c1c86df055f5.slice/crio-0e876e10d4381f7f141a114030c7addb898e1a59d3f1686111f8b89ec552304b WatchSource:0}: Error finding container 0e876e10d4381f7f141a114030c7addb898e1a59d3f1686111f8b89ec552304b: Status 404 returned error can't find the container with id 0e876e10d4381f7f141a114030c7addb898e1a59d3f1686111f8b89ec552304b
Mar 20 15:58:43 crc kubenswrapper[4730]: I0320 15:58:43.155355    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-648686d659-c5gtt"]
Mar 20 15:58:43 crc kubenswrapper[4730]: I0320 15:58:43.173646    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ktnvd" event={"ID":"c615e3c6-d705-46e8-a1e7-c1c86df055f5","Type":"ContainerStarted","Data":"547d6dc86725d6e5ac2298d66636d1679015d01ffce4088ea8ef37a09f0ab2df"}
Mar 20 15:58:43 crc kubenswrapper[4730]: I0320 15:58:43.173691    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ktnvd" event={"ID":"c615e3c6-d705-46e8-a1e7-c1c86df055f5","Type":"ContainerStarted","Data":"0e876e10d4381f7f141a114030c7addb898e1a59d3f1686111f8b89ec552304b"}
Mar 20 15:58:43 crc kubenswrapper[4730]: I0320 15:58:43.177411    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"caa1db28-afc0-4abc-aa80-84cccb3d8412","Type":"ContainerStarted","Data":"6320c4e2aa45fff9f52c077f4d0c3c449c955705c38ad50993cfbe211a48dbde"}
Mar 20 15:58:43 crc kubenswrapper[4730]: I0320 15:58:43.181603    4730 generic.go:334] "Generic (PLEG): container finished" podID="899bd9ae-9354-4e70-ad37-b438a5a33a24" containerID="e083a94b7b0bad04baccc30cbbd4595d9bfb295fd7de2e8fc839c48d6d9ed2c7" exitCode=0
Mar 20 15:58:43 crc kubenswrapper[4730]: I0320 15:58:43.181832    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"899bd9ae-9354-4e70-ad37-b438a5a33a24","Type":"ContainerDied","Data":"e083a94b7b0bad04baccc30cbbd4595d9bfb295fd7de2e8fc839c48d6d9ed2c7"}
Mar 20 15:58:43 crc kubenswrapper[4730]: I0320 15:58:43.188340    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-88c65768c-zz9lq"
Mar 20 15:58:43 crc kubenswrapper[4730]: I0320 15:58:43.189267    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1ba8c36f-1882-4bb3-bcb5-b3518ce35553","Type":"ContainerStarted","Data":"a49057e55a1263fc2d8b9406b94c75356e303544000ffcd8d57b91b5449f641f"}
Mar 20 15:58:43 crc kubenswrapper[4730]: I0320 15:58:43.204650    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-ktnvd" podStartSLOduration=2.204631393 podStartE2EDuration="2.204631393s" podCreationTimestamp="2026-03-20 15:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:58:43.192117588 +0000 UTC m=+1182.405488957" watchObservedRunningTime="2026-03-20 15:58:43.204631393 +0000 UTC m=+1182.418002762"
Mar 20 15:58:43 crc kubenswrapper[4730]: I0320 15:58:43.207145    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Mar 20 15:58:43 crc kubenswrapper[4730]: I0320 15:58:43.239271    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=8.113116779 podStartE2EDuration="26.239242167s" podCreationTimestamp="2026-03-20 15:58:17 +0000 UTC" firstStartedPulling="2026-03-20 15:58:24.197441332 +0000 UTC m=+1163.410812701" lastFinishedPulling="2026-03-20 15:58:42.32356672 +0000 UTC m=+1181.536938089" observedRunningTime="2026-03-20 15:58:43.229121599 +0000 UTC m=+1182.442492968" watchObservedRunningTime="2026-03-20 15:58:43.239242167 +0000 UTC m=+1182.452613526"
Mar 20 15:58:43 crc kubenswrapper[4730]: I0320 15:58:43.262450    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Mar 20 15:58:43 crc kubenswrapper[4730]: I0320 15:58:43.300390    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-88c65768c-zz9lq"
Mar 20 15:58:43 crc kubenswrapper[4730]: I0320 15:58:43.315698    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.187637184 podStartE2EDuration="23.315682628s" podCreationTimestamp="2026-03-20 15:58:20 +0000 UTC" firstStartedPulling="2026-03-20 15:58:23.264532657 +0000 UTC m=+1162.477904026" lastFinishedPulling="2026-03-20 15:58:42.392578101 +0000 UTC m=+1181.605949470" observedRunningTime="2026-03-20 15:58:43.298419508 +0000 UTC m=+1182.511790877" watchObservedRunningTime="2026-03-20 15:58:43.315682628 +0000 UTC m=+1182.529053997"
Mar 20 15:58:43 crc kubenswrapper[4730]: I0320 15:58:43.482302    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae8a0d34-0de8-464b-b533-a3eb56c320f6-dns-svc\") pod \"ae8a0d34-0de8-464b-b533-a3eb56c320f6\" (UID: \"ae8a0d34-0de8-464b-b533-a3eb56c320f6\") "
Mar 20 15:58:43 crc kubenswrapper[4730]: I0320 15:58:43.482400    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae8a0d34-0de8-464b-b533-a3eb56c320f6-ovsdbserver-nb\") pod \"ae8a0d34-0de8-464b-b533-a3eb56c320f6\" (UID: \"ae8a0d34-0de8-464b-b533-a3eb56c320f6\") "
Mar 20 15:58:43 crc kubenswrapper[4730]: I0320 15:58:43.482445    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87jfq\" (UniqueName: \"kubernetes.io/projected/ae8a0d34-0de8-464b-b533-a3eb56c320f6-kube-api-access-87jfq\") pod \"ae8a0d34-0de8-464b-b533-a3eb56c320f6\" (UID: \"ae8a0d34-0de8-464b-b533-a3eb56c320f6\") "
Mar 20 15:58:43 crc kubenswrapper[4730]: I0320 15:58:43.482517    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae8a0d34-0de8-464b-b533-a3eb56c320f6-config\") pod \"ae8a0d34-0de8-464b-b533-a3eb56c320f6\" (UID: \"ae8a0d34-0de8-464b-b533-a3eb56c320f6\") "
Mar 20 15:58:43 crc kubenswrapper[4730]: I0320 15:58:43.482860    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae8a0d34-0de8-464b-b533-a3eb56c320f6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ae8a0d34-0de8-464b-b533-a3eb56c320f6" (UID: "ae8a0d34-0de8-464b-b533-a3eb56c320f6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:58:43 crc kubenswrapper[4730]: I0320 15:58:43.483015    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae8a0d34-0de8-464b-b533-a3eb56c320f6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ae8a0d34-0de8-464b-b533-a3eb56c320f6" (UID: "ae8a0d34-0de8-464b-b533-a3eb56c320f6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:58:43 crc kubenswrapper[4730]: I0320 15:58:43.483269    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae8a0d34-0de8-464b-b533-a3eb56c320f6-config" (OuterVolumeSpecName: "config") pod "ae8a0d34-0de8-464b-b533-a3eb56c320f6" (UID: "ae8a0d34-0de8-464b-b533-a3eb56c320f6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:58:43 crc kubenswrapper[4730]: I0320 15:58:43.487863    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae8a0d34-0de8-464b-b533-a3eb56c320f6-kube-api-access-87jfq" (OuterVolumeSpecName: "kube-api-access-87jfq") pod "ae8a0d34-0de8-464b-b533-a3eb56c320f6" (UID: "ae8a0d34-0de8-464b-b533-a3eb56c320f6"). InnerVolumeSpecName "kube-api-access-87jfq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:58:43 crc kubenswrapper[4730]: I0320 15:58:43.584094    4730 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae8a0d34-0de8-464b-b533-a3eb56c320f6-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:43 crc kubenswrapper[4730]: I0320 15:58:43.584127    4730 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae8a0d34-0de8-464b-b533-a3eb56c320f6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:43 crc kubenswrapper[4730]: I0320 15:58:43.584139    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87jfq\" (UniqueName: \"kubernetes.io/projected/ae8a0d34-0de8-464b-b533-a3eb56c320f6-kube-api-access-87jfq\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:43 crc kubenswrapper[4730]: I0320 15:58:43.584148    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae8a0d34-0de8-464b-b533-a3eb56c320f6-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:43 crc kubenswrapper[4730]: I0320 15:58:43.801355    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Mar 20 15:58:44 crc kubenswrapper[4730]: I0320 15:58:44.198208    4730 generic.go:334] "Generic (PLEG): container finished" podID="68e4e5c3-825d-477e-a403-0cae45a86806" containerID="c68e3d34495e1743eae1f1377e46b36b943735fc7af94f6c4a34f2182614de74" exitCode=0
Mar 20 15:58:44 crc kubenswrapper[4730]: I0320 15:58:44.198345    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-648686d659-c5gtt" event={"ID":"68e4e5c3-825d-477e-a403-0cae45a86806","Type":"ContainerDied","Data":"c68e3d34495e1743eae1f1377e46b36b943735fc7af94f6c4a34f2182614de74"}
Mar 20 15:58:44 crc kubenswrapper[4730]: I0320 15:58:44.198374    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-648686d659-c5gtt" event={"ID":"68e4e5c3-825d-477e-a403-0cae45a86806","Type":"ContainerStarted","Data":"4818c9171456a6ff54f39a98823d5ad13ee602182702fbabd7963c0de43a3bf8"}
Mar 20 15:58:44 crc kubenswrapper[4730]: I0320 15:58:44.200119    4730 generic.go:334] "Generic (PLEG): container finished" podID="6abf778f-200f-4d48-97b6-08a638b4efa2" containerID="dc0fdc1377be29d3b722fe6d6bb77fb846f03d0f1df47f3a8d806d50aa0f28ac" exitCode=0
Mar 20 15:58:44 crc kubenswrapper[4730]: I0320 15:58:44.200165    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6abf778f-200f-4d48-97b6-08a638b4efa2","Type":"ContainerDied","Data":"dc0fdc1377be29d3b722fe6d6bb77fb846f03d0f1df47f3a8d806d50aa0f28ac"}
Mar 20 15:58:44 crc kubenswrapper[4730]: I0320 15:58:44.202387    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"899bd9ae-9354-4e70-ad37-b438a5a33a24","Type":"ContainerStarted","Data":"e3a213186b82a3bbc61ecd10d2f4aed4539fa2830e21ca4c0bb702d6f7bc640b"}
Mar 20 15:58:44 crc kubenswrapper[4730]: I0320 15:58:44.202491    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-88c65768c-zz9lq"
Mar 20 15:58:44 crc kubenswrapper[4730]: I0320 15:58:44.204293    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Mar 20 15:58:44 crc kubenswrapper[4730]: I0320 15:58:44.274335    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-88c65768c-zz9lq"]
Mar 20 15:58:44 crc kubenswrapper[4730]: I0320 15:58:44.288166    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-88c65768c-zz9lq"]
Mar 20 15:58:44 crc kubenswrapper[4730]: I0320 15:58:44.288244    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Mar 20 15:58:44 crc kubenswrapper[4730]: I0320 15:58:44.317323    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=20.416300375 podStartE2EDuration="33.317303697s" podCreationTimestamp="2026-03-20 15:58:11 +0000 UTC" firstStartedPulling="2026-03-20 15:58:23.116488861 +0000 UTC m=+1162.329860230" lastFinishedPulling="2026-03-20 15:58:36.017492173 +0000 UTC m=+1175.230863552" observedRunningTime="2026-03-20 15:58:44.312754857 +0000 UTC m=+1183.526126236" watchObservedRunningTime="2026-03-20 15:58:44.317303697 +0000 UTC m=+1183.530675066"
Mar 20 15:58:45 crc kubenswrapper[4730]: I0320 15:58:45.211273    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6abf778f-200f-4d48-97b6-08a638b4efa2","Type":"ContainerStarted","Data":"1dca84c6132b08065c2467e557f8c71c754b08f037a4e15bc1f4b58a30cfcf02"}
Mar 20 15:58:45 crc kubenswrapper[4730]: I0320 15:58:45.213806    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-648686d659-c5gtt" event={"ID":"68e4e5c3-825d-477e-a403-0cae45a86806","Type":"ContainerStarted","Data":"fe73524182281c96c8ff70a33be500f38666a259e3cec724e30e6c153a1251ae"}
Mar 20 15:58:45 crc kubenswrapper[4730]: I0320 15:58:45.214106    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-648686d659-c5gtt"
Mar 20 15:58:45 crc kubenswrapper[4730]: I0320 15:58:45.220933    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Mar 20 15:58:45 crc kubenswrapper[4730]: I0320 15:58:45.251800    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=21.862043826 podStartE2EDuration="36.251779926s" podCreationTimestamp="2026-03-20 15:58:09 +0000 UTC" firstStartedPulling="2026-03-20 15:58:22.199041764 +0000 UTC m=+1161.412413133" lastFinishedPulling="2026-03-20 15:58:36.588777864 +0000 UTC m=+1175.802149233" observedRunningTime="2026-03-20 15:58:45.238074978 +0000 UTC m=+1184.451446347" watchObservedRunningTime="2026-03-20 15:58:45.251779926 +0000 UTC m=+1184.465151295"
Mar 20 15:58:45 crc kubenswrapper[4730]: I0320 15:58:45.542867    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae8a0d34-0de8-464b-b533-a3eb56c320f6" path="/var/lib/kubelet/pods/ae8a0d34-0de8-464b-b533-a3eb56c320f6/volumes"
Mar 20 15:58:45 crc kubenswrapper[4730]: I0320 15:58:45.800972    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Mar 20 15:58:45 crc kubenswrapper[4730]: I0320 15:58:45.840886    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Mar 20 15:58:45 crc kubenswrapper[4730]: I0320 15:58:45.865594    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-648686d659-c5gtt" podStartSLOduration=3.865573805 podStartE2EDuration="3.865573805s" podCreationTimestamp="2026-03-20 15:58:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:58:45.303870716 +0000 UTC m=+1184.517242085" watchObservedRunningTime="2026-03-20 15:58:45.865573805 +0000 UTC m=+1185.078945174"
Mar 20 15:58:46 crc kubenswrapper[4730]: I0320 15:58:46.222002    4730 generic.go:334] "Generic (PLEG): container finished" podID="fdd3845a-3723-438f-aa58-606451baed6c" containerID="e4820a88fffd97776afd3e8f20ce1473d0c4e99acb7cc9e56c9e53eaef07563a" exitCode=0
Mar 20 15:58:46 crc kubenswrapper[4730]: I0320 15:58:46.222079    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fdd3845a-3723-438f-aa58-606451baed6c","Type":"ContainerDied","Data":"e4820a88fffd97776afd3e8f20ce1473d0c4e99acb7cc9e56c9e53eaef07563a"}
Mar 20 15:58:46 crc kubenswrapper[4730]: I0320 15:58:46.269810    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Mar 20 15:58:46 crc kubenswrapper[4730]: I0320 15:58:46.525933    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Mar 20 15:58:46 crc kubenswrapper[4730]: I0320 15:58:46.527384    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Mar 20 15:58:46 crc kubenswrapper[4730]: I0320 15:58:46.529389    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Mar 20 15:58:46 crc kubenswrapper[4730]: I0320 15:58:46.529785    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Mar 20 15:58:46 crc kubenswrapper[4730]: I0320 15:58:46.530084    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Mar 20 15:58:46 crc kubenswrapper[4730]: I0320 15:58:46.530448    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-pqq9k"
Mar 20 15:58:46 crc kubenswrapper[4730]: I0320 15:58:46.541785    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Mar 20 15:58:46 crc kubenswrapper[4730]: I0320 15:58:46.632814    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cb9ef9a-6d98-43c1-8e74-7f24ba39357d-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"4cb9ef9a-6d98-43c1-8e74-7f24ba39357d\") " pod="openstack/ovn-northd-0"
Mar 20 15:58:46 crc kubenswrapper[4730]: I0320 15:58:46.632861    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cb9ef9a-6d98-43c1-8e74-7f24ba39357d-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"4cb9ef9a-6d98-43c1-8e74-7f24ba39357d\") " pod="openstack/ovn-northd-0"
Mar 20 15:58:46 crc kubenswrapper[4730]: I0320 15:58:46.632909    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4cb9ef9a-6d98-43c1-8e74-7f24ba39357d-scripts\") pod \"ovn-northd-0\" (UID: \"4cb9ef9a-6d98-43c1-8e74-7f24ba39357d\") " pod="openstack/ovn-northd-0"
Mar 20 15:58:46 crc kubenswrapper[4730]: I0320 15:58:46.632932    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cb9ef9a-6d98-43c1-8e74-7f24ba39357d-config\") pod \"ovn-northd-0\" (UID: \"4cb9ef9a-6d98-43c1-8e74-7f24ba39357d\") " pod="openstack/ovn-northd-0"
Mar 20 15:58:46 crc kubenswrapper[4730]: I0320 15:58:46.632971    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cb9ef9a-6d98-43c1-8e74-7f24ba39357d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4cb9ef9a-6d98-43c1-8e74-7f24ba39357d\") " pod="openstack/ovn-northd-0"
Mar 20 15:58:46 crc kubenswrapper[4730]: I0320 15:58:46.633038    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4v6j\" (UniqueName: \"kubernetes.io/projected/4cb9ef9a-6d98-43c1-8e74-7f24ba39357d-kube-api-access-p4v6j\") pod \"ovn-northd-0\" (UID: \"4cb9ef9a-6d98-43c1-8e74-7f24ba39357d\") " pod="openstack/ovn-northd-0"
Mar 20 15:58:46 crc kubenswrapper[4730]: I0320 15:58:46.633053    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4cb9ef9a-6d98-43c1-8e74-7f24ba39357d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4cb9ef9a-6d98-43c1-8e74-7f24ba39357d\") " pod="openstack/ovn-northd-0"
Mar 20 15:58:46 crc kubenswrapper[4730]: I0320 15:58:46.734268    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cb9ef9a-6d98-43c1-8e74-7f24ba39357d-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"4cb9ef9a-6d98-43c1-8e74-7f24ba39357d\") " pod="openstack/ovn-northd-0"
Mar 20 15:58:46 crc kubenswrapper[4730]: I0320 15:58:46.734531    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cb9ef9a-6d98-43c1-8e74-7f24ba39357d-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"4cb9ef9a-6d98-43c1-8e74-7f24ba39357d\") " pod="openstack/ovn-northd-0"
Mar 20 15:58:46 crc kubenswrapper[4730]: I0320 15:58:46.734566    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4cb9ef9a-6d98-43c1-8e74-7f24ba39357d-scripts\") pod \"ovn-northd-0\" (UID: \"4cb9ef9a-6d98-43c1-8e74-7f24ba39357d\") " pod="openstack/ovn-northd-0"
Mar 20 15:58:46 crc kubenswrapper[4730]: I0320 15:58:46.734583    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cb9ef9a-6d98-43c1-8e74-7f24ba39357d-config\") pod \"ovn-northd-0\" (UID: \"4cb9ef9a-6d98-43c1-8e74-7f24ba39357d\") " pod="openstack/ovn-northd-0"
Mar 20 15:58:46 crc kubenswrapper[4730]: I0320 15:58:46.734603    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cb9ef9a-6d98-43c1-8e74-7f24ba39357d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4cb9ef9a-6d98-43c1-8e74-7f24ba39357d\") " pod="openstack/ovn-northd-0"
Mar 20 15:58:46 crc kubenswrapper[4730]: I0320 15:58:46.734637    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4v6j\" (UniqueName: \"kubernetes.io/projected/4cb9ef9a-6d98-43c1-8e74-7f24ba39357d-kube-api-access-p4v6j\") pod \"ovn-northd-0\" (UID: \"4cb9ef9a-6d98-43c1-8e74-7f24ba39357d\") " pod="openstack/ovn-northd-0"
Mar 20 15:58:46 crc kubenswrapper[4730]: I0320 15:58:46.734655    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4cb9ef9a-6d98-43c1-8e74-7f24ba39357d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4cb9ef9a-6d98-43c1-8e74-7f24ba39357d\") " pod="openstack/ovn-northd-0"
Mar 20 15:58:46 crc kubenswrapper[4730]: I0320 15:58:46.735031    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4cb9ef9a-6d98-43c1-8e74-7f24ba39357d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4cb9ef9a-6d98-43c1-8e74-7f24ba39357d\") " pod="openstack/ovn-northd-0"
Mar 20 15:58:46 crc kubenswrapper[4730]: I0320 15:58:46.735607    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cb9ef9a-6d98-43c1-8e74-7f24ba39357d-config\") pod \"ovn-northd-0\" (UID: \"4cb9ef9a-6d98-43c1-8e74-7f24ba39357d\") " pod="openstack/ovn-northd-0"
Mar 20 15:58:46 crc kubenswrapper[4730]: I0320 15:58:46.735723    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4cb9ef9a-6d98-43c1-8e74-7f24ba39357d-scripts\") pod \"ovn-northd-0\" (UID: \"4cb9ef9a-6d98-43c1-8e74-7f24ba39357d\") " pod="openstack/ovn-northd-0"
Mar 20 15:58:46 crc kubenswrapper[4730]: I0320 15:58:46.738988    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cb9ef9a-6d98-43c1-8e74-7f24ba39357d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4cb9ef9a-6d98-43c1-8e74-7f24ba39357d\") " pod="openstack/ovn-northd-0"
Mar 20 15:58:46 crc kubenswrapper[4730]: I0320 15:58:46.739164    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cb9ef9a-6d98-43c1-8e74-7f24ba39357d-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"4cb9ef9a-6d98-43c1-8e74-7f24ba39357d\") " pod="openstack/ovn-northd-0"
Mar 20 15:58:46 crc kubenswrapper[4730]: I0320 15:58:46.739410    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cb9ef9a-6d98-43c1-8e74-7f24ba39357d-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"4cb9ef9a-6d98-43c1-8e74-7f24ba39357d\") " pod="openstack/ovn-northd-0"
Mar 20 15:58:46 crc kubenswrapper[4730]: I0320 15:58:46.757906    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4v6j\" (UniqueName: \"kubernetes.io/projected/4cb9ef9a-6d98-43c1-8e74-7f24ba39357d-kube-api-access-p4v6j\") pod \"ovn-northd-0\" (UID: \"4cb9ef9a-6d98-43c1-8e74-7f24ba39357d\") " pod="openstack/ovn-northd-0"
Mar 20 15:58:46 crc kubenswrapper[4730]: I0320 15:58:46.855580    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Mar 20 15:58:47 crc kubenswrapper[4730]: I0320 15:58:47.232017    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"84bbdebb-43de-41d6-82d4-71b0948c25f8","Type":"ContainerStarted","Data":"1f77c44861846d42519203e36df51ca7effaf39c8d087cdf7c5bea5217b97755"}
Mar 20 15:58:47 crc kubenswrapper[4730]: I0320 15:58:47.232421    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Mar 20 15:58:47 crc kubenswrapper[4730]: I0320 15:58:47.251047    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=11.433395575 podStartE2EDuration="35.251031119s" podCreationTimestamp="2026-03-20 15:58:12 +0000 UTC" firstStartedPulling="2026-03-20 15:58:22.784471978 +0000 UTC m=+1161.997843347" lastFinishedPulling="2026-03-20 15:58:46.602107522 +0000 UTC m=+1185.815478891" observedRunningTime="2026-03-20 15:58:47.249855675 +0000 UTC m=+1186.463227044" watchObservedRunningTime="2026-03-20 15:58:47.251031119 +0000 UTC m=+1186.464402478"
Mar 20 15:58:47 crc kubenswrapper[4730]: I0320 15:58:47.375395    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Mar 20 15:58:48 crc kubenswrapper[4730]: E0320 15:58:48.216610    4730 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.162:46540->38.102.83.162:41077: write tcp 38.102.83.162:46540->38.102.83.162:41077: write: broken pipe
Mar 20 15:58:48 crc kubenswrapper[4730]: I0320 15:58:48.241877    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4cb9ef9a-6d98-43c1-8e74-7f24ba39357d","Type":"ContainerStarted","Data":"3b3e82e8c77b75fcffe50c4ddfbc5b744895a2b18d1f5d0d563a3717927f7732"}
Mar 20 15:58:48 crc kubenswrapper[4730]: I0320 15:58:48.241915    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4cb9ef9a-6d98-43c1-8e74-7f24ba39357d","Type":"ContainerStarted","Data":"69dd39cf6401b28114536c13981581ad290f1afdc3ad0ec83535c5b6d2123db8"}
Mar 20 15:58:49 crc kubenswrapper[4730]: I0320 15:58:49.249762    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4cb9ef9a-6d98-43c1-8e74-7f24ba39357d","Type":"ContainerStarted","Data":"964e8576b3554110ae20790b5688f31bd138746068e3fe025ffb791e18a75b8e"}
Mar 20 15:58:49 crc kubenswrapper[4730]: I0320 15:58:49.250848    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Mar 20 15:58:49 crc kubenswrapper[4730]: I0320 15:58:49.269960    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.67340561 podStartE2EDuration="3.269942769s" podCreationTimestamp="2026-03-20 15:58:46 +0000 UTC" firstStartedPulling="2026-03-20 15:58:47.38025871 +0000 UTC m=+1186.593630079" lastFinishedPulling="2026-03-20 15:58:47.976795869 +0000 UTC m=+1187.190167238" observedRunningTime="2026-03-20 15:58:49.269843117 +0000 UTC m=+1188.483214486" watchObservedRunningTime="2026-03-20 15:58:49.269942769 +0000 UTC m=+1188.483314138"
Mar 20 15:58:51 crc kubenswrapper[4730]: I0320 15:58:51.131073    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Mar 20 15:58:51 crc kubenswrapper[4730]: I0320 15:58:51.131460    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Mar 20 15:58:51 crc kubenswrapper[4730]: I0320 15:58:51.240155    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Mar 20 15:58:51 crc kubenswrapper[4730]: I0320 15:58:51.360203    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Mar 20 15:58:52 crc kubenswrapper[4730]: I0320 15:58:52.569517    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Mar 20 15:58:52 crc kubenswrapper[4730]: I0320 15:58:52.569858    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Mar 20 15:58:52 crc kubenswrapper[4730]: I0320 15:58:52.668551    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Mar 20 15:58:52 crc kubenswrapper[4730]: I0320 15:58:52.701564    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-648686d659-c5gtt"
Mar 20 15:58:52 crc kubenswrapper[4730]: I0320 15:58:52.707973    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Mar 20 15:58:52 crc kubenswrapper[4730]: I0320 15:58:52.777456    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74bcc47849-2r2xb"]
Mar 20 15:58:52 crc kubenswrapper[4730]: I0320 15:58:52.777761    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74bcc47849-2r2xb" podUID="a5e88dae-c3fd-456c-92c6-3bc143b5a399" containerName="dnsmasq-dns" containerID="cri-o://15b7107144e36c5c008efc65f73b4c3e5a1b51a4b8a473402aba592b6c7329e1" gracePeriod=10
Mar 20 15:58:52 crc kubenswrapper[4730]: I0320 15:58:52.923320    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-e285-account-create-update-6wk66"]
Mar 20 15:58:52 crc kubenswrapper[4730]: I0320 15:58:52.924544    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-e285-account-create-update-6wk66"
Mar 20 15:58:52 crc kubenswrapper[4730]: I0320 15:58:52.926196    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Mar 20 15:58:52 crc kubenswrapper[4730]: I0320 15:58:52.941430    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-e285-account-create-update-6wk66"]
Mar 20 15:58:52 crc kubenswrapper[4730]: I0320 15:58:52.987692    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-bjqvh"]
Mar 20 15:58:52 crc kubenswrapper[4730]: I0320 15:58:52.988895    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-bjqvh"
Mar 20 15:58:53 crc kubenswrapper[4730]: I0320 15:58:53.006648    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-bjqvh"]
Mar 20 15:58:53 crc kubenswrapper[4730]: I0320 15:58:53.050011    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16da1663-821b-4e05-95f6-df67e9fac962-operator-scripts\") pod \"glance-e285-account-create-update-6wk66\" (UID: \"16da1663-821b-4e05-95f6-df67e9fac962\") " pod="openstack/glance-e285-account-create-update-6wk66"
Mar 20 15:58:53 crc kubenswrapper[4730]: I0320 15:58:53.050110    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfcdj\" (UniqueName: \"kubernetes.io/projected/16da1663-821b-4e05-95f6-df67e9fac962-kube-api-access-xfcdj\") pod \"glance-e285-account-create-update-6wk66\" (UID: \"16da1663-821b-4e05-95f6-df67e9fac962\") " pod="openstack/glance-e285-account-create-update-6wk66"
Mar 20 15:58:53 crc kubenswrapper[4730]: I0320 15:58:53.151290    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17c0870c-17e5-4bd4-91b1-a8df134a4fbd-operator-scripts\") pod \"glance-db-create-bjqvh\" (UID: \"17c0870c-17e5-4bd4-91b1-a8df134a4fbd\") " pod="openstack/glance-db-create-bjqvh"
Mar 20 15:58:53 crc kubenswrapper[4730]: I0320 15:58:53.151360    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16da1663-821b-4e05-95f6-df67e9fac962-operator-scripts\") pod \"glance-e285-account-create-update-6wk66\" (UID: \"16da1663-821b-4e05-95f6-df67e9fac962\") " pod="openstack/glance-e285-account-create-update-6wk66"
Mar 20 15:58:53 crc kubenswrapper[4730]: I0320 15:58:53.151398    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfcdj\" (UniqueName: \"kubernetes.io/projected/16da1663-821b-4e05-95f6-df67e9fac962-kube-api-access-xfcdj\") pod \"glance-e285-account-create-update-6wk66\" (UID: \"16da1663-821b-4e05-95f6-df67e9fac962\") " pod="openstack/glance-e285-account-create-update-6wk66"
Mar 20 15:58:53 crc kubenswrapper[4730]: I0320 15:58:53.151477    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj7ws\" (UniqueName: \"kubernetes.io/projected/17c0870c-17e5-4bd4-91b1-a8df134a4fbd-kube-api-access-hj7ws\") pod \"glance-db-create-bjqvh\" (UID: \"17c0870c-17e5-4bd4-91b1-a8df134a4fbd\") " pod="openstack/glance-db-create-bjqvh"
Mar 20 15:58:53 crc kubenswrapper[4730]: I0320 15:58:53.152447    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16da1663-821b-4e05-95f6-df67e9fac962-operator-scripts\") pod \"glance-e285-account-create-update-6wk66\" (UID: \"16da1663-821b-4e05-95f6-df67e9fac962\") " pod="openstack/glance-e285-account-create-update-6wk66"
Mar 20 15:58:53 crc kubenswrapper[4730]: I0320 15:58:53.176153    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfcdj\" (UniqueName: \"kubernetes.io/projected/16da1663-821b-4e05-95f6-df67e9fac962-kube-api-access-xfcdj\") pod \"glance-e285-account-create-update-6wk66\" (UID: \"16da1663-821b-4e05-95f6-df67e9fac962\") " pod="openstack/glance-e285-account-create-update-6wk66"
Mar 20 15:58:53 crc kubenswrapper[4730]: I0320 15:58:53.244685    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-e285-account-create-update-6wk66"
Mar 20 15:58:53 crc kubenswrapper[4730]: I0320 15:58:53.253045    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17c0870c-17e5-4bd4-91b1-a8df134a4fbd-operator-scripts\") pod \"glance-db-create-bjqvh\" (UID: \"17c0870c-17e5-4bd4-91b1-a8df134a4fbd\") " pod="openstack/glance-db-create-bjqvh"
Mar 20 15:58:53 crc kubenswrapper[4730]: I0320 15:58:53.253175    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj7ws\" (UniqueName: \"kubernetes.io/projected/17c0870c-17e5-4bd4-91b1-a8df134a4fbd-kube-api-access-hj7ws\") pod \"glance-db-create-bjqvh\" (UID: \"17c0870c-17e5-4bd4-91b1-a8df134a4fbd\") " pod="openstack/glance-db-create-bjqvh"
Mar 20 15:58:53 crc kubenswrapper[4730]: I0320 15:58:53.253684    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17c0870c-17e5-4bd4-91b1-a8df134a4fbd-operator-scripts\") pod \"glance-db-create-bjqvh\" (UID: \"17c0870c-17e5-4bd4-91b1-a8df134a4fbd\") " pod="openstack/glance-db-create-bjqvh"
Mar 20 15:58:53 crc kubenswrapper[4730]: I0320 15:58:53.269995    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj7ws\" (UniqueName: \"kubernetes.io/projected/17c0870c-17e5-4bd4-91b1-a8df134a4fbd-kube-api-access-hj7ws\") pod \"glance-db-create-bjqvh\" (UID: \"17c0870c-17e5-4bd4-91b1-a8df134a4fbd\") " pod="openstack/glance-db-create-bjqvh"
Mar 20 15:58:53 crc kubenswrapper[4730]: I0320 15:58:53.285594    4730 generic.go:334] "Generic (PLEG): container finished" podID="a5e88dae-c3fd-456c-92c6-3bc143b5a399" containerID="15b7107144e36c5c008efc65f73b4c3e5a1b51a4b8a473402aba592b6c7329e1" exitCode=0
Mar 20 15:58:53 crc kubenswrapper[4730]: I0320 15:58:53.286517    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74bcc47849-2r2xb" event={"ID":"a5e88dae-c3fd-456c-92c6-3bc143b5a399","Type":"ContainerDied","Data":"15b7107144e36c5c008efc65f73b4c3e5a1b51a4b8a473402aba592b6c7329e1"}
Mar 20 15:58:53 crc kubenswrapper[4730]: I0320 15:58:53.310561    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-bjqvh"
Mar 20 15:58:53 crc kubenswrapper[4730]: I0320 15:58:53.437932    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Mar 20 15:58:53 crc kubenswrapper[4730]: I0320 15:58:53.595555    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74bcc47849-2r2xb" podUID="a5e88dae-c3fd-456c-92c6-3bc143b5a399" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.108:5353: connect: connection refused"
Mar 20 15:58:53 crc kubenswrapper[4730]: I0320 15:58:53.688889    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-e285-account-create-update-6wk66"]
Mar 20 15:58:53 crc kubenswrapper[4730]: W0320 15:58:53.700239    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16da1663_821b_4e05_95f6_df67e9fac962.slice/crio-8583b53c13d80cd0982656c0b75867dafc594f16bd85e802824264387cf98f7a WatchSource:0}: Error finding container 8583b53c13d80cd0982656c0b75867dafc594f16bd85e802824264387cf98f7a: Status 404 returned error can't find the container with id 8583b53c13d80cd0982656c0b75867dafc594f16bd85e802824264387cf98f7a
Mar 20 15:58:53 crc kubenswrapper[4730]: I0320 15:58:53.805870    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-bjqvh"]
Mar 20 15:58:53 crc kubenswrapper[4730]: W0320 15:58:53.813835    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17c0870c_17e5_4bd4_91b1_a8df134a4fbd.slice/crio-758d7b72caef7e5c055044d07ddf94f810e0aec3dcd8ba6e54797d6869ccee1f WatchSource:0}: Error finding container 758d7b72caef7e5c055044d07ddf94f810e0aec3dcd8ba6e54797d6869ccee1f: Status 404 returned error can't find the container with id 758d7b72caef7e5c055044d07ddf94f810e0aec3dcd8ba6e54797d6869ccee1f
Mar 20 15:58:53 crc kubenswrapper[4730]: I0320 15:58:53.989883    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-fvknw"]
Mar 20 15:58:53 crc kubenswrapper[4730]: I0320 15:58:53.991185    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-fvknw"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.004555    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-fvknw"]
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.016559    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74bcc47849-2r2xb"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.162588    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db41-account-create-update-x7l2w"]
Mar 20 15:58:54 crc kubenswrapper[4730]: E0320 15:58:54.162916    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e88dae-c3fd-456c-92c6-3bc143b5a399" containerName="dnsmasq-dns"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.162935    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e88dae-c3fd-456c-92c6-3bc143b5a399" containerName="dnsmasq-dns"
Mar 20 15:58:54 crc kubenswrapper[4730]: E0320 15:58:54.162969    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e88dae-c3fd-456c-92c6-3bc143b5a399" containerName="init"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.162976    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e88dae-c3fd-456c-92c6-3bc143b5a399" containerName="init"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.163117    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5e88dae-c3fd-456c-92c6-3bc143b5a399" containerName="dnsmasq-dns"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.163674    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db41-account-create-update-x7l2w"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.166060    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.183239    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5e88dae-c3fd-456c-92c6-3bc143b5a399-config\") pod \"a5e88dae-c3fd-456c-92c6-3bc143b5a399\" (UID: \"a5e88dae-c3fd-456c-92c6-3bc143b5a399\") "
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.183782    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db41-account-create-update-x7l2w"]
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.183926    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5e88dae-c3fd-456c-92c6-3bc143b5a399-dns-svc\") pod \"a5e88dae-c3fd-456c-92c6-3bc143b5a399\" (UID: \"a5e88dae-c3fd-456c-92c6-3bc143b5a399\") "
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.184276    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7d94g\" (UniqueName: \"kubernetes.io/projected/a5e88dae-c3fd-456c-92c6-3bc143b5a399-kube-api-access-7d94g\") pod \"a5e88dae-c3fd-456c-92c6-3bc143b5a399\" (UID: \"a5e88dae-c3fd-456c-92c6-3bc143b5a399\") "
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.184596    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfwtq\" (UniqueName: \"kubernetes.io/projected/ef40906b-a3dc-45b8-8bde-dd06eaaef85c-kube-api-access-vfwtq\") pod \"keystone-db-create-fvknw\" (UID: \"ef40906b-a3dc-45b8-8bde-dd06eaaef85c\") " pod="openstack/keystone-db-create-fvknw"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.184710    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef40906b-a3dc-45b8-8bde-dd06eaaef85c-operator-scripts\") pod \"keystone-db-create-fvknw\" (UID: \"ef40906b-a3dc-45b8-8bde-dd06eaaef85c\") " pod="openstack/keystone-db-create-fvknw"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.217432    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5e88dae-c3fd-456c-92c6-3bc143b5a399-kube-api-access-7d94g" (OuterVolumeSpecName: "kube-api-access-7d94g") pod "a5e88dae-c3fd-456c-92c6-3bc143b5a399" (UID: "a5e88dae-c3fd-456c-92c6-3bc143b5a399"). InnerVolumeSpecName "kube-api-access-7d94g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.255915    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-x4h5x"]
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.257621    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-x4h5x"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.264836    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5e88dae-c3fd-456c-92c6-3bc143b5a399-config" (OuterVolumeSpecName: "config") pod "a5e88dae-c3fd-456c-92c6-3bc143b5a399" (UID: "a5e88dae-c3fd-456c-92c6-3bc143b5a399"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.274340    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-x4h5x"]
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.285956    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c40f368e-f905-465b-9af0-b0ecb753de79-operator-scripts\") pod \"keystone-db41-account-create-update-x7l2w\" (UID: \"c40f368e-f905-465b-9af0-b0ecb753de79\") " pod="openstack/keystone-db41-account-create-update-x7l2w"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.286215    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22s8r\" (UniqueName: \"kubernetes.io/projected/c40f368e-f905-465b-9af0-b0ecb753de79-kube-api-access-22s8r\") pod \"keystone-db41-account-create-update-x7l2w\" (UID: \"c40f368e-f905-465b-9af0-b0ecb753de79\") " pod="openstack/keystone-db41-account-create-update-x7l2w"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.286284    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfwtq\" (UniqueName: \"kubernetes.io/projected/ef40906b-a3dc-45b8-8bde-dd06eaaef85c-kube-api-access-vfwtq\") pod \"keystone-db-create-fvknw\" (UID: \"ef40906b-a3dc-45b8-8bde-dd06eaaef85c\") " pod="openstack/keystone-db-create-fvknw"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.286353    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef40906b-a3dc-45b8-8bde-dd06eaaef85c-operator-scripts\") pod \"keystone-db-create-fvknw\" (UID: \"ef40906b-a3dc-45b8-8bde-dd06eaaef85c\") " pod="openstack/keystone-db-create-fvknw"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.286652    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5e88dae-c3fd-456c-92c6-3bc143b5a399-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.286669    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7d94g\" (UniqueName: \"kubernetes.io/projected/a5e88dae-c3fd-456c-92c6-3bc143b5a399-kube-api-access-7d94g\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.287219    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef40906b-a3dc-45b8-8bde-dd06eaaef85c-operator-scripts\") pod \"keystone-db-create-fvknw\" (UID: \"ef40906b-a3dc-45b8-8bde-dd06eaaef85c\") " pod="openstack/keystone-db-create-fvknw"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.288211    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5e88dae-c3fd-456c-92c6-3bc143b5a399-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a5e88dae-c3fd-456c-92c6-3bc143b5a399" (UID: "a5e88dae-c3fd-456c-92c6-3bc143b5a399"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.302767    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfwtq\" (UniqueName: \"kubernetes.io/projected/ef40906b-a3dc-45b8-8bde-dd06eaaef85c-kube-api-access-vfwtq\") pod \"keystone-db-create-fvknw\" (UID: \"ef40906b-a3dc-45b8-8bde-dd06eaaef85c\") " pod="openstack/keystone-db-create-fvknw"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.308474    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-bjqvh" event={"ID":"17c0870c-17e5-4bd4-91b1-a8df134a4fbd","Type":"ContainerStarted","Data":"30a5fc8a5ea71396f4de5cb5ef85143858b4f15e175e2aea0d88617f137cddad"}
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.308532    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-bjqvh" event={"ID":"17c0870c-17e5-4bd4-91b1-a8df134a4fbd","Type":"ContainerStarted","Data":"758d7b72caef7e5c055044d07ddf94f810e0aec3dcd8ba6e54797d6869ccee1f"}
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.310988    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-fvknw"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.315281    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74bcc47849-2r2xb"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.315275    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74bcc47849-2r2xb" event={"ID":"a5e88dae-c3fd-456c-92c6-3bc143b5a399","Type":"ContainerDied","Data":"179fd0dd8f56e799ace1a1c345f43d1901fddb9fcf8b79a809a5d214fa5edf4a"}
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.315430    4730 scope.go:117] "RemoveContainer" containerID="15b7107144e36c5c008efc65f73b4c3e5a1b51a4b8a473402aba592b6c7329e1"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.318727    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e285-account-create-update-6wk66" event={"ID":"16da1663-821b-4e05-95f6-df67e9fac962","Type":"ContainerStarted","Data":"67c549d0aa6a1c0db14f97c3aff414699de48b304aa4c5c416c420aae8bc31a7"}
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.318782    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e285-account-create-update-6wk66" event={"ID":"16da1663-821b-4e05-95f6-df67e9fac962","Type":"ContainerStarted","Data":"8583b53c13d80cd0982656c0b75867dafc594f16bd85e802824264387cf98f7a"}
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.330534    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-bjqvh" podStartSLOduration=2.33050688 podStartE2EDuration="2.33050688s" podCreationTimestamp="2026-03-20 15:58:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:58:54.32417628 +0000 UTC m=+1193.537547649" watchObservedRunningTime="2026-03-20 15:58:54.33050688 +0000 UTC m=+1193.543878249"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.358674    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-e285-account-create-update-6wk66" podStartSLOduration=2.358650989 podStartE2EDuration="2.358650989s" podCreationTimestamp="2026-03-20 15:58:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:58:54.347703558 +0000 UTC m=+1193.561074927" watchObservedRunningTime="2026-03-20 15:58:54.358650989 +0000 UTC m=+1193.572022358"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.386375    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-c5a1-account-create-update-hfdmn"]
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.388062    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c5a1-account-create-update-hfdmn"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.393961    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c40f368e-f905-465b-9af0-b0ecb753de79-operator-scripts\") pod \"keystone-db41-account-create-update-x7l2w\" (UID: \"c40f368e-f905-465b-9af0-b0ecb753de79\") " pod="openstack/keystone-db41-account-create-update-x7l2w"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.394374    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22s8r\" (UniqueName: \"kubernetes.io/projected/c40f368e-f905-465b-9af0-b0ecb753de79-kube-api-access-22s8r\") pod \"keystone-db41-account-create-update-x7l2w\" (UID: \"c40f368e-f905-465b-9af0-b0ecb753de79\") " pod="openstack/keystone-db41-account-create-update-x7l2w"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.394464    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3198c781-92f7-40f1-9b6e-ed5310febe0b-operator-scripts\") pod \"placement-db-create-x4h5x\" (UID: \"3198c781-92f7-40f1-9b6e-ed5310febe0b\") " pod="openstack/placement-db-create-x4h5x"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.394566    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-698g8\" (UniqueName: \"kubernetes.io/projected/3198c781-92f7-40f1-9b6e-ed5310febe0b-kube-api-access-698g8\") pod \"placement-db-create-x4h5x\" (UID: \"3198c781-92f7-40f1-9b6e-ed5310febe0b\") " pod="openstack/placement-db-create-x4h5x"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.394782    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.394852    4730 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5e88dae-c3fd-456c-92c6-3bc143b5a399-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.395882    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74bcc47849-2r2xb"]
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.397002    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c40f368e-f905-465b-9af0-b0ecb753de79-operator-scripts\") pod \"keystone-db41-account-create-update-x7l2w\" (UID: \"c40f368e-f905-465b-9af0-b0ecb753de79\") " pod="openstack/keystone-db41-account-create-update-x7l2w"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.399991    4730 scope.go:117] "RemoveContainer" containerID="9eec9013a24b2740a1eda4c33bad2a0fe15b131b43355b27b55b42be661249e4"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.414563    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c5a1-account-create-update-hfdmn"]
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.415532    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22s8r\" (UniqueName: \"kubernetes.io/projected/c40f368e-f905-465b-9af0-b0ecb753de79-kube-api-access-22s8r\") pod \"keystone-db41-account-create-update-x7l2w\" (UID: \"c40f368e-f905-465b-9af0-b0ecb753de79\") " pod="openstack/keystone-db41-account-create-update-x7l2w"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.427514    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74bcc47849-2r2xb"]
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.496200    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a532566c-ab86-4984-9212-1e48605d192b-operator-scripts\") pod \"placement-c5a1-account-create-update-hfdmn\" (UID: \"a532566c-ab86-4984-9212-1e48605d192b\") " pod="openstack/placement-c5a1-account-create-update-hfdmn"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.496290    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3198c781-92f7-40f1-9b6e-ed5310febe0b-operator-scripts\") pod \"placement-db-create-x4h5x\" (UID: \"3198c781-92f7-40f1-9b6e-ed5310febe0b\") " pod="openstack/placement-db-create-x4h5x"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.496364    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-698g8\" (UniqueName: \"kubernetes.io/projected/3198c781-92f7-40f1-9b6e-ed5310febe0b-kube-api-access-698g8\") pod \"placement-db-create-x4h5x\" (UID: \"3198c781-92f7-40f1-9b6e-ed5310febe0b\") " pod="openstack/placement-db-create-x4h5x"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.496429    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9qkh\" (UniqueName: \"kubernetes.io/projected/a532566c-ab86-4984-9212-1e48605d192b-kube-api-access-w9qkh\") pod \"placement-c5a1-account-create-update-hfdmn\" (UID: \"a532566c-ab86-4984-9212-1e48605d192b\") " pod="openstack/placement-c5a1-account-create-update-hfdmn"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.498044    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3198c781-92f7-40f1-9b6e-ed5310febe0b-operator-scripts\") pod \"placement-db-create-x4h5x\" (UID: \"3198c781-92f7-40f1-9b6e-ed5310febe0b\") " pod="openstack/placement-db-create-x4h5x"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.532922    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-698g8\" (UniqueName: \"kubernetes.io/projected/3198c781-92f7-40f1-9b6e-ed5310febe0b-kube-api-access-698g8\") pod \"placement-db-create-x4h5x\" (UID: \"3198c781-92f7-40f1-9b6e-ed5310febe0b\") " pod="openstack/placement-db-create-x4h5x"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.597599    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9qkh\" (UniqueName: \"kubernetes.io/projected/a532566c-ab86-4984-9212-1e48605d192b-kube-api-access-w9qkh\") pod \"placement-c5a1-account-create-update-hfdmn\" (UID: \"a532566c-ab86-4984-9212-1e48605d192b\") " pod="openstack/placement-c5a1-account-create-update-hfdmn"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.597722    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a532566c-ab86-4984-9212-1e48605d192b-operator-scripts\") pod \"placement-c5a1-account-create-update-hfdmn\" (UID: \"a532566c-ab86-4984-9212-1e48605d192b\") " pod="openstack/placement-c5a1-account-create-update-hfdmn"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.598616    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a532566c-ab86-4984-9212-1e48605d192b-operator-scripts\") pod \"placement-c5a1-account-create-update-hfdmn\" (UID: \"a532566c-ab86-4984-9212-1e48605d192b\") " pod="openstack/placement-c5a1-account-create-update-hfdmn"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.604991    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db41-account-create-update-x7l2w"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.611024    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-x4h5x"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.617378    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9qkh\" (UniqueName: \"kubernetes.io/projected/a532566c-ab86-4984-9212-1e48605d192b-kube-api-access-w9qkh\") pod \"placement-c5a1-account-create-update-hfdmn\" (UID: \"a532566c-ab86-4984-9212-1e48605d192b\") " pod="openstack/placement-c5a1-account-create-update-hfdmn"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.745780    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c5a1-account-create-update-hfdmn"
Mar 20 15:58:54 crc kubenswrapper[4730]: I0320 15:58:54.891781    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-fvknw"]
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.113070    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-x4h5x"]
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.122060    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db41-account-create-update-x7l2w"]
Mar 20 15:58:55 crc kubenswrapper[4730]: W0320 15:58:55.129367    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc40f368e_f905_465b_9af0_b0ecb753de79.slice/crio-085c52d92c419cf64b41d1ae79845132baa487a0824496f899efa452067532d7 WatchSource:0}: Error finding container 085c52d92c419cf64b41d1ae79845132baa487a0824496f899efa452067532d7: Status 404 returned error can't find the container with id 085c52d92c419cf64b41d1ae79845132baa487a0824496f899efa452067532d7
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.284184    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c5a1-account-create-update-hfdmn"]
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.321104    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6868c44cb9-4zxm5"]
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.322481    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6868c44cb9-4zxm5"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.374419    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6868c44cb9-4zxm5"]
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.392459    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-n9vdf"]
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.393495    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-n9vdf"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.409272    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db41-account-create-update-x7l2w" event={"ID":"c40f368e-f905-465b-9af0-b0ecb753de79","Type":"ContainerStarted","Data":"085c52d92c419cf64b41d1ae79845132baa487a0824496f899efa452067532d7"}
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.411987    4730 generic.go:334] "Generic (PLEG): container finished" podID="16da1663-821b-4e05-95f6-df67e9fac962" containerID="67c549d0aa6a1c0db14f97c3aff414699de48b304aa4c5c416c420aae8bc31a7" exitCode=0
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.412033    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e285-account-create-update-6wk66" event={"ID":"16da1663-821b-4e05-95f6-df67e9fac962","Type":"ContainerDied","Data":"67c549d0aa6a1c0db14f97c3aff414699de48b304aa4c5c416c420aae8bc31a7"}
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.413405    4730 generic.go:334] "Generic (PLEG): container finished" podID="35efb2c2-6521-4f6f-a350-a4dc537ecaf8" containerID="775e72f867e9b2d92cfca2b734aa3db0f978ba6095c698e181792a7a0058d4cd" exitCode=0
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.413440    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cdd7f" event={"ID":"35efb2c2-6521-4f6f-a350-a4dc537ecaf8","Type":"ContainerDied","Data":"775e72f867e9b2d92cfca2b734aa3db0f978ba6095c698e181792a7a0058d4cd"}
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.417040    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-x4h5x" event={"ID":"3198c781-92f7-40f1-9b6e-ed5310febe0b","Type":"ContainerStarted","Data":"848e9fa2112624c8fb8c6bec3d6893fc6a8dd666fb6f1795ec93e5b1d6b5bea5"}
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.422688    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3806a27b-4a0f-439b-8660-d9ccd4bb0618-config\") pod \"dnsmasq-dns-6868c44cb9-4zxm5\" (UID: \"3806a27b-4a0f-439b-8660-d9ccd4bb0618\") " pod="openstack/dnsmasq-dns-6868c44cb9-4zxm5"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.422847    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt2hc\" (UniqueName: \"kubernetes.io/projected/3806a27b-4a0f-439b-8660-d9ccd4bb0618-kube-api-access-lt2hc\") pod \"dnsmasq-dns-6868c44cb9-4zxm5\" (UID: \"3806a27b-4a0f-439b-8660-d9ccd4bb0618\") " pod="openstack/dnsmasq-dns-6868c44cb9-4zxm5"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.422902    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3806a27b-4a0f-439b-8660-d9ccd4bb0618-ovsdbserver-sb\") pod \"dnsmasq-dns-6868c44cb9-4zxm5\" (UID: \"3806a27b-4a0f-439b-8660-d9ccd4bb0618\") " pod="openstack/dnsmasq-dns-6868c44cb9-4zxm5"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.422991    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3806a27b-4a0f-439b-8660-d9ccd4bb0618-ovsdbserver-nb\") pod \"dnsmasq-dns-6868c44cb9-4zxm5\" (UID: \"3806a27b-4a0f-439b-8660-d9ccd4bb0618\") " pod="openstack/dnsmasq-dns-6868c44cb9-4zxm5"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.423034    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3806a27b-4a0f-439b-8660-d9ccd4bb0618-dns-svc\") pod \"dnsmasq-dns-6868c44cb9-4zxm5\" (UID: \"3806a27b-4a0f-439b-8660-d9ccd4bb0618\") " pod="openstack/dnsmasq-dns-6868c44cb9-4zxm5"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.426311    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fdd3845a-3723-438f-aa58-606451baed6c","Type":"ContainerStarted","Data":"ccc4d976b4160ab2263002a763830d5f5f68919c64d310c4f41b79be9631a6ea"}
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.437351    4730 generic.go:334] "Generic (PLEG): container finished" podID="17c0870c-17e5-4bd4-91b1-a8df134a4fbd" containerID="30a5fc8a5ea71396f4de5cb5ef85143858b4f15e175e2aea0d88617f137cddad" exitCode=0
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.437661    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-bjqvh" event={"ID":"17c0870c-17e5-4bd4-91b1-a8df134a4fbd","Type":"ContainerDied","Data":"30a5fc8a5ea71396f4de5cb5ef85143858b4f15e175e2aea0d88617f137cddad"}
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.457974    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fvknw" event={"ID":"ef40906b-a3dc-45b8-8bde-dd06eaaef85c","Type":"ContainerStarted","Data":"58cffe0b249055c3d576f3ea017f6ee1185d299a1b49ba134b1ea8fcf81d53bd"}
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.458021    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fvknw" event={"ID":"ef40906b-a3dc-45b8-8bde-dd06eaaef85c","Type":"ContainerStarted","Data":"53640fa059a665b1d15935fd34ffaf73050cf1ca9fa0e3068b812699863c68d2"}
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.472876    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-n9vdf"]
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.525923    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt2hc\" (UniqueName: \"kubernetes.io/projected/3806a27b-4a0f-439b-8660-d9ccd4bb0618-kube-api-access-lt2hc\") pod \"dnsmasq-dns-6868c44cb9-4zxm5\" (UID: \"3806a27b-4a0f-439b-8660-d9ccd4bb0618\") " pod="openstack/dnsmasq-dns-6868c44cb9-4zxm5"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.525990    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3806a27b-4a0f-439b-8660-d9ccd4bb0618-ovsdbserver-sb\") pod \"dnsmasq-dns-6868c44cb9-4zxm5\" (UID: \"3806a27b-4a0f-439b-8660-d9ccd4bb0618\") " pod="openstack/dnsmasq-dns-6868c44cb9-4zxm5"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.526038    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5hhb\" (UniqueName: \"kubernetes.io/projected/c7b436cd-ff29-4a9f-9e58-4c8760b1e012-kube-api-access-v5hhb\") pod \"watcher-db-create-n9vdf\" (UID: \"c7b436cd-ff29-4a9f-9e58-4c8760b1e012\") " pod="openstack/watcher-db-create-n9vdf"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.526085    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3806a27b-4a0f-439b-8660-d9ccd4bb0618-ovsdbserver-nb\") pod \"dnsmasq-dns-6868c44cb9-4zxm5\" (UID: \"3806a27b-4a0f-439b-8660-d9ccd4bb0618\") " pod="openstack/dnsmasq-dns-6868c44cb9-4zxm5"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.526111    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7b436cd-ff29-4a9f-9e58-4c8760b1e012-operator-scripts\") pod \"watcher-db-create-n9vdf\" (UID: \"c7b436cd-ff29-4a9f-9e58-4c8760b1e012\") " pod="openstack/watcher-db-create-n9vdf"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.526136    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3806a27b-4a0f-439b-8660-d9ccd4bb0618-dns-svc\") pod \"dnsmasq-dns-6868c44cb9-4zxm5\" (UID: \"3806a27b-4a0f-439b-8660-d9ccd4bb0618\") " pod="openstack/dnsmasq-dns-6868c44cb9-4zxm5"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.526185    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3806a27b-4a0f-439b-8660-d9ccd4bb0618-config\") pod \"dnsmasq-dns-6868c44cb9-4zxm5\" (UID: \"3806a27b-4a0f-439b-8660-d9ccd4bb0618\") " pod="openstack/dnsmasq-dns-6868c44cb9-4zxm5"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.527069    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3806a27b-4a0f-439b-8660-d9ccd4bb0618-config\") pod \"dnsmasq-dns-6868c44cb9-4zxm5\" (UID: \"3806a27b-4a0f-439b-8660-d9ccd4bb0618\") " pod="openstack/dnsmasq-dns-6868c44cb9-4zxm5"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.528432    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3806a27b-4a0f-439b-8660-d9ccd4bb0618-ovsdbserver-sb\") pod \"dnsmasq-dns-6868c44cb9-4zxm5\" (UID: \"3806a27b-4a0f-439b-8660-d9ccd4bb0618\") " pod="openstack/dnsmasq-dns-6868c44cb9-4zxm5"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.528931    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3806a27b-4a0f-439b-8660-d9ccd4bb0618-ovsdbserver-nb\") pod \"dnsmasq-dns-6868c44cb9-4zxm5\" (UID: \"3806a27b-4a0f-439b-8660-d9ccd4bb0618\") " pod="openstack/dnsmasq-dns-6868c44cb9-4zxm5"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.528967    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3806a27b-4a0f-439b-8660-d9ccd4bb0618-dns-svc\") pod \"dnsmasq-dns-6868c44cb9-4zxm5\" (UID: \"3806a27b-4a0f-439b-8660-d9ccd4bb0618\") " pod="openstack/dnsmasq-dns-6868c44cb9-4zxm5"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.558639    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5e88dae-c3fd-456c-92c6-3bc143b5a399" path="/var/lib/kubelet/pods/a5e88dae-c3fd-456c-92c6-3bc143b5a399/volumes"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.571399    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt2hc\" (UniqueName: \"kubernetes.io/projected/3806a27b-4a0f-439b-8660-d9ccd4bb0618-kube-api-access-lt2hc\") pod \"dnsmasq-dns-6868c44cb9-4zxm5\" (UID: \"3806a27b-4a0f-439b-8660-d9ccd4bb0618\") " pod="openstack/dnsmasq-dns-6868c44cb9-4zxm5"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.587534    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-87f2-account-create-update-lblc4"]
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.589002    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-87f2-account-create-update-lblc4"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.592993    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.635999    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7b436cd-ff29-4a9f-9e58-4c8760b1e012-operator-scripts\") pod \"watcher-db-create-n9vdf\" (UID: \"c7b436cd-ff29-4a9f-9e58-4c8760b1e012\") " pod="openstack/watcher-db-create-n9vdf"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.636164    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5hhb\" (UniqueName: \"kubernetes.io/projected/c7b436cd-ff29-4a9f-9e58-4c8760b1e012-kube-api-access-v5hhb\") pod \"watcher-db-create-n9vdf\" (UID: \"c7b436cd-ff29-4a9f-9e58-4c8760b1e012\") " pod="openstack/watcher-db-create-n9vdf"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.638000    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7b436cd-ff29-4a9f-9e58-4c8760b1e012-operator-scripts\") pod \"watcher-db-create-n9vdf\" (UID: \"c7b436cd-ff29-4a9f-9e58-4c8760b1e012\") " pod="openstack/watcher-db-create-n9vdf"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.641352    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-87f2-account-create-update-lblc4"]
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.689997    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5hhb\" (UniqueName: \"kubernetes.io/projected/c7b436cd-ff29-4a9f-9e58-4c8760b1e012-kube-api-access-v5hhb\") pod \"watcher-db-create-n9vdf\" (UID: \"c7b436cd-ff29-4a9f-9e58-4c8760b1e012\") " pod="openstack/watcher-db-create-n9vdf"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.708653    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6868c44cb9-4zxm5"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.734532    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-fvknw" podStartSLOduration=2.7345098009999997 podStartE2EDuration="2.734509801s" podCreationTimestamp="2026-03-20 15:58:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:58:55.581433491 +0000 UTC m=+1194.794804860" watchObservedRunningTime="2026-03-20 15:58:55.734509801 +0000 UTC m=+1194.947881170"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.745662    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhx74\" (UniqueName: \"kubernetes.io/projected/a132fe19-9294-49c6-9b1e-fe3eed7f4bae-kube-api-access-qhx74\") pod \"watcher-87f2-account-create-update-lblc4\" (UID: \"a132fe19-9294-49c6-9b1e-fe3eed7f4bae\") " pod="openstack/watcher-87f2-account-create-update-lblc4"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.745708    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a132fe19-9294-49c6-9b1e-fe3eed7f4bae-operator-scripts\") pod \"watcher-87f2-account-create-update-lblc4\" (UID: \"a132fe19-9294-49c6-9b1e-fe3eed7f4bae\") " pod="openstack/watcher-87f2-account-create-update-lblc4"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.752616    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-n9vdf"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.849219    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhx74\" (UniqueName: \"kubernetes.io/projected/a132fe19-9294-49c6-9b1e-fe3eed7f4bae-kube-api-access-qhx74\") pod \"watcher-87f2-account-create-update-lblc4\" (UID: \"a132fe19-9294-49c6-9b1e-fe3eed7f4bae\") " pod="openstack/watcher-87f2-account-create-update-lblc4"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.849267    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a132fe19-9294-49c6-9b1e-fe3eed7f4bae-operator-scripts\") pod \"watcher-87f2-account-create-update-lblc4\" (UID: \"a132fe19-9294-49c6-9b1e-fe3eed7f4bae\") " pod="openstack/watcher-87f2-account-create-update-lblc4"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.850023    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a132fe19-9294-49c6-9b1e-fe3eed7f4bae-operator-scripts\") pod \"watcher-87f2-account-create-update-lblc4\" (UID: \"a132fe19-9294-49c6-9b1e-fe3eed7f4bae\") " pod="openstack/watcher-87f2-account-create-update-lblc4"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.878341    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhx74\" (UniqueName: \"kubernetes.io/projected/a132fe19-9294-49c6-9b1e-fe3eed7f4bae-kube-api-access-qhx74\") pod \"watcher-87f2-account-create-update-lblc4\" (UID: \"a132fe19-9294-49c6-9b1e-fe3eed7f4bae\") " pod="openstack/watcher-87f2-account-create-update-lblc4"
Mar 20 15:58:55 crc kubenswrapper[4730]: I0320 15:58:55.980483    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-87f2-account-create-update-lblc4"
Mar 20 15:58:56 crc kubenswrapper[4730]: E0320 15:58:56.122672    4730 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef40906b_a3dc_45b8_8bde_dd06eaaef85c.slice/crio-conmon-58cffe0b249055c3d576f3ea017f6ee1185d299a1b49ba134b1ea8fcf81d53bd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef40906b_a3dc_45b8_8bde_dd06eaaef85c.slice/crio-58cffe0b249055c3d576f3ea017f6ee1185d299a1b49ba134b1ea8fcf81d53bd.scope\": RecentStats: unable to find data in memory cache]"
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.288982    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6868c44cb9-4zxm5"]
Mar 20 15:58:56 crc kubenswrapper[4730]: W0320 15:58:56.303691    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3806a27b_4a0f_439b_8660_d9ccd4bb0618.slice/crio-1d80c1a200a2b02176c3920aea1d4ac3bcd064cd7f985eb732053050a5d3edc7 WatchSource:0}: Error finding container 1d80c1a200a2b02176c3920aea1d4ac3bcd064cd7f985eb732053050a5d3edc7: Status 404 returned error can't find the container with id 1d80c1a200a2b02176c3920aea1d4ac3bcd064cd7f985eb732053050a5d3edc7
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.391989    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-n9vdf"]
Mar 20 15:58:56 crc kubenswrapper[4730]: W0320 15:58:56.401108    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7b436cd_ff29_4a9f_9e58_4c8760b1e012.slice/crio-46623510ee8353ece4203ec113c421119597c40cda4c774ba80d9347a1c73413 WatchSource:0}: Error finding container 46623510ee8353ece4203ec113c421119597c40cda4c774ba80d9347a1c73413: Status 404 returned error can't find the container with id 46623510ee8353ece4203ec113c421119597c40cda4c774ba80d9347a1c73413
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.470029    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6868c44cb9-4zxm5" event={"ID":"3806a27b-4a0f-439b-8660-d9ccd4bb0618","Type":"ContainerStarted","Data":"1d80c1a200a2b02176c3920aea1d4ac3bcd064cd7f985eb732053050a5d3edc7"}
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.473542    4730 generic.go:334] "Generic (PLEG): container finished" podID="a532566c-ab86-4984-9212-1e48605d192b" containerID="a8ebba1aa3aefe2f2a84695ac23d42f8c9788cfc46f63bc2e9dead733c7274ec" exitCode=0
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.473641    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c5a1-account-create-update-hfdmn" event={"ID":"a532566c-ab86-4984-9212-1e48605d192b","Type":"ContainerDied","Data":"a8ebba1aa3aefe2f2a84695ac23d42f8c9788cfc46f63bc2e9dead733c7274ec"}
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.473719    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c5a1-account-create-update-hfdmn" event={"ID":"a532566c-ab86-4984-9212-1e48605d192b","Type":"ContainerStarted","Data":"292856fca4df62efa63cfc164ccbb4f784eb6e19218a746589e6b7b1c3a0dd78"}
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.476184    4730 generic.go:334] "Generic (PLEG): container finished" podID="3198c781-92f7-40f1-9b6e-ed5310febe0b" containerID="1742d4d5f625e20265689269ef8d4a8b9f9546ddd3978d31dffe002e4353d662" exitCode=0
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.476302    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-x4h5x" event={"ID":"3198c781-92f7-40f1-9b6e-ed5310febe0b","Type":"ContainerDied","Data":"1742d4d5f625e20265689269ef8d4a8b9f9546ddd3978d31dffe002e4353d662"}
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.482346    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cdd7f" event={"ID":"35efb2c2-6521-4f6f-a350-a4dc537ecaf8","Type":"ContainerStarted","Data":"738343ef117f1c9c96aec25170eaeff222767a5338bf169b41385a96e5114518"}
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.490713    4730 generic.go:334] "Generic (PLEG): container finished" podID="ef40906b-a3dc-45b8-8bde-dd06eaaef85c" containerID="58cffe0b249055c3d576f3ea017f6ee1185d299a1b49ba134b1ea8fcf81d53bd" exitCode=0
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.490749    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fvknw" event={"ID":"ef40906b-a3dc-45b8-8bde-dd06eaaef85c","Type":"ContainerDied","Data":"58cffe0b249055c3d576f3ea017f6ee1185d299a1b49ba134b1ea8fcf81d53bd"}
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.493332    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-n9vdf" event={"ID":"c7b436cd-ff29-4a9f-9e58-4c8760b1e012","Type":"ContainerStarted","Data":"46623510ee8353ece4203ec113c421119597c40cda4c774ba80d9347a1c73413"}
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.496397    4730 generic.go:334] "Generic (PLEG): container finished" podID="c40f368e-f905-465b-9af0-b0ecb753de79" containerID="b503fc415bca6f276d3faa0fabe6ea4e17e93d2815b320d70f62ffa635dc90fc" exitCode=0
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.496496    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db41-account-create-update-x7l2w" event={"ID":"c40f368e-f905-465b-9af0-b0ecb753de79","Type":"ContainerDied","Data":"b503fc415bca6f276d3faa0fabe6ea4e17e93d2815b320d70f62ffa635dc90fc"}
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.586261    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-87f2-account-create-update-lblc4"]
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.717732    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.745853    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.751045    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.751214    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.751343    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.751462    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-6gzpt"
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.758925    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.782855    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2c9def6e-27a0-4543-8d3c-07b3e4005b33-lock\") pod \"swift-storage-0\" (UID: \"2c9def6e-27a0-4543-8d3c-07b3e4005b33\") " pod="openstack/swift-storage-0"
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.782991    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c9def6e-27a0-4543-8d3c-07b3e4005b33-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"2c9def6e-27a0-4543-8d3c-07b3e4005b33\") " pod="openstack/swift-storage-0"
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.783021    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktk6f\" (UniqueName: \"kubernetes.io/projected/2c9def6e-27a0-4543-8d3c-07b3e4005b33-kube-api-access-ktk6f\") pod \"swift-storage-0\" (UID: \"2c9def6e-27a0-4543-8d3c-07b3e4005b33\") " pod="openstack/swift-storage-0"
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.783054    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"2c9def6e-27a0-4543-8d3c-07b3e4005b33\") " pod="openstack/swift-storage-0"
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.783097    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2c9def6e-27a0-4543-8d3c-07b3e4005b33-cache\") pod \"swift-storage-0\" (UID: \"2c9def6e-27a0-4543-8d3c-07b3e4005b33\") " pod="openstack/swift-storage-0"
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.783789    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c9def6e-27a0-4543-8d3c-07b3e4005b33-etc-swift\") pod \"swift-storage-0\" (UID: \"2c9def6e-27a0-4543-8d3c-07b3e4005b33\") " pod="openstack/swift-storage-0"
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.888223    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2c9def6e-27a0-4543-8d3c-07b3e4005b33-cache\") pod \"swift-storage-0\" (UID: \"2c9def6e-27a0-4543-8d3c-07b3e4005b33\") " pod="openstack/swift-storage-0"
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.888364    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c9def6e-27a0-4543-8d3c-07b3e4005b33-etc-swift\") pod \"swift-storage-0\" (UID: \"2c9def6e-27a0-4543-8d3c-07b3e4005b33\") " pod="openstack/swift-storage-0"
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.888424    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2c9def6e-27a0-4543-8d3c-07b3e4005b33-lock\") pod \"swift-storage-0\" (UID: \"2c9def6e-27a0-4543-8d3c-07b3e4005b33\") " pod="openstack/swift-storage-0"
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.888527    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c9def6e-27a0-4543-8d3c-07b3e4005b33-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"2c9def6e-27a0-4543-8d3c-07b3e4005b33\") " pod="openstack/swift-storage-0"
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.888552    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktk6f\" (UniqueName: \"kubernetes.io/projected/2c9def6e-27a0-4543-8d3c-07b3e4005b33-kube-api-access-ktk6f\") pod \"swift-storage-0\" (UID: \"2c9def6e-27a0-4543-8d3c-07b3e4005b33\") " pod="openstack/swift-storage-0"
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.888596    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"2c9def6e-27a0-4543-8d3c-07b3e4005b33\") " pod="openstack/swift-storage-0"
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.889126    4730 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"2c9def6e-27a0-4543-8d3c-07b3e4005b33\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/swift-storage-0"
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.893733    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2c9def6e-27a0-4543-8d3c-07b3e4005b33-lock\") pod \"swift-storage-0\" (UID: \"2c9def6e-27a0-4543-8d3c-07b3e4005b33\") " pod="openstack/swift-storage-0"
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.889955    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2c9def6e-27a0-4543-8d3c-07b3e4005b33-cache\") pod \"swift-storage-0\" (UID: \"2c9def6e-27a0-4543-8d3c-07b3e4005b33\") " pod="openstack/swift-storage-0"
Mar 20 15:58:56 crc kubenswrapper[4730]: E0320 15:58:56.893879    4730 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 20 15:58:56 crc kubenswrapper[4730]: E0320 15:58:56.893893    4730 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 20 15:58:56 crc kubenswrapper[4730]: E0320 15:58:56.894000    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c9def6e-27a0-4543-8d3c-07b3e4005b33-etc-swift podName:2c9def6e-27a0-4543-8d3c-07b3e4005b33 nodeName:}" failed. No retries permitted until 2026-03-20 15:58:57.393966412 +0000 UTC m=+1196.607337781 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2c9def6e-27a0-4543-8d3c-07b3e4005b33-etc-swift") pod "swift-storage-0" (UID: "2c9def6e-27a0-4543-8d3c-07b3e4005b33") : configmap "swift-ring-files" not found
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.924709    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktk6f\" (UniqueName: \"kubernetes.io/projected/2c9def6e-27a0-4543-8d3c-07b3e4005b33-kube-api-access-ktk6f\") pod \"swift-storage-0\" (UID: \"2c9def6e-27a0-4543-8d3c-07b3e4005b33\") " pod="openstack/swift-storage-0"
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.925039    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c9def6e-27a0-4543-8d3c-07b3e4005b33-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"2c9def6e-27a0-4543-8d3c-07b3e4005b33\") " pod="openstack/swift-storage-0"
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.949701    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"2c9def6e-27a0-4543-8d3c-07b3e4005b33\") " pod="openstack/swift-storage-0"
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.976499    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-e285-account-create-update-6wk66"
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.979500    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-bjqvh"
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.990646    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfcdj\" (UniqueName: \"kubernetes.io/projected/16da1663-821b-4e05-95f6-df67e9fac962-kube-api-access-xfcdj\") pod \"16da1663-821b-4e05-95f6-df67e9fac962\" (UID: \"16da1663-821b-4e05-95f6-df67e9fac962\") "
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.990828    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16da1663-821b-4e05-95f6-df67e9fac962-operator-scripts\") pod \"16da1663-821b-4e05-95f6-df67e9fac962\" (UID: \"16da1663-821b-4e05-95f6-df67e9fac962\") "
Mar 20 15:58:56 crc kubenswrapper[4730]: I0320 15:58:56.993598    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16da1663-821b-4e05-95f6-df67e9fac962-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "16da1663-821b-4e05-95f6-df67e9fac962" (UID: "16da1663-821b-4e05-95f6-df67e9fac962"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:58:57 crc kubenswrapper[4730]: I0320 15:58:57.091640    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17c0870c-17e5-4bd4-91b1-a8df134a4fbd-operator-scripts\") pod \"17c0870c-17e5-4bd4-91b1-a8df134a4fbd\" (UID: \"17c0870c-17e5-4bd4-91b1-a8df134a4fbd\") "
Mar 20 15:58:57 crc kubenswrapper[4730]: I0320 15:58:57.091760    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hj7ws\" (UniqueName: \"kubernetes.io/projected/17c0870c-17e5-4bd4-91b1-a8df134a4fbd-kube-api-access-hj7ws\") pod \"17c0870c-17e5-4bd4-91b1-a8df134a4fbd\" (UID: \"17c0870c-17e5-4bd4-91b1-a8df134a4fbd\") "
Mar 20 15:58:57 crc kubenswrapper[4730]: I0320 15:58:57.092043    4730 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16da1663-821b-4e05-95f6-df67e9fac962-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:57 crc kubenswrapper[4730]: I0320 15:58:57.092431    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17c0870c-17e5-4bd4-91b1-a8df134a4fbd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "17c0870c-17e5-4bd4-91b1-a8df134a4fbd" (UID: "17c0870c-17e5-4bd4-91b1-a8df134a4fbd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:58:57 crc kubenswrapper[4730]: I0320 15:58:57.121569    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16da1663-821b-4e05-95f6-df67e9fac962-kube-api-access-xfcdj" (OuterVolumeSpecName: "kube-api-access-xfcdj") pod "16da1663-821b-4e05-95f6-df67e9fac962" (UID: "16da1663-821b-4e05-95f6-df67e9fac962"). InnerVolumeSpecName "kube-api-access-xfcdj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:58:57 crc kubenswrapper[4730]: I0320 15:58:57.123450    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17c0870c-17e5-4bd4-91b1-a8df134a4fbd-kube-api-access-hj7ws" (OuterVolumeSpecName: "kube-api-access-hj7ws") pod "17c0870c-17e5-4bd4-91b1-a8df134a4fbd" (UID: "17c0870c-17e5-4bd4-91b1-a8df134a4fbd"). InnerVolumeSpecName "kube-api-access-hj7ws". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:58:57 crc kubenswrapper[4730]: I0320 15:58:57.196087    4730 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17c0870c-17e5-4bd4-91b1-a8df134a4fbd-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:57 crc kubenswrapper[4730]: I0320 15:58:57.196442    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfcdj\" (UniqueName: \"kubernetes.io/projected/16da1663-821b-4e05-95f6-df67e9fac962-kube-api-access-xfcdj\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:57 crc kubenswrapper[4730]: I0320 15:58:57.196465    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hj7ws\" (UniqueName: \"kubernetes.io/projected/17c0870c-17e5-4bd4-91b1-a8df134a4fbd-kube-api-access-hj7ws\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:57 crc kubenswrapper[4730]: I0320 15:58:57.400412    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c9def6e-27a0-4543-8d3c-07b3e4005b33-etc-swift\") pod \"swift-storage-0\" (UID: \"2c9def6e-27a0-4543-8d3c-07b3e4005b33\") " pod="openstack/swift-storage-0"
Mar 20 15:58:57 crc kubenswrapper[4730]: E0320 15:58:57.400715    4730 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 20 15:58:57 crc kubenswrapper[4730]: E0320 15:58:57.400733    4730 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 20 15:58:57 crc kubenswrapper[4730]: E0320 15:58:57.400781    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c9def6e-27a0-4543-8d3c-07b3e4005b33-etc-swift podName:2c9def6e-27a0-4543-8d3c-07b3e4005b33 nodeName:}" failed. No retries permitted until 2026-03-20 15:58:58.400765301 +0000 UTC m=+1197.614136670 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2c9def6e-27a0-4543-8d3c-07b3e4005b33-etc-swift") pod "swift-storage-0" (UID: "2c9def6e-27a0-4543-8d3c-07b3e4005b33") : configmap "swift-ring-files" not found
Mar 20 15:58:57 crc kubenswrapper[4730]: I0320 15:58:57.508238    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-bjqvh" event={"ID":"17c0870c-17e5-4bd4-91b1-a8df134a4fbd","Type":"ContainerDied","Data":"758d7b72caef7e5c055044d07ddf94f810e0aec3dcd8ba6e54797d6869ccee1f"}
Mar 20 15:58:57 crc kubenswrapper[4730]: I0320 15:58:57.508298    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="758d7b72caef7e5c055044d07ddf94f810e0aec3dcd8ba6e54797d6869ccee1f"
Mar 20 15:58:57 crc kubenswrapper[4730]: I0320 15:58:57.508365    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-bjqvh"
Mar 20 15:58:57 crc kubenswrapper[4730]: I0320 15:58:57.511766    4730 generic.go:334] "Generic (PLEG): container finished" podID="a132fe19-9294-49c6-9b1e-fe3eed7f4bae" containerID="ae717d458b43c41d279b4f17419574a7ba6d139ccd8581e792b75559eb5cba0c" exitCode=0
Mar 20 15:58:57 crc kubenswrapper[4730]: I0320 15:58:57.511829    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-87f2-account-create-update-lblc4" event={"ID":"a132fe19-9294-49c6-9b1e-fe3eed7f4bae","Type":"ContainerDied","Data":"ae717d458b43c41d279b4f17419574a7ba6d139ccd8581e792b75559eb5cba0c"}
Mar 20 15:58:57 crc kubenswrapper[4730]: I0320 15:58:57.511857    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-87f2-account-create-update-lblc4" event={"ID":"a132fe19-9294-49c6-9b1e-fe3eed7f4bae","Type":"ContainerStarted","Data":"10d5a54946b4bf571d21b82b12cd1c4555f94ac55a2e76b8dacd79b2d2c7f077"}
Mar 20 15:58:57 crc kubenswrapper[4730]: I0320 15:58:57.513908    4730 generic.go:334] "Generic (PLEG): container finished" podID="c7b436cd-ff29-4a9f-9e58-4c8760b1e012" containerID="2857b9eca0093dd961d059808f0936df2d938583bfd861b80e003896a914c165" exitCode=0
Mar 20 15:58:57 crc kubenswrapper[4730]: I0320 15:58:57.513950    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-n9vdf" event={"ID":"c7b436cd-ff29-4a9f-9e58-4c8760b1e012","Type":"ContainerDied","Data":"2857b9eca0093dd961d059808f0936df2d938583bfd861b80e003896a914c165"}
Mar 20 15:58:57 crc kubenswrapper[4730]: I0320 15:58:57.520519    4730 generic.go:334] "Generic (PLEG): container finished" podID="3806a27b-4a0f-439b-8660-d9ccd4bb0618" containerID="065e2b37d1a5fb4fd864e13cde37d04579827706198a7e6a7ccea3b914128eb1" exitCode=0
Mar 20 15:58:57 crc kubenswrapper[4730]: I0320 15:58:57.520591    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6868c44cb9-4zxm5" event={"ID":"3806a27b-4a0f-439b-8660-d9ccd4bb0618","Type":"ContainerDied","Data":"065e2b37d1a5fb4fd864e13cde37d04579827706198a7e6a7ccea3b914128eb1"}
Mar 20 15:58:57 crc kubenswrapper[4730]: I0320 15:58:57.524219    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-e285-account-create-update-6wk66"
Mar 20 15:58:57 crc kubenswrapper[4730]: I0320 15:58:57.524261    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e285-account-create-update-6wk66" event={"ID":"16da1663-821b-4e05-95f6-df67e9fac962","Type":"ContainerDied","Data":"8583b53c13d80cd0982656c0b75867dafc594f16bd85e802824264387cf98f7a"}
Mar 20 15:58:57 crc kubenswrapper[4730]: I0320 15:58:57.524322    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8583b53c13d80cd0982656c0b75867dafc594f16bd85e802824264387cf98f7a"
Mar 20 15:58:57 crc kubenswrapper[4730]: I0320 15:58:57.530083    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cdd7f" event={"ID":"35efb2c2-6521-4f6f-a350-a4dc537ecaf8","Type":"ContainerStarted","Data":"d215dd2e91a84b3112500d4694f1541097820fe0660149159734b31a6dacf3b5"}
Mar 20 15:58:57 crc kubenswrapper[4730]: I0320 15:58:57.530653    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-cdd7f"
Mar 20 15:58:57 crc kubenswrapper[4730]: I0320 15:58:57.530857    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-cdd7f"
Mar 20 15:58:57 crc kubenswrapper[4730]: I0320 15:58:57.564418    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fdd3845a-3723-438f-aa58-606451baed6c","Type":"ContainerStarted","Data":"2e07902c18ce4590f197ddee6088f8273a8da0ef0fa191f28dc4116786b4c25f"}
Mar 20 15:58:57 crc kubenswrapper[4730]: I0320 15:58:57.641327    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-cdd7f" podStartSLOduration=8.583908846 podStartE2EDuration="39.641306535s" podCreationTimestamp="2026-03-20 15:58:18 +0000 UTC" firstStartedPulling="2026-03-20 15:58:23.196024711 +0000 UTC m=+1162.409396080" lastFinishedPulling="2026-03-20 15:58:54.2534224 +0000 UTC m=+1193.466793769" observedRunningTime="2026-03-20 15:58:57.626086573 +0000 UTC m=+1196.839457942" watchObservedRunningTime="2026-03-20 15:58:57.641306535 +0000 UTC m=+1196.854677905"
Mar 20 15:58:57 crc kubenswrapper[4730]: I0320 15:58:57.987853    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-x4h5x"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.127866    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-698g8\" (UniqueName: \"kubernetes.io/projected/3198c781-92f7-40f1-9b6e-ed5310febe0b-kube-api-access-698g8\") pod \"3198c781-92f7-40f1-9b6e-ed5310febe0b\" (UID: \"3198c781-92f7-40f1-9b6e-ed5310febe0b\") "
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.128063    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3198c781-92f7-40f1-9b6e-ed5310febe0b-operator-scripts\") pod \"3198c781-92f7-40f1-9b6e-ed5310febe0b\" (UID: \"3198c781-92f7-40f1-9b6e-ed5310febe0b\") "
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.129188    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3198c781-92f7-40f1-9b6e-ed5310febe0b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3198c781-92f7-40f1-9b6e-ed5310febe0b" (UID: "3198c781-92f7-40f1-9b6e-ed5310febe0b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.141414    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3198c781-92f7-40f1-9b6e-ed5310febe0b-kube-api-access-698g8" (OuterVolumeSpecName: "kube-api-access-698g8") pod "3198c781-92f7-40f1-9b6e-ed5310febe0b" (UID: "3198c781-92f7-40f1-9b6e-ed5310febe0b"). InnerVolumeSpecName "kube-api-access-698g8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.204660    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-mkvv4"]
Mar 20 15:58:58 crc kubenswrapper[4730]: E0320 15:58:58.204957    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3198c781-92f7-40f1-9b6e-ed5310febe0b" containerName="mariadb-database-create"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.204970    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="3198c781-92f7-40f1-9b6e-ed5310febe0b" containerName="mariadb-database-create"
Mar 20 15:58:58 crc kubenswrapper[4730]: E0320 15:58:58.204986    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17c0870c-17e5-4bd4-91b1-a8df134a4fbd" containerName="mariadb-database-create"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.204992    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="17c0870c-17e5-4bd4-91b1-a8df134a4fbd" containerName="mariadb-database-create"
Mar 20 15:58:58 crc kubenswrapper[4730]: E0320 15:58:58.205005    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16da1663-821b-4e05-95f6-df67e9fac962" containerName="mariadb-account-create-update"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.205011    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="16da1663-821b-4e05-95f6-df67e9fac962" containerName="mariadb-account-create-update"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.205181    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="3198c781-92f7-40f1-9b6e-ed5310febe0b" containerName="mariadb-database-create"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.205194    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="17c0870c-17e5-4bd4-91b1-a8df134a4fbd" containerName="mariadb-database-create"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.205209    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="16da1663-821b-4e05-95f6-df67e9fac962" containerName="mariadb-account-create-update"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.205763    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-mkvv4"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.212524    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-b8g88"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.212529    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.220982    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-mkvv4"]
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.230558    4730 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3198c781-92f7-40f1-9b6e-ed5310febe0b-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.230599    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-698g8\" (UniqueName: \"kubernetes.io/projected/3198c781-92f7-40f1-9b6e-ed5310febe0b-kube-api-access-698g8\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.283773    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-fvknw"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.307789    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c5a1-account-create-update-hfdmn"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.313709    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db41-account-create-update-x7l2w"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.332924    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef40906b-a3dc-45b8-8bde-dd06eaaef85c-operator-scripts\") pod \"ef40906b-a3dc-45b8-8bde-dd06eaaef85c\" (UID: \"ef40906b-a3dc-45b8-8bde-dd06eaaef85c\") "
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.333087    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfwtq\" (UniqueName: \"kubernetes.io/projected/ef40906b-a3dc-45b8-8bde-dd06eaaef85c-kube-api-access-vfwtq\") pod \"ef40906b-a3dc-45b8-8bde-dd06eaaef85c\" (UID: \"ef40906b-a3dc-45b8-8bde-dd06eaaef85c\") "
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.333329    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/37dd8777-c196-4db2-af7a-5560a939e02c-db-sync-config-data\") pod \"glance-db-sync-mkvv4\" (UID: \"37dd8777-c196-4db2-af7a-5560a939e02c\") " pod="openstack/glance-db-sync-mkvv4"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.333370    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37dd8777-c196-4db2-af7a-5560a939e02c-config-data\") pod \"glance-db-sync-mkvv4\" (UID: \"37dd8777-c196-4db2-af7a-5560a939e02c\") " pod="openstack/glance-db-sync-mkvv4"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.333427    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwpz2\" (UniqueName: \"kubernetes.io/projected/37dd8777-c196-4db2-af7a-5560a939e02c-kube-api-access-wwpz2\") pod \"glance-db-sync-mkvv4\" (UID: \"37dd8777-c196-4db2-af7a-5560a939e02c\") " pod="openstack/glance-db-sync-mkvv4"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.333465    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37dd8777-c196-4db2-af7a-5560a939e02c-combined-ca-bundle\") pod \"glance-db-sync-mkvv4\" (UID: \"37dd8777-c196-4db2-af7a-5560a939e02c\") " pod="openstack/glance-db-sync-mkvv4"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.334533    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef40906b-a3dc-45b8-8bde-dd06eaaef85c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ef40906b-a3dc-45b8-8bde-dd06eaaef85c" (UID: "ef40906b-a3dc-45b8-8bde-dd06eaaef85c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.337070    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef40906b-a3dc-45b8-8bde-dd06eaaef85c-kube-api-access-vfwtq" (OuterVolumeSpecName: "kube-api-access-vfwtq") pod "ef40906b-a3dc-45b8-8bde-dd06eaaef85c" (UID: "ef40906b-a3dc-45b8-8bde-dd06eaaef85c"). InnerVolumeSpecName "kube-api-access-vfwtq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.434966    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9qkh\" (UniqueName: \"kubernetes.io/projected/a532566c-ab86-4984-9212-1e48605d192b-kube-api-access-w9qkh\") pod \"a532566c-ab86-4984-9212-1e48605d192b\" (UID: \"a532566c-ab86-4984-9212-1e48605d192b\") "
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.435007    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c40f368e-f905-465b-9af0-b0ecb753de79-operator-scripts\") pod \"c40f368e-f905-465b-9af0-b0ecb753de79\" (UID: \"c40f368e-f905-465b-9af0-b0ecb753de79\") "
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.435030    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22s8r\" (UniqueName: \"kubernetes.io/projected/c40f368e-f905-465b-9af0-b0ecb753de79-kube-api-access-22s8r\") pod \"c40f368e-f905-465b-9af0-b0ecb753de79\" (UID: \"c40f368e-f905-465b-9af0-b0ecb753de79\") "
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.435569    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c40f368e-f905-465b-9af0-b0ecb753de79-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c40f368e-f905-465b-9af0-b0ecb753de79" (UID: "c40f368e-f905-465b-9af0-b0ecb753de79"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.436132    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a532566c-ab86-4984-9212-1e48605d192b-operator-scripts\") pod \"a532566c-ab86-4984-9212-1e48605d192b\" (UID: \"a532566c-ab86-4984-9212-1e48605d192b\") "
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.436569    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a532566c-ab86-4984-9212-1e48605d192b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a532566c-ab86-4984-9212-1e48605d192b" (UID: "a532566c-ab86-4984-9212-1e48605d192b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.436734    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/37dd8777-c196-4db2-af7a-5560a939e02c-db-sync-config-data\") pod \"glance-db-sync-mkvv4\" (UID: \"37dd8777-c196-4db2-af7a-5560a939e02c\") " pod="openstack/glance-db-sync-mkvv4"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.437160    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37dd8777-c196-4db2-af7a-5560a939e02c-config-data\") pod \"glance-db-sync-mkvv4\" (UID: \"37dd8777-c196-4db2-af7a-5560a939e02c\") " pod="openstack/glance-db-sync-mkvv4"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.437208    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c9def6e-27a0-4543-8d3c-07b3e4005b33-etc-swift\") pod \"swift-storage-0\" (UID: \"2c9def6e-27a0-4543-8d3c-07b3e4005b33\") " pod="openstack/swift-storage-0"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.437235    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwpz2\" (UniqueName: \"kubernetes.io/projected/37dd8777-c196-4db2-af7a-5560a939e02c-kube-api-access-wwpz2\") pod \"glance-db-sync-mkvv4\" (UID: \"37dd8777-c196-4db2-af7a-5560a939e02c\") " pod="openstack/glance-db-sync-mkvv4"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.437285    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37dd8777-c196-4db2-af7a-5560a939e02c-combined-ca-bundle\") pod \"glance-db-sync-mkvv4\" (UID: \"37dd8777-c196-4db2-af7a-5560a939e02c\") " pod="openstack/glance-db-sync-mkvv4"
Mar 20 15:58:58 crc kubenswrapper[4730]: E0320 15:58:58.437374    4730 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 20 15:58:58 crc kubenswrapper[4730]: E0320 15:58:58.437404    4730 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 20 15:58:58 crc kubenswrapper[4730]: E0320 15:58:58.437479    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c9def6e-27a0-4543-8d3c-07b3e4005b33-etc-swift podName:2c9def6e-27a0-4543-8d3c-07b3e4005b33 nodeName:}" failed. No retries permitted until 2026-03-20 15:59:00.437463456 +0000 UTC m=+1199.650834825 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2c9def6e-27a0-4543-8d3c-07b3e4005b33-etc-swift") pod "swift-storage-0" (UID: "2c9def6e-27a0-4543-8d3c-07b3e4005b33") : configmap "swift-ring-files" not found
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.437392    4730 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c40f368e-f905-465b-9af0-b0ecb753de79-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.437613    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfwtq\" (UniqueName: \"kubernetes.io/projected/ef40906b-a3dc-45b8-8bde-dd06eaaef85c-kube-api-access-vfwtq\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.437627    4730 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef40906b-a3dc-45b8-8bde-dd06eaaef85c-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.437642    4730 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a532566c-ab86-4984-9212-1e48605d192b-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.438516    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a532566c-ab86-4984-9212-1e48605d192b-kube-api-access-w9qkh" (OuterVolumeSpecName: "kube-api-access-w9qkh") pod "a532566c-ab86-4984-9212-1e48605d192b" (UID: "a532566c-ab86-4984-9212-1e48605d192b"). InnerVolumeSpecName "kube-api-access-w9qkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.441018    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37dd8777-c196-4db2-af7a-5560a939e02c-config-data\") pod \"glance-db-sync-mkvv4\" (UID: \"37dd8777-c196-4db2-af7a-5560a939e02c\") " pod="openstack/glance-db-sync-mkvv4"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.441112    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c40f368e-f905-465b-9af0-b0ecb753de79-kube-api-access-22s8r" (OuterVolumeSpecName: "kube-api-access-22s8r") pod "c40f368e-f905-465b-9af0-b0ecb753de79" (UID: "c40f368e-f905-465b-9af0-b0ecb753de79"). InnerVolumeSpecName "kube-api-access-22s8r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.441494    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37dd8777-c196-4db2-af7a-5560a939e02c-combined-ca-bundle\") pod \"glance-db-sync-mkvv4\" (UID: \"37dd8777-c196-4db2-af7a-5560a939e02c\") " pod="openstack/glance-db-sync-mkvv4"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.442983    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/37dd8777-c196-4db2-af7a-5560a939e02c-db-sync-config-data\") pod \"glance-db-sync-mkvv4\" (UID: \"37dd8777-c196-4db2-af7a-5560a939e02c\") " pod="openstack/glance-db-sync-mkvv4"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.453435    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwpz2\" (UniqueName: \"kubernetes.io/projected/37dd8777-c196-4db2-af7a-5560a939e02c-kube-api-access-wwpz2\") pod \"glance-db-sync-mkvv4\" (UID: \"37dd8777-c196-4db2-af7a-5560a939e02c\") " pod="openstack/glance-db-sync-mkvv4"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.539695    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9qkh\" (UniqueName: \"kubernetes.io/projected/a532566c-ab86-4984-9212-1e48605d192b-kube-api-access-w9qkh\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.539752    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22s8r\" (UniqueName: \"kubernetes.io/projected/c40f368e-f905-465b-9af0-b0ecb753de79-kube-api-access-22s8r\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.564387    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-x4h5x" event={"ID":"3198c781-92f7-40f1-9b6e-ed5310febe0b","Type":"ContainerDied","Data":"848e9fa2112624c8fb8c6bec3d6893fc6a8dd666fb6f1795ec93e5b1d6b5bea5"}
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.564431    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="848e9fa2112624c8fb8c6bec3d6893fc6a8dd666fb6f1795ec93e5b1d6b5bea5"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.564436    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-x4h5x"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.565960    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fvknw" event={"ID":"ef40906b-a3dc-45b8-8bde-dd06eaaef85c","Type":"ContainerDied","Data":"53640fa059a665b1d15935fd34ffaf73050cf1ca9fa0e3068b812699863c68d2"}
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.566000    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53640fa059a665b1d15935fd34ffaf73050cf1ca9fa0e3068b812699863c68d2"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.566049    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-fvknw"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.568722    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db41-account-create-update-x7l2w"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.568726    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db41-account-create-update-x7l2w" event={"ID":"c40f368e-f905-465b-9af0-b0ecb753de79","Type":"ContainerDied","Data":"085c52d92c419cf64b41d1ae79845132baa487a0824496f899efa452067532d7"}
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.568845    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="085c52d92c419cf64b41d1ae79845132baa487a0824496f899efa452067532d7"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.571138    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6868c44cb9-4zxm5" event={"ID":"3806a27b-4a0f-439b-8660-d9ccd4bb0618","Type":"ContainerStarted","Data":"1e16ce44f59bba62455cbdf7f59fe0150efea77ce27a7027dd9fa2ba8ec0f7d5"}
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.571879    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6868c44cb9-4zxm5"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.573433    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c5a1-account-create-update-hfdmn"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.578722    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c5a1-account-create-update-hfdmn" event={"ID":"a532566c-ab86-4984-9212-1e48605d192b","Type":"ContainerDied","Data":"292856fca4df62efa63cfc164ccbb4f784eb6e19218a746589e6b7b1c3a0dd78"}
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.578779    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="292856fca4df62efa63cfc164ccbb4f784eb6e19218a746589e6b7b1c3a0dd78"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.604194    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-mkvv4"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.612624    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6868c44cb9-4zxm5" podStartSLOduration=3.612602102 podStartE2EDuration="3.612602102s" podCreationTimestamp="2026-03-20 15:58:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:58:58.60690336 +0000 UTC m=+1197.820274729" watchObservedRunningTime="2026-03-20 15:58:58.612602102 +0000 UTC m=+1197.825973471"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.881932    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-n9vdf"
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.950351    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7b436cd-ff29-4a9f-9e58-4c8760b1e012-operator-scripts\") pod \"c7b436cd-ff29-4a9f-9e58-4c8760b1e012\" (UID: \"c7b436cd-ff29-4a9f-9e58-4c8760b1e012\") "
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.950529    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5hhb\" (UniqueName: \"kubernetes.io/projected/c7b436cd-ff29-4a9f-9e58-4c8760b1e012-kube-api-access-v5hhb\") pod \"c7b436cd-ff29-4a9f-9e58-4c8760b1e012\" (UID: \"c7b436cd-ff29-4a9f-9e58-4c8760b1e012\") "
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.951316    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7b436cd-ff29-4a9f-9e58-4c8760b1e012-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c7b436cd-ff29-4a9f-9e58-4c8760b1e012" (UID: "c7b436cd-ff29-4a9f-9e58-4c8760b1e012"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:58:58 crc kubenswrapper[4730]: I0320 15:58:58.966061    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7b436cd-ff29-4a9f-9e58-4c8760b1e012-kube-api-access-v5hhb" (OuterVolumeSpecName: "kube-api-access-v5hhb") pod "c7b436cd-ff29-4a9f-9e58-4c8760b1e012" (UID: "c7b436cd-ff29-4a9f-9e58-4c8760b1e012"). InnerVolumeSpecName "kube-api-access-v5hhb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.031117    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-87f2-account-create-update-lblc4"
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.052862    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5hhb\" (UniqueName: \"kubernetes.io/projected/c7b436cd-ff29-4a9f-9e58-4c8760b1e012-kube-api-access-v5hhb\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.052907    4730 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7b436cd-ff29-4a9f-9e58-4c8760b1e012-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.153693    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a132fe19-9294-49c6-9b1e-fe3eed7f4bae-operator-scripts\") pod \"a132fe19-9294-49c6-9b1e-fe3eed7f4bae\" (UID: \"a132fe19-9294-49c6-9b1e-fe3eed7f4bae\") "
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.153817    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhx74\" (UniqueName: \"kubernetes.io/projected/a132fe19-9294-49c6-9b1e-fe3eed7f4bae-kube-api-access-qhx74\") pod \"a132fe19-9294-49c6-9b1e-fe3eed7f4bae\" (UID: \"a132fe19-9294-49c6-9b1e-fe3eed7f4bae\") "
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.157644    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a132fe19-9294-49c6-9b1e-fe3eed7f4bae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a132fe19-9294-49c6-9b1e-fe3eed7f4bae" (UID: "a132fe19-9294-49c6-9b1e-fe3eed7f4bae"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.158314    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a132fe19-9294-49c6-9b1e-fe3eed7f4bae-kube-api-access-qhx74" (OuterVolumeSpecName: "kube-api-access-qhx74") pod "a132fe19-9294-49c6-9b1e-fe3eed7f4bae" (UID: "a132fe19-9294-49c6-9b1e-fe3eed7f4bae"). InnerVolumeSpecName "kube-api-access-qhx74". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.255877    4730 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a132fe19-9294-49c6-9b1e-fe3eed7f4bae-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.255912    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhx74\" (UniqueName: \"kubernetes.io/projected/a132fe19-9294-49c6-9b1e-fe3eed7f4bae-kube-api-access-qhx74\") on node \"crc\" DevicePath \"\""
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.408799    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-mkvv4"]
Mar 20 15:58:59 crc kubenswrapper[4730]: W0320 15:58:59.421994    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37dd8777_c196_4db2_af7a_5560a939e02c.slice/crio-83ca9233a49380ba616e8a33edba2e21a0e34d304c17191d6b978242a55feaa7 WatchSource:0}: Error finding container 83ca9233a49380ba616e8a33edba2e21a0e34d304c17191d6b978242a55feaa7: Status 404 returned error can't find the container with id 83ca9233a49380ba616e8a33edba2e21a0e34d304c17191d6b978242a55feaa7
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.588308    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-87f2-account-create-update-lblc4"
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.588309    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-87f2-account-create-update-lblc4" event={"ID":"a132fe19-9294-49c6-9b1e-fe3eed7f4bae","Type":"ContainerDied","Data":"10d5a54946b4bf571d21b82b12cd1c4555f94ac55a2e76b8dacd79b2d2c7f077"}
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.588447    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10d5a54946b4bf571d21b82b12cd1c4555f94ac55a2e76b8dacd79b2d2c7f077"
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.590444    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-n9vdf" event={"ID":"c7b436cd-ff29-4a9f-9e58-4c8760b1e012","Type":"ContainerDied","Data":"46623510ee8353ece4203ec113c421119597c40cda4c774ba80d9347a1c73413"}
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.590471    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46623510ee8353ece4203ec113c421119597c40cda4c774ba80d9347a1c73413"
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.590528    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-n9vdf"
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.593516    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mkvv4" event={"ID":"37dd8777-c196-4db2-af7a-5560a939e02c","Type":"ContainerStarted","Data":"83ca9233a49380ba616e8a33edba2e21a0e34d304c17191d6b978242a55feaa7"}
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.761386    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-c8wgl"]
Mar 20 15:58:59 crc kubenswrapper[4730]: E0320 15:58:59.761743    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a532566c-ab86-4984-9212-1e48605d192b" containerName="mariadb-account-create-update"
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.761761    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="a532566c-ab86-4984-9212-1e48605d192b" containerName="mariadb-account-create-update"
Mar 20 15:58:59 crc kubenswrapper[4730]: E0320 15:58:59.761780    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef40906b-a3dc-45b8-8bde-dd06eaaef85c" containerName="mariadb-database-create"
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.761786    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef40906b-a3dc-45b8-8bde-dd06eaaef85c" containerName="mariadb-database-create"
Mar 20 15:58:59 crc kubenswrapper[4730]: E0320 15:58:59.761797    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7b436cd-ff29-4a9f-9e58-4c8760b1e012" containerName="mariadb-database-create"
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.761805    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7b436cd-ff29-4a9f-9e58-4c8760b1e012" containerName="mariadb-database-create"
Mar 20 15:58:59 crc kubenswrapper[4730]: E0320 15:58:59.761819    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c40f368e-f905-465b-9af0-b0ecb753de79" containerName="mariadb-account-create-update"
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.761825    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="c40f368e-f905-465b-9af0-b0ecb753de79" containerName="mariadb-account-create-update"
Mar 20 15:58:59 crc kubenswrapper[4730]: E0320 15:58:59.761846    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a132fe19-9294-49c6-9b1e-fe3eed7f4bae" containerName="mariadb-account-create-update"
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.761852    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="a132fe19-9294-49c6-9b1e-fe3eed7f4bae" containerName="mariadb-account-create-update"
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.762000    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="a132fe19-9294-49c6-9b1e-fe3eed7f4bae" containerName="mariadb-account-create-update"
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.762010    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef40906b-a3dc-45b8-8bde-dd06eaaef85c" containerName="mariadb-database-create"
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.762025    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="a532566c-ab86-4984-9212-1e48605d192b" containerName="mariadb-account-create-update"
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.762037    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7b436cd-ff29-4a9f-9e58-4c8760b1e012" containerName="mariadb-database-create"
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.762046    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="c40f368e-f905-465b-9af0-b0ecb753de79" containerName="mariadb-account-create-update"
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.762562    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-c8wgl"
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.775226    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-c8wgl"]
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.786267    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.864863    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a37760f5-4ee5-4b95-9364-e81582b732c7-operator-scripts\") pod \"root-account-create-update-c8wgl\" (UID: \"a37760f5-4ee5-4b95-9364-e81582b732c7\") " pod="openstack/root-account-create-update-c8wgl"
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.864953    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjz56\" (UniqueName: \"kubernetes.io/projected/a37760f5-4ee5-4b95-9364-e81582b732c7-kube-api-access-tjz56\") pod \"root-account-create-update-c8wgl\" (UID: \"a37760f5-4ee5-4b95-9364-e81582b732c7\") " pod="openstack/root-account-create-update-c8wgl"
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.966132    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a37760f5-4ee5-4b95-9364-e81582b732c7-operator-scripts\") pod \"root-account-create-update-c8wgl\" (UID: \"a37760f5-4ee5-4b95-9364-e81582b732c7\") " pod="openstack/root-account-create-update-c8wgl"
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.966238    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjz56\" (UniqueName: \"kubernetes.io/projected/a37760f5-4ee5-4b95-9364-e81582b732c7-kube-api-access-tjz56\") pod \"root-account-create-update-c8wgl\" (UID: \"a37760f5-4ee5-4b95-9364-e81582b732c7\") " pod="openstack/root-account-create-update-c8wgl"
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.967177    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a37760f5-4ee5-4b95-9364-e81582b732c7-operator-scripts\") pod \"root-account-create-update-c8wgl\" (UID: \"a37760f5-4ee5-4b95-9364-e81582b732c7\") " pod="openstack/root-account-create-update-c8wgl"
Mar 20 15:58:59 crc kubenswrapper[4730]: I0320 15:58:59.987186    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjz56\" (UniqueName: \"kubernetes.io/projected/a37760f5-4ee5-4b95-9364-e81582b732c7-kube-api-access-tjz56\") pod \"root-account-create-update-c8wgl\" (UID: \"a37760f5-4ee5-4b95-9364-e81582b732c7\") " pod="openstack/root-account-create-update-c8wgl"
Mar 20 15:59:00 crc kubenswrapper[4730]: I0320 15:59:00.142748    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-c8wgl"
Mar 20 15:59:00 crc kubenswrapper[4730]: I0320 15:59:00.414160    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-7d8lv"]
Mar 20 15:59:00 crc kubenswrapper[4730]: I0320 15:59:00.415341    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-7d8lv"
Mar 20 15:59:00 crc kubenswrapper[4730]: I0320 15:59:00.417782    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Mar 20 15:59:00 crc kubenswrapper[4730]: I0320 15:59:00.418062    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Mar 20 15:59:00 crc kubenswrapper[4730]: I0320 15:59:00.418482    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Mar 20 15:59:00 crc kubenswrapper[4730]: I0320 15:59:00.427822    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-7d8lv"]
Mar 20 15:59:00 crc kubenswrapper[4730]: I0320 15:59:00.473935    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/167282ce-29fc-44db-9b0b-baf2c956f433-dispersionconf\") pod \"swift-ring-rebalance-7d8lv\" (UID: \"167282ce-29fc-44db-9b0b-baf2c956f433\") " pod="openstack/swift-ring-rebalance-7d8lv"
Mar 20 15:59:00 crc kubenswrapper[4730]: I0320 15:59:00.473972    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/167282ce-29fc-44db-9b0b-baf2c956f433-scripts\") pod \"swift-ring-rebalance-7d8lv\" (UID: \"167282ce-29fc-44db-9b0b-baf2c956f433\") " pod="openstack/swift-ring-rebalance-7d8lv"
Mar 20 15:59:00 crc kubenswrapper[4730]: I0320 15:59:00.474029    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c9def6e-27a0-4543-8d3c-07b3e4005b33-etc-swift\") pod \"swift-storage-0\" (UID: \"2c9def6e-27a0-4543-8d3c-07b3e4005b33\") " pod="openstack/swift-storage-0"
Mar 20 15:59:00 crc kubenswrapper[4730]: I0320 15:59:00.474063    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzrnr\" (UniqueName: \"kubernetes.io/projected/167282ce-29fc-44db-9b0b-baf2c956f433-kube-api-access-dzrnr\") pod \"swift-ring-rebalance-7d8lv\" (UID: \"167282ce-29fc-44db-9b0b-baf2c956f433\") " pod="openstack/swift-ring-rebalance-7d8lv"
Mar 20 15:59:00 crc kubenswrapper[4730]: I0320 15:59:00.474123    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/167282ce-29fc-44db-9b0b-baf2c956f433-etc-swift\") pod \"swift-ring-rebalance-7d8lv\" (UID: \"167282ce-29fc-44db-9b0b-baf2c956f433\") " pod="openstack/swift-ring-rebalance-7d8lv"
Mar 20 15:59:00 crc kubenswrapper[4730]: I0320 15:59:00.474141    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/167282ce-29fc-44db-9b0b-baf2c956f433-ring-data-devices\") pod \"swift-ring-rebalance-7d8lv\" (UID: \"167282ce-29fc-44db-9b0b-baf2c956f433\") " pod="openstack/swift-ring-rebalance-7d8lv"
Mar 20 15:59:00 crc kubenswrapper[4730]: I0320 15:59:00.474182    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/167282ce-29fc-44db-9b0b-baf2c956f433-swiftconf\") pod \"swift-ring-rebalance-7d8lv\" (UID: \"167282ce-29fc-44db-9b0b-baf2c956f433\") " pod="openstack/swift-ring-rebalance-7d8lv"
Mar 20 15:59:00 crc kubenswrapper[4730]: I0320 15:59:00.474206    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/167282ce-29fc-44db-9b0b-baf2c956f433-combined-ca-bundle\") pod \"swift-ring-rebalance-7d8lv\" (UID: \"167282ce-29fc-44db-9b0b-baf2c956f433\") " pod="openstack/swift-ring-rebalance-7d8lv"
Mar 20 15:59:00 crc kubenswrapper[4730]: E0320 15:59:00.474421    4730 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 20 15:59:00 crc kubenswrapper[4730]: E0320 15:59:00.474451    4730 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 20 15:59:00 crc kubenswrapper[4730]: E0320 15:59:00.474492    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c9def6e-27a0-4543-8d3c-07b3e4005b33-etc-swift podName:2c9def6e-27a0-4543-8d3c-07b3e4005b33 nodeName:}" failed. No retries permitted until 2026-03-20 15:59:04.474479281 +0000 UTC m=+1203.687850650 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2c9def6e-27a0-4543-8d3c-07b3e4005b33-etc-swift") pod "swift-storage-0" (UID: "2c9def6e-27a0-4543-8d3c-07b3e4005b33") : configmap "swift-ring-files" not found
Mar 20 15:59:00 crc kubenswrapper[4730]: I0320 15:59:00.575308    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/167282ce-29fc-44db-9b0b-baf2c956f433-etc-swift\") pod \"swift-ring-rebalance-7d8lv\" (UID: \"167282ce-29fc-44db-9b0b-baf2c956f433\") " pod="openstack/swift-ring-rebalance-7d8lv"
Mar 20 15:59:00 crc kubenswrapper[4730]: I0320 15:59:00.575360    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/167282ce-29fc-44db-9b0b-baf2c956f433-ring-data-devices\") pod \"swift-ring-rebalance-7d8lv\" (UID: \"167282ce-29fc-44db-9b0b-baf2c956f433\") " pod="openstack/swift-ring-rebalance-7d8lv"
Mar 20 15:59:00 crc kubenswrapper[4730]: I0320 15:59:00.575400    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/167282ce-29fc-44db-9b0b-baf2c956f433-swiftconf\") pod \"swift-ring-rebalance-7d8lv\" (UID: \"167282ce-29fc-44db-9b0b-baf2c956f433\") " pod="openstack/swift-ring-rebalance-7d8lv"
Mar 20 15:59:00 crc kubenswrapper[4730]: I0320 15:59:00.575452    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/167282ce-29fc-44db-9b0b-baf2c956f433-combined-ca-bundle\") pod \"swift-ring-rebalance-7d8lv\" (UID: \"167282ce-29fc-44db-9b0b-baf2c956f433\") " pod="openstack/swift-ring-rebalance-7d8lv"
Mar 20 15:59:00 crc kubenswrapper[4730]: I0320 15:59:00.575521    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/167282ce-29fc-44db-9b0b-baf2c956f433-dispersionconf\") pod \"swift-ring-rebalance-7d8lv\" (UID: \"167282ce-29fc-44db-9b0b-baf2c956f433\") " pod="openstack/swift-ring-rebalance-7d8lv"
Mar 20 15:59:00 crc kubenswrapper[4730]: I0320 15:59:00.575547    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/167282ce-29fc-44db-9b0b-baf2c956f433-scripts\") pod \"swift-ring-rebalance-7d8lv\" (UID: \"167282ce-29fc-44db-9b0b-baf2c956f433\") " pod="openstack/swift-ring-rebalance-7d8lv"
Mar 20 15:59:00 crc kubenswrapper[4730]: I0320 15:59:00.575627    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzrnr\" (UniqueName: \"kubernetes.io/projected/167282ce-29fc-44db-9b0b-baf2c956f433-kube-api-access-dzrnr\") pod \"swift-ring-rebalance-7d8lv\" (UID: \"167282ce-29fc-44db-9b0b-baf2c956f433\") " pod="openstack/swift-ring-rebalance-7d8lv"
Mar 20 15:59:00 crc kubenswrapper[4730]: I0320 15:59:00.577103    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/167282ce-29fc-44db-9b0b-baf2c956f433-ring-data-devices\") pod \"swift-ring-rebalance-7d8lv\" (UID: \"167282ce-29fc-44db-9b0b-baf2c956f433\") " pod="openstack/swift-ring-rebalance-7d8lv"
Mar 20 15:59:00 crc kubenswrapper[4730]: I0320 15:59:00.579271    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/167282ce-29fc-44db-9b0b-baf2c956f433-etc-swift\") pod \"swift-ring-rebalance-7d8lv\" (UID: \"167282ce-29fc-44db-9b0b-baf2c956f433\") " pod="openstack/swift-ring-rebalance-7d8lv"
Mar 20 15:59:00 crc kubenswrapper[4730]: I0320 15:59:00.583352    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/167282ce-29fc-44db-9b0b-baf2c956f433-swiftconf\") pod \"swift-ring-rebalance-7d8lv\" (UID: \"167282ce-29fc-44db-9b0b-baf2c956f433\") " pod="openstack/swift-ring-rebalance-7d8lv"
Mar 20 15:59:00 crc kubenswrapper[4730]: I0320 15:59:00.586236    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/167282ce-29fc-44db-9b0b-baf2c956f433-scripts\") pod \"swift-ring-rebalance-7d8lv\" (UID: \"167282ce-29fc-44db-9b0b-baf2c956f433\") " pod="openstack/swift-ring-rebalance-7d8lv"
Mar 20 15:59:00 crc kubenswrapper[4730]: I0320 15:59:00.588593    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/167282ce-29fc-44db-9b0b-baf2c956f433-dispersionconf\") pod \"swift-ring-rebalance-7d8lv\" (UID: \"167282ce-29fc-44db-9b0b-baf2c956f433\") " pod="openstack/swift-ring-rebalance-7d8lv"
Mar 20 15:59:00 crc kubenswrapper[4730]: I0320 15:59:00.588788    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/167282ce-29fc-44db-9b0b-baf2c956f433-combined-ca-bundle\") pod \"swift-ring-rebalance-7d8lv\" (UID: \"167282ce-29fc-44db-9b0b-baf2c956f433\") " pod="openstack/swift-ring-rebalance-7d8lv"
Mar 20 15:59:00 crc kubenswrapper[4730]: I0320 15:59:00.593142    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzrnr\" (UniqueName: \"kubernetes.io/projected/167282ce-29fc-44db-9b0b-baf2c956f433-kube-api-access-dzrnr\") pod \"swift-ring-rebalance-7d8lv\" (UID: \"167282ce-29fc-44db-9b0b-baf2c956f433\") " pod="openstack/swift-ring-rebalance-7d8lv"
Mar 20 15:59:00 crc kubenswrapper[4730]: I0320 15:59:00.675197    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-c8wgl"]
Mar 20 15:59:00 crc kubenswrapper[4730]: I0320 15:59:00.753288    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-7d8lv"
Mar 20 15:59:02 crc kubenswrapper[4730]: I0320 15:59:02.310731    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Mar 20 15:59:02 crc kubenswrapper[4730]: I0320 15:59:02.626361    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fdd3845a-3723-438f-aa58-606451baed6c","Type":"ContainerStarted","Data":"83d469de2778c173f20bbd31bfd4fc16492ac416000fbb48eb639e2a00a91feb"}
Mar 20 15:59:02 crc kubenswrapper[4730]: I0320 15:59:02.627764    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-c8wgl" event={"ID":"a37760f5-4ee5-4b95-9364-e81582b732c7","Type":"ContainerStarted","Data":"7f523e2f068601b64366f703836d659d597d40e8112309dd07122d42b1769869"}
Mar 20 15:59:02 crc kubenswrapper[4730]: I0320 15:59:02.627784    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-c8wgl" event={"ID":"a37760f5-4ee5-4b95-9364-e81582b732c7","Type":"ContainerStarted","Data":"7f0455b4d6e218aa0ce0046f8dd059ae557ff987aece6c438eedd673d8f6f282"}
Mar 20 15:59:02 crc kubenswrapper[4730]: I0320 15:59:02.650799    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=8.444930863 podStartE2EDuration="47.650783394s" podCreationTimestamp="2026-03-20 15:58:15 +0000 UTC" firstStartedPulling="2026-03-20 15:58:23.112610971 +0000 UTC m=+1162.325982340" lastFinishedPulling="2026-03-20 15:59:02.318463502 +0000 UTC m=+1201.531834871" observedRunningTime="2026-03-20 15:59:02.648635603 +0000 UTC m=+1201.862006992" watchObservedRunningTime="2026-03-20 15:59:02.650783394 +0000 UTC m=+1201.864154763"
Mar 20 15:59:02 crc kubenswrapper[4730]: I0320 15:59:02.666438    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-c8wgl" podStartSLOduration=3.666418889 podStartE2EDuration="3.666418889s" podCreationTimestamp="2026-03-20 15:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:59:02.662925059 +0000 UTC m=+1201.876296438" watchObservedRunningTime="2026-03-20 15:59:02.666418889 +0000 UTC m=+1201.879790258"
Mar 20 15:59:02 crc kubenswrapper[4730]: I0320 15:59:02.800831    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-7d8lv"]
Mar 20 15:59:02 crc kubenswrapper[4730]: W0320 15:59:02.811557    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod167282ce_29fc_44db_9b0b_baf2c956f433.slice/crio-c2bed297b15aeb66e5504095b19b123685071c689e915cfbc2281dd9c7ff81a2 WatchSource:0}: Error finding container c2bed297b15aeb66e5504095b19b123685071c689e915cfbc2281dd9c7ff81a2: Status 404 returned error can't find the container with id c2bed297b15aeb66e5504095b19b123685071c689e915cfbc2281dd9c7ff81a2
Mar 20 15:59:03 crc kubenswrapper[4730]: I0320 15:59:03.637081    4730 generic.go:334] "Generic (PLEG): container finished" podID="a37760f5-4ee5-4b95-9364-e81582b732c7" containerID="7f523e2f068601b64366f703836d659d597d40e8112309dd07122d42b1769869" exitCode=0
Mar 20 15:59:03 crc kubenswrapper[4730]: I0320 15:59:03.637159    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-c8wgl" event={"ID":"a37760f5-4ee5-4b95-9364-e81582b732c7","Type":"ContainerDied","Data":"7f523e2f068601b64366f703836d659d597d40e8112309dd07122d42b1769869"}
Mar 20 15:59:03 crc kubenswrapper[4730]: I0320 15:59:03.642462    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-7d8lv" event={"ID":"167282ce-29fc-44db-9b0b-baf2c956f433","Type":"ContainerStarted","Data":"c2bed297b15aeb66e5504095b19b123685071c689e915cfbc2281dd9c7ff81a2"}
Mar 20 15:59:04 crc kubenswrapper[4730]: I0320 15:59:04.486511    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c9def6e-27a0-4543-8d3c-07b3e4005b33-etc-swift\") pod \"swift-storage-0\" (UID: \"2c9def6e-27a0-4543-8d3c-07b3e4005b33\") " pod="openstack/swift-storage-0"
Mar 20 15:59:04 crc kubenswrapper[4730]: E0320 15:59:04.486737    4730 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 20 15:59:04 crc kubenswrapper[4730]: E0320 15:59:04.486779    4730 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 20 15:59:04 crc kubenswrapper[4730]: E0320 15:59:04.486843    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c9def6e-27a0-4543-8d3c-07b3e4005b33-etc-swift podName:2c9def6e-27a0-4543-8d3c-07b3e4005b33 nodeName:}" failed. No retries permitted until 2026-03-20 15:59:12.486825499 +0000 UTC m=+1211.700196868 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2c9def6e-27a0-4543-8d3c-07b3e4005b33-etc-swift") pod "swift-storage-0" (UID: "2c9def6e-27a0-4543-8d3c-07b3e4005b33") : configmap "swift-ring-files" not found
Mar 20 15:59:05 crc kubenswrapper[4730]: I0320 15:59:05.717466    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6868c44cb9-4zxm5"
Mar 20 15:59:05 crc kubenswrapper[4730]: I0320 15:59:05.761914    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-c8wgl"
Mar 20 15:59:05 crc kubenswrapper[4730]: I0320 15:59:05.788435    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-648686d659-c5gtt"]
Mar 20 15:59:05 crc kubenswrapper[4730]: I0320 15:59:05.788651    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-648686d659-c5gtt" podUID="68e4e5c3-825d-477e-a403-0cae45a86806" containerName="dnsmasq-dns" containerID="cri-o://fe73524182281c96c8ff70a33be500f38666a259e3cec724e30e6c153a1251ae" gracePeriod=10
Mar 20 15:59:05 crc kubenswrapper[4730]: I0320 15:59:05.815716    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjz56\" (UniqueName: \"kubernetes.io/projected/a37760f5-4ee5-4b95-9364-e81582b732c7-kube-api-access-tjz56\") pod \"a37760f5-4ee5-4b95-9364-e81582b732c7\" (UID: \"a37760f5-4ee5-4b95-9364-e81582b732c7\") "
Mar 20 15:59:05 crc kubenswrapper[4730]: I0320 15:59:05.815946    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a37760f5-4ee5-4b95-9364-e81582b732c7-operator-scripts\") pod \"a37760f5-4ee5-4b95-9364-e81582b732c7\" (UID: \"a37760f5-4ee5-4b95-9364-e81582b732c7\") "
Mar 20 15:59:05 crc kubenswrapper[4730]: I0320 15:59:05.817120    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a37760f5-4ee5-4b95-9364-e81582b732c7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a37760f5-4ee5-4b95-9364-e81582b732c7" (UID: "a37760f5-4ee5-4b95-9364-e81582b732c7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:59:05 crc kubenswrapper[4730]: I0320 15:59:05.843446    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a37760f5-4ee5-4b95-9364-e81582b732c7-kube-api-access-tjz56" (OuterVolumeSpecName: "kube-api-access-tjz56") pod "a37760f5-4ee5-4b95-9364-e81582b732c7" (UID: "a37760f5-4ee5-4b95-9364-e81582b732c7"). InnerVolumeSpecName "kube-api-access-tjz56". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:59:05 crc kubenswrapper[4730]: I0320 15:59:05.918105    4730 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a37760f5-4ee5-4b95-9364-e81582b732c7-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:05 crc kubenswrapper[4730]: I0320 15:59:05.918142    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjz56\" (UniqueName: \"kubernetes.io/projected/a37760f5-4ee5-4b95-9364-e81582b732c7-kube-api-access-tjz56\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:06 crc kubenswrapper[4730]: I0320 15:59:06.679388    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-c8wgl"
Mar 20 15:59:06 crc kubenswrapper[4730]: I0320 15:59:06.681466    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-c8wgl" event={"ID":"a37760f5-4ee5-4b95-9364-e81582b732c7","Type":"ContainerDied","Data":"7f0455b4d6e218aa0ce0046f8dd059ae557ff987aece6c438eedd673d8f6f282"}
Mar 20 15:59:06 crc kubenswrapper[4730]: I0320 15:59:06.681523    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f0455b4d6e218aa0ce0046f8dd059ae557ff987aece6c438eedd673d8f6f282"
Mar 20 15:59:06 crc kubenswrapper[4730]: I0320 15:59:06.683077    4730 generic.go:334] "Generic (PLEG): container finished" podID="68e4e5c3-825d-477e-a403-0cae45a86806" containerID="fe73524182281c96c8ff70a33be500f38666a259e3cec724e30e6c153a1251ae" exitCode=0
Mar 20 15:59:06 crc kubenswrapper[4730]: I0320 15:59:06.683171    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-648686d659-c5gtt" event={"ID":"68e4e5c3-825d-477e-a403-0cae45a86806","Type":"ContainerDied","Data":"fe73524182281c96c8ff70a33be500f38666a259e3cec724e30e6c153a1251ae"}
Mar 20 15:59:06 crc kubenswrapper[4730]: I0320 15:59:06.724488    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:06 crc kubenswrapper[4730]: I0320 15:59:06.924046    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Mar 20 15:59:07 crc kubenswrapper[4730]: I0320 15:59:07.696872    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-648686d659-c5gtt" podUID="68e4e5c3-825d-477e-a403-0cae45a86806" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.123:5353: connect: connection refused"
Mar 20 15:59:09 crc kubenswrapper[4730]: I0320 15:59:09.145146    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-gtrnp" podUID="31651551-edb9-4793-a752-39fa60a85ee3" containerName="ovn-controller" probeResult="failure" output=<
Mar 20 15:59:09 crc kubenswrapper[4730]:         ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 20 15:59:09 crc kubenswrapper[4730]:  >
Mar 20 15:59:10 crc kubenswrapper[4730]: I0320 15:59:10.740014    4730 generic.go:334] "Generic (PLEG): container finished" podID="df9ca02d-e20f-4f55-ba14-92b91812afb6" containerID="3d7e7a0cabaf1b38c1891734e06c9106a1e8c0af0454fdefaf773422d1dcf747" exitCode=0
Mar 20 15:59:10 crc kubenswrapper[4730]: I0320 15:59:10.740325    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/notifications-rabbitmq-server-0" event={"ID":"df9ca02d-e20f-4f55-ba14-92b91812afb6","Type":"ContainerDied","Data":"3d7e7a0cabaf1b38c1891734e06c9106a1e8c0af0454fdefaf773422d1dcf747"}
Mar 20 15:59:10 crc kubenswrapper[4730]: I0320 15:59:10.743149    4730 generic.go:334] "Generic (PLEG): container finished" podID="dfd9111c-a9f4-4874-91fc-c0ef68ae09a3" containerID="13985a1e2e3d58d396be0af6437cdcdb0bbdea54308502442707c077b36e9713" exitCode=0
Mar 20 15:59:10 crc kubenswrapper[4730]: I0320 15:59:10.743176    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3","Type":"ContainerDied","Data":"13985a1e2e3d58d396be0af6437cdcdb0bbdea54308502442707c077b36e9713"}
Mar 20 15:59:11 crc kubenswrapper[4730]: I0320 15:59:11.242615    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-c8wgl"]
Mar 20 15:59:11 crc kubenswrapper[4730]: I0320 15:59:11.248996    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-c8wgl"]
Mar 20 15:59:11 crc kubenswrapper[4730]: I0320 15:59:11.557595    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a37760f5-4ee5-4b95-9364-e81582b732c7" path="/var/lib/kubelet/pods/a37760f5-4ee5-4b95-9364-e81582b732c7/volumes"
Mar 20 15:59:11 crc kubenswrapper[4730]: I0320 15:59:11.753014    4730 generic.go:334] "Generic (PLEG): container finished" podID="8043f69c-832c-4afa-a9b9-211507664805" containerID="5873082b81a5b9253ac47bf2bf3866502e40b3ccab836a111c3bd8134e015ee5" exitCode=0
Mar 20 15:59:11 crc kubenswrapper[4730]: I0320 15:59:11.753226    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8043f69c-832c-4afa-a9b9-211507664805","Type":"ContainerDied","Data":"5873082b81a5b9253ac47bf2bf3866502e40b3ccab836a111c3bd8134e015ee5"}
Mar 20 15:59:12 crc kubenswrapper[4730]: I0320 15:59:12.575677    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c9def6e-27a0-4543-8d3c-07b3e4005b33-etc-swift\") pod \"swift-storage-0\" (UID: \"2c9def6e-27a0-4543-8d3c-07b3e4005b33\") " pod="openstack/swift-storage-0"
Mar 20 15:59:12 crc kubenswrapper[4730]: E0320 15:59:12.575872    4730 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 20 15:59:12 crc kubenswrapper[4730]: E0320 15:59:12.575899    4730 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 20 15:59:12 crc kubenswrapper[4730]: E0320 15:59:12.575958    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c9def6e-27a0-4543-8d3c-07b3e4005b33-etc-swift podName:2c9def6e-27a0-4543-8d3c-07b3e4005b33 nodeName:}" failed. No retries permitted until 2026-03-20 15:59:28.575940097 +0000 UTC m=+1227.789311466 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2c9def6e-27a0-4543-8d3c-07b3e4005b33-etc-swift") pod "swift-storage-0" (UID: "2c9def6e-27a0-4543-8d3c-07b3e4005b33") : configmap "swift-ring-files" not found
Mar 20 15:59:12 crc kubenswrapper[4730]: I0320 15:59:12.696688    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-648686d659-c5gtt" podUID="68e4e5c3-825d-477e-a403-0cae45a86806" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.123:5353: connect: connection refused"
Mar 20 15:59:13 crc kubenswrapper[4730]: I0320 15:59:13.781095    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-648686d659-c5gtt"
Mar 20 15:59:13 crc kubenswrapper[4730]: I0320 15:59:13.781215    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-648686d659-c5gtt" event={"ID":"68e4e5c3-825d-477e-a403-0cae45a86806","Type":"ContainerDied","Data":"4818c9171456a6ff54f39a98823d5ad13ee602182702fbabd7963c0de43a3bf8"}
Mar 20 15:59:13 crc kubenswrapper[4730]: I0320 15:59:13.781779    4730 scope.go:117] "RemoveContainer" containerID="fe73524182281c96c8ff70a33be500f38666a259e3cec724e30e6c153a1251ae"
Mar 20 15:59:13 crc kubenswrapper[4730]: I0320 15:59:13.811445    4730 scope.go:117] "RemoveContainer" containerID="c68e3d34495e1743eae1f1377e46b36b943735fc7af94f6c4a34f2182614de74"
Mar 20 15:59:13 crc kubenswrapper[4730]: I0320 15:59:13.900108    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68e4e5c3-825d-477e-a403-0cae45a86806-dns-svc\") pod \"68e4e5c3-825d-477e-a403-0cae45a86806\" (UID: \"68e4e5c3-825d-477e-a403-0cae45a86806\") "
Mar 20 15:59:13 crc kubenswrapper[4730]: I0320 15:59:13.900153    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shdlr\" (UniqueName: \"kubernetes.io/projected/68e4e5c3-825d-477e-a403-0cae45a86806-kube-api-access-shdlr\") pod \"68e4e5c3-825d-477e-a403-0cae45a86806\" (UID: \"68e4e5c3-825d-477e-a403-0cae45a86806\") "
Mar 20 15:59:13 crc kubenswrapper[4730]: I0320 15:59:13.900269    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68e4e5c3-825d-477e-a403-0cae45a86806-ovsdbserver-nb\") pod \"68e4e5c3-825d-477e-a403-0cae45a86806\" (UID: \"68e4e5c3-825d-477e-a403-0cae45a86806\") "
Mar 20 15:59:13 crc kubenswrapper[4730]: I0320 15:59:13.900315    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68e4e5c3-825d-477e-a403-0cae45a86806-ovsdbserver-sb\") pod \"68e4e5c3-825d-477e-a403-0cae45a86806\" (UID: \"68e4e5c3-825d-477e-a403-0cae45a86806\") "
Mar 20 15:59:13 crc kubenswrapper[4730]: I0320 15:59:13.900370    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68e4e5c3-825d-477e-a403-0cae45a86806-config\") pod \"68e4e5c3-825d-477e-a403-0cae45a86806\" (UID: \"68e4e5c3-825d-477e-a403-0cae45a86806\") "
Mar 20 15:59:13 crc kubenswrapper[4730]: I0320 15:59:13.904580    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68e4e5c3-825d-477e-a403-0cae45a86806-kube-api-access-shdlr" (OuterVolumeSpecName: "kube-api-access-shdlr") pod "68e4e5c3-825d-477e-a403-0cae45a86806" (UID: "68e4e5c3-825d-477e-a403-0cae45a86806"). InnerVolumeSpecName "kube-api-access-shdlr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:59:13 crc kubenswrapper[4730]: I0320 15:59:13.937293    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68e4e5c3-825d-477e-a403-0cae45a86806-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "68e4e5c3-825d-477e-a403-0cae45a86806" (UID: "68e4e5c3-825d-477e-a403-0cae45a86806"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:59:13 crc kubenswrapper[4730]: I0320 15:59:13.942631    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68e4e5c3-825d-477e-a403-0cae45a86806-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "68e4e5c3-825d-477e-a403-0cae45a86806" (UID: "68e4e5c3-825d-477e-a403-0cae45a86806"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:59:13 crc kubenswrapper[4730]: I0320 15:59:13.946074    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68e4e5c3-825d-477e-a403-0cae45a86806-config" (OuterVolumeSpecName: "config") pod "68e4e5c3-825d-477e-a403-0cae45a86806" (UID: "68e4e5c3-825d-477e-a403-0cae45a86806"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:59:13 crc kubenswrapper[4730]: I0320 15:59:13.982591    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68e4e5c3-825d-477e-a403-0cae45a86806-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "68e4e5c3-825d-477e-a403-0cae45a86806" (UID: "68e4e5c3-825d-477e-a403-0cae45a86806"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:59:14 crc kubenswrapper[4730]: I0320 15:59:14.002563    4730 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68e4e5c3-825d-477e-a403-0cae45a86806-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:14 crc kubenswrapper[4730]: I0320 15:59:14.002599    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shdlr\" (UniqueName: \"kubernetes.io/projected/68e4e5c3-825d-477e-a403-0cae45a86806-kube-api-access-shdlr\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:14 crc kubenswrapper[4730]: I0320 15:59:14.002612    4730 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68e4e5c3-825d-477e-a403-0cae45a86806-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:14 crc kubenswrapper[4730]: I0320 15:59:14.002620    4730 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68e4e5c3-825d-477e-a403-0cae45a86806-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:14 crc kubenswrapper[4730]: I0320 15:59:14.002628    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68e4e5c3-825d-477e-a403-0cae45a86806-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:14 crc kubenswrapper[4730]: I0320 15:59:14.148064    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-gtrnp" podUID="31651551-edb9-4793-a752-39fa60a85ee3" containerName="ovn-controller" probeResult="failure" output=<
Mar 20 15:59:14 crc kubenswrapper[4730]:         ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 20 15:59:14 crc kubenswrapper[4730]:  >
Mar 20 15:59:14 crc kubenswrapper[4730]: I0320 15:59:14.790856    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-7d8lv" event={"ID":"167282ce-29fc-44db-9b0b-baf2c956f433","Type":"ContainerStarted","Data":"768be9518c03c37026c048245f752f8e9492e3f207a6cde3432392aa9859edc5"}
Mar 20 15:59:14 crc kubenswrapper[4730]: I0320 15:59:14.792743    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mkvv4" event={"ID":"37dd8777-c196-4db2-af7a-5560a939e02c","Type":"ContainerStarted","Data":"45a09c4f4bffe31b4f9cf83737f4a3331b9ba65b3e4bbf1a00d15070f2dd1fbb"}
Mar 20 15:59:14 crc kubenswrapper[4730]: I0320 15:59:14.794507    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-648686d659-c5gtt"
Mar 20 15:59:14 crc kubenswrapper[4730]: I0320 15:59:14.797546    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8043f69c-832c-4afa-a9b9-211507664805","Type":"ContainerStarted","Data":"f4ff5614730f4bee870729b9dbea193a82cb2fbf2b64a4650757a12a3469fc3b"}
Mar 20 15:59:14 crc kubenswrapper[4730]: I0320 15:59:14.797970    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:59:14 crc kubenswrapper[4730]: I0320 15:59:14.800151    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3","Type":"ContainerStarted","Data":"3a783d296547ab247634b62ed131b57fa9392453e5aadc95036d56c15ea1686f"}
Mar 20 15:59:14 crc kubenswrapper[4730]: I0320 15:59:14.800690    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 20 15:59:14 crc kubenswrapper[4730]: I0320 15:59:14.802418    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/notifications-rabbitmq-server-0" event={"ID":"df9ca02d-e20f-4f55-ba14-92b91812afb6","Type":"ContainerStarted","Data":"ee087c818bbb63c9bd400095c1dd7f6ec4709f200b2191330d19a5435b2266e6"}
Mar 20 15:59:14 crc kubenswrapper[4730]: I0320 15:59:14.802725    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:59:14 crc kubenswrapper[4730]: I0320 15:59:14.829926    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-7d8lv" podStartSLOduration=4.132109491 podStartE2EDuration="14.829906486s" podCreationTimestamp="2026-03-20 15:59:00 +0000 UTC" firstStartedPulling="2026-03-20 15:59:02.813914899 +0000 UTC m=+1202.027286268" lastFinishedPulling="2026-03-20 15:59:13.511711824 +0000 UTC m=+1212.725083263" observedRunningTime="2026-03-20 15:59:14.82336375 +0000 UTC m=+1214.036735119" watchObservedRunningTime="2026-03-20 15:59:14.829906486 +0000 UTC m=+1214.043277855"
Mar 20 15:59:14 crc kubenswrapper[4730]: I0320 15:59:14.850019    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-mkvv4" podStartSLOduration=2.682739028 podStartE2EDuration="16.850002247s" podCreationTimestamp="2026-03-20 15:58:58 +0000 UTC" firstStartedPulling="2026-03-20 15:58:59.425012204 +0000 UTC m=+1198.638383583" lastFinishedPulling="2026-03-20 15:59:13.592275433 +0000 UTC m=+1212.805646802" observedRunningTime="2026-03-20 15:59:14.843541683 +0000 UTC m=+1214.056913052" watchObservedRunningTime="2026-03-20 15:59:14.850002247 +0000 UTC m=+1214.063373606"
Mar 20 15:59:14 crc kubenswrapper[4730]: I0320 15:59:14.870263    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/notifications-rabbitmq-server-0" podStartSLOduration=53.485550587 podStartE2EDuration="1m6.870220991s" podCreationTimestamp="2026-03-20 15:58:08 +0000 UTC" firstStartedPulling="2026-03-20 15:58:22.761890696 +0000 UTC m=+1161.975262065" lastFinishedPulling="2026-03-20 15:58:36.14656111 +0000 UTC m=+1175.359932469" observedRunningTime="2026-03-20 15:59:14.866436514 +0000 UTC m=+1214.079807923" watchObservedRunningTime="2026-03-20 15:59:14.870220991 +0000 UTC m=+1214.083592360"
Mar 20 15:59:14 crc kubenswrapper[4730]: I0320 15:59:14.900173    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=52.950118835 podStartE2EDuration="1m6.900154332s" podCreationTimestamp="2026-03-20 15:58:08 +0000 UTC" firstStartedPulling="2026-03-20 15:58:22.21965882 +0000 UTC m=+1161.433030189" lastFinishedPulling="2026-03-20 15:58:36.169694317 +0000 UTC m=+1175.383065686" observedRunningTime="2026-03-20 15:59:14.891577388 +0000 UTC m=+1214.104948757" watchObservedRunningTime="2026-03-20 15:59:14.900154332 +0000 UTC m=+1214.113525701"
Mar 20 15:59:14 crc kubenswrapper[4730]: I0320 15:59:14.911404    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-648686d659-c5gtt"]
Mar 20 15:59:14 crc kubenswrapper[4730]: I0320 15:59:14.917916    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-648686d659-c5gtt"]
Mar 20 15:59:14 crc kubenswrapper[4730]: I0320 15:59:14.931846    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=55.033145057 podStartE2EDuration="1m7.931827052s" podCreationTimestamp="2026-03-20 15:58:07 +0000 UTC" firstStartedPulling="2026-03-20 15:58:22.7831313 +0000 UTC m=+1161.996502669" lastFinishedPulling="2026-03-20 15:58:35.681813295 +0000 UTC m=+1174.895184664" observedRunningTime="2026-03-20 15:59:14.924132093 +0000 UTC m=+1214.137503482" watchObservedRunningTime="2026-03-20 15:59:14.931827052 +0000 UTC m=+1214.145198421"
Mar 20 15:59:15 crc kubenswrapper[4730]: I0320 15:59:15.547965    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68e4e5c3-825d-477e-a403-0cae45a86806" path="/var/lib/kubelet/pods/68e4e5c3-825d-477e-a403-0cae45a86806/volumes"
Mar 20 15:59:16 crc kubenswrapper[4730]: I0320 15:59:16.238978    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-d92d8"]
Mar 20 15:59:16 crc kubenswrapper[4730]: E0320 15:59:16.239341    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68e4e5c3-825d-477e-a403-0cae45a86806" containerName="init"
Mar 20 15:59:16 crc kubenswrapper[4730]: I0320 15:59:16.239353    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="68e4e5c3-825d-477e-a403-0cae45a86806" containerName="init"
Mar 20 15:59:16 crc kubenswrapper[4730]: E0320 15:59:16.239364    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a37760f5-4ee5-4b95-9364-e81582b732c7" containerName="mariadb-account-create-update"
Mar 20 15:59:16 crc kubenswrapper[4730]: I0320 15:59:16.239369    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="a37760f5-4ee5-4b95-9364-e81582b732c7" containerName="mariadb-account-create-update"
Mar 20 15:59:16 crc kubenswrapper[4730]: E0320 15:59:16.239381    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68e4e5c3-825d-477e-a403-0cae45a86806" containerName="dnsmasq-dns"
Mar 20 15:59:16 crc kubenswrapper[4730]: I0320 15:59:16.239387    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="68e4e5c3-825d-477e-a403-0cae45a86806" containerName="dnsmasq-dns"
Mar 20 15:59:16 crc kubenswrapper[4730]: I0320 15:59:16.239532    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="68e4e5c3-825d-477e-a403-0cae45a86806" containerName="dnsmasq-dns"
Mar 20 15:59:16 crc kubenswrapper[4730]: I0320 15:59:16.239541    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="a37760f5-4ee5-4b95-9364-e81582b732c7" containerName="mariadb-account-create-update"
Mar 20 15:59:16 crc kubenswrapper[4730]: I0320 15:59:16.240072    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-d92d8"
Mar 20 15:59:16 crc kubenswrapper[4730]: I0320 15:59:16.247018    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Mar 20 15:59:16 crc kubenswrapper[4730]: I0320 15:59:16.254761    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-d92d8"]
Mar 20 15:59:16 crc kubenswrapper[4730]: I0320 15:59:16.354351    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb89m\" (UniqueName: \"kubernetes.io/projected/ed167127-4e44-4877-bf9b-dbb6a23a8b3f-kube-api-access-pb89m\") pod \"root-account-create-update-d92d8\" (UID: \"ed167127-4e44-4877-bf9b-dbb6a23a8b3f\") " pod="openstack/root-account-create-update-d92d8"
Mar 20 15:59:16 crc kubenswrapper[4730]: I0320 15:59:16.354817    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed167127-4e44-4877-bf9b-dbb6a23a8b3f-operator-scripts\") pod \"root-account-create-update-d92d8\" (UID: \"ed167127-4e44-4877-bf9b-dbb6a23a8b3f\") " pod="openstack/root-account-create-update-d92d8"
Mar 20 15:59:16 crc kubenswrapper[4730]: I0320 15:59:16.456183    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed167127-4e44-4877-bf9b-dbb6a23a8b3f-operator-scripts\") pod \"root-account-create-update-d92d8\" (UID: \"ed167127-4e44-4877-bf9b-dbb6a23a8b3f\") " pod="openstack/root-account-create-update-d92d8"
Mar 20 15:59:16 crc kubenswrapper[4730]: I0320 15:59:16.456522    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb89m\" (UniqueName: \"kubernetes.io/projected/ed167127-4e44-4877-bf9b-dbb6a23a8b3f-kube-api-access-pb89m\") pod \"root-account-create-update-d92d8\" (UID: \"ed167127-4e44-4877-bf9b-dbb6a23a8b3f\") " pod="openstack/root-account-create-update-d92d8"
Mar 20 15:59:16 crc kubenswrapper[4730]: I0320 15:59:16.456879    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed167127-4e44-4877-bf9b-dbb6a23a8b3f-operator-scripts\") pod \"root-account-create-update-d92d8\" (UID: \"ed167127-4e44-4877-bf9b-dbb6a23a8b3f\") " pod="openstack/root-account-create-update-d92d8"
Mar 20 15:59:16 crc kubenswrapper[4730]: I0320 15:59:16.483406    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb89m\" (UniqueName: \"kubernetes.io/projected/ed167127-4e44-4877-bf9b-dbb6a23a8b3f-kube-api-access-pb89m\") pod \"root-account-create-update-d92d8\" (UID: \"ed167127-4e44-4877-bf9b-dbb6a23a8b3f\") " pod="openstack/root-account-create-update-d92d8"
Mar 20 15:59:16 crc kubenswrapper[4730]: I0320 15:59:16.566287    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-d92d8"
Mar 20 15:59:16 crc kubenswrapper[4730]: I0320 15:59:16.724615    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:16 crc kubenswrapper[4730]: I0320 15:59:16.755613    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:16 crc kubenswrapper[4730]: I0320 15:59:16.827518    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:17 crc kubenswrapper[4730]: I0320 15:59:17.015197    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-d92d8"]
Mar 20 15:59:17 crc kubenswrapper[4730]: I0320 15:59:17.835277    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-d92d8" event={"ID":"ed167127-4e44-4877-bf9b-dbb6a23a8b3f","Type":"ContainerStarted","Data":"6f4ac67e084527a1cb38bd3c525c24e61f9c884d533f00e2d72554c431fbb247"}
Mar 20 15:59:17 crc kubenswrapper[4730]: I0320 15:59:17.835564    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-d92d8" event={"ID":"ed167127-4e44-4877-bf9b-dbb6a23a8b3f","Type":"ContainerStarted","Data":"bdc93b2eae763d308256566d1dc3d671bf388b3fb090367832183f2f953d3839"}
Mar 20 15:59:17 crc kubenswrapper[4730]: I0320 15:59:17.855010    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-d92d8" podStartSLOduration=1.854992335 podStartE2EDuration="1.854992335s" podCreationTimestamp="2026-03-20 15:59:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:59:17.847223534 +0000 UTC m=+1217.060594923" watchObservedRunningTime="2026-03-20 15:59:17.854992335 +0000 UTC m=+1217.068363714"
Mar 20 15:59:18 crc kubenswrapper[4730]: I0320 15:59:18.843334    4730 generic.go:334] "Generic (PLEG): container finished" podID="ed167127-4e44-4877-bf9b-dbb6a23a8b3f" containerID="6f4ac67e084527a1cb38bd3c525c24e61f9c884d533f00e2d72554c431fbb247" exitCode=0
Mar 20 15:59:18 crc kubenswrapper[4730]: I0320 15:59:18.843515    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-d92d8" event={"ID":"ed167127-4e44-4877-bf9b-dbb6a23a8b3f","Type":"ContainerDied","Data":"6f4ac67e084527a1cb38bd3c525c24e61f9c884d533f00e2d72554c431fbb247"}
Mar 20 15:59:19 crc kubenswrapper[4730]: I0320 15:59:19.141802    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-gtrnp" podUID="31651551-edb9-4793-a752-39fa60a85ee3" containerName="ovn-controller" probeResult="failure" output=<
Mar 20 15:59:19 crc kubenswrapper[4730]:         ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 20 15:59:19 crc kubenswrapper[4730]:  >
Mar 20 15:59:19 crc kubenswrapper[4730]: I0320 15:59:19.367339    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 20 15:59:19 crc kubenswrapper[4730]: I0320 15:59:19.367616    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="fdd3845a-3723-438f-aa58-606451baed6c" containerName="prometheus" containerID="cri-o://ccc4d976b4160ab2263002a763830d5f5f68919c64d310c4f41b79be9631a6ea" gracePeriod=600
Mar 20 15:59:19 crc kubenswrapper[4730]: I0320 15:59:19.367679    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="fdd3845a-3723-438f-aa58-606451baed6c" containerName="thanos-sidecar" containerID="cri-o://83d469de2778c173f20bbd31bfd4fc16492ac416000fbb48eb639e2a00a91feb" gracePeriod=600
Mar 20 15:59:19 crc kubenswrapper[4730]: I0320 15:59:19.367771    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="fdd3845a-3723-438f-aa58-606451baed6c" containerName="config-reloader" containerID="cri-o://2e07902c18ce4590f197ddee6088f8273a8da0ef0fa191f28dc4116786b4c25f" gracePeriod=600
Mar 20 15:59:19 crc kubenswrapper[4730]: I0320 15:59:19.869052    4730 generic.go:334] "Generic (PLEG): container finished" podID="fdd3845a-3723-438f-aa58-606451baed6c" containerID="83d469de2778c173f20bbd31bfd4fc16492ac416000fbb48eb639e2a00a91feb" exitCode=0
Mar 20 15:59:19 crc kubenswrapper[4730]: I0320 15:59:19.869084    4730 generic.go:334] "Generic (PLEG): container finished" podID="fdd3845a-3723-438f-aa58-606451baed6c" containerID="2e07902c18ce4590f197ddee6088f8273a8da0ef0fa191f28dc4116786b4c25f" exitCode=0
Mar 20 15:59:19 crc kubenswrapper[4730]: I0320 15:59:19.869096    4730 generic.go:334] "Generic (PLEG): container finished" podID="fdd3845a-3723-438f-aa58-606451baed6c" containerID="ccc4d976b4160ab2263002a763830d5f5f68919c64d310c4f41b79be9631a6ea" exitCode=0
Mar 20 15:59:19 crc kubenswrapper[4730]: I0320 15:59:19.869179    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fdd3845a-3723-438f-aa58-606451baed6c","Type":"ContainerDied","Data":"83d469de2778c173f20bbd31bfd4fc16492ac416000fbb48eb639e2a00a91feb"}
Mar 20 15:59:19 crc kubenswrapper[4730]: I0320 15:59:19.869213    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fdd3845a-3723-438f-aa58-606451baed6c","Type":"ContainerDied","Data":"2e07902c18ce4590f197ddee6088f8273a8da0ef0fa191f28dc4116786b4c25f"}
Mar 20 15:59:19 crc kubenswrapper[4730]: I0320 15:59:19.869230    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fdd3845a-3723-438f-aa58-606451baed6c","Type":"ContainerDied","Data":"ccc4d976b4160ab2263002a763830d5f5f68919c64d310c4f41b79be9631a6ea"}
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.064548    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.123611    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fdd3845a-3723-438f-aa58-606451baed6c-web-config\") pod \"fdd3845a-3723-438f-aa58-606451baed6c\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") "
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.123675    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fdd3845a-3723-438f-aa58-606451baed6c-config\") pod \"fdd3845a-3723-438f-aa58-606451baed6c\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") "
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.123707    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fdd3845a-3723-438f-aa58-606451baed6c-thanos-prometheus-http-client-file\") pod \"fdd3845a-3723-438f-aa58-606451baed6c\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") "
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.123754    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/fdd3845a-3723-438f-aa58-606451baed6c-prometheus-metric-storage-rulefiles-1\") pod \"fdd3845a-3723-438f-aa58-606451baed6c\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") "
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.123823    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fdd3845a-3723-438f-aa58-606451baed6c-tls-assets\") pod \"fdd3845a-3723-438f-aa58-606451baed6c\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") "
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.123908    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxrks\" (UniqueName: \"kubernetes.io/projected/fdd3845a-3723-438f-aa58-606451baed6c-kube-api-access-pxrks\") pod \"fdd3845a-3723-438f-aa58-606451baed6c\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") "
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.124049    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\") pod \"fdd3845a-3723-438f-aa58-606451baed6c\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") "
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.125085    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/fdd3845a-3723-438f-aa58-606451baed6c-prometheus-metric-storage-rulefiles-2\") pod \"fdd3845a-3723-438f-aa58-606451baed6c\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") "
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.125185    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fdd3845a-3723-438f-aa58-606451baed6c-config-out\") pod \"fdd3845a-3723-438f-aa58-606451baed6c\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") "
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.125263    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fdd3845a-3723-438f-aa58-606451baed6c-prometheus-metric-storage-rulefiles-0\") pod \"fdd3845a-3723-438f-aa58-606451baed6c\" (UID: \"fdd3845a-3723-438f-aa58-606451baed6c\") "
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.133378    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdd3845a-3723-438f-aa58-606451baed6c-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "fdd3845a-3723-438f-aa58-606451baed6c" (UID: "fdd3845a-3723-438f-aa58-606451baed6c"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.133754    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdd3845a-3723-438f-aa58-606451baed6c-kube-api-access-pxrks" (OuterVolumeSpecName: "kube-api-access-pxrks") pod "fdd3845a-3723-438f-aa58-606451baed6c" (UID: "fdd3845a-3723-438f-aa58-606451baed6c"). InnerVolumeSpecName "kube-api-access-pxrks". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.135862    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdd3845a-3723-438f-aa58-606451baed6c-config" (OuterVolumeSpecName: "config") pod "fdd3845a-3723-438f-aa58-606451baed6c" (UID: "fdd3845a-3723-438f-aa58-606451baed6c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.135959    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdd3845a-3723-438f-aa58-606451baed6c-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "fdd3845a-3723-438f-aa58-606451baed6c" (UID: "fdd3845a-3723-438f-aa58-606451baed6c"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.136065    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdd3845a-3723-438f-aa58-606451baed6c-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "fdd3845a-3723-438f-aa58-606451baed6c" (UID: "fdd3845a-3723-438f-aa58-606451baed6c"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.136467    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdd3845a-3723-438f-aa58-606451baed6c-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "fdd3845a-3723-438f-aa58-606451baed6c" (UID: "fdd3845a-3723-438f-aa58-606451baed6c"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.137577    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdd3845a-3723-438f-aa58-606451baed6c-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "fdd3845a-3723-438f-aa58-606451baed6c" (UID: "fdd3845a-3723-438f-aa58-606451baed6c"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.140800    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdd3845a-3723-438f-aa58-606451baed6c-config-out" (OuterVolumeSpecName: "config-out") pod "fdd3845a-3723-438f-aa58-606451baed6c" (UID: "fdd3845a-3723-438f-aa58-606451baed6c"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.159328    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdd3845a-3723-438f-aa58-606451baed6c-web-config" (OuterVolumeSpecName: "web-config") pod "fdd3845a-3723-438f-aa58-606451baed6c" (UID: "fdd3845a-3723-438f-aa58-606451baed6c"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.160078    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "fdd3845a-3723-438f-aa58-606451baed6c" (UID: "fdd3845a-3723-438f-aa58-606451baed6c"). InnerVolumeSpecName "pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.161004    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-d92d8"
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.235056    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed167127-4e44-4877-bf9b-dbb6a23a8b3f-operator-scripts\") pod \"ed167127-4e44-4877-bf9b-dbb6a23a8b3f\" (UID: \"ed167127-4e44-4877-bf9b-dbb6a23a8b3f\") "
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.235460    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb89m\" (UniqueName: \"kubernetes.io/projected/ed167127-4e44-4877-bf9b-dbb6a23a8b3f-kube-api-access-pb89m\") pod \"ed167127-4e44-4877-bf9b-dbb6a23a8b3f\" (UID: \"ed167127-4e44-4877-bf9b-dbb6a23a8b3f\") "
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.235803    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed167127-4e44-4877-bf9b-dbb6a23a8b3f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ed167127-4e44-4877-bf9b-dbb6a23a8b3f" (UID: "ed167127-4e44-4877-bf9b-dbb6a23a8b3f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.235932    4730 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fdd3845a-3723-438f-aa58-606451baed6c-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.236006    4730 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fdd3845a-3723-438f-aa58-606451baed6c-web-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.236065    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/fdd3845a-3723-438f-aa58-606451baed6c-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.236132    4730 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fdd3845a-3723-438f-aa58-606451baed6c-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.236195    4730 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/fdd3845a-3723-438f-aa58-606451baed6c-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.236267    4730 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fdd3845a-3723-438f-aa58-606451baed6c-tls-assets\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.236327    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxrks\" (UniqueName: \"kubernetes.io/projected/fdd3845a-3723-438f-aa58-606451baed6c-kube-api-access-pxrks\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.236433    4730 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\") on node \"crc\" "
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.236501    4730 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/fdd3845a-3723-438f-aa58-606451baed6c-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.236569    4730 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fdd3845a-3723-438f-aa58-606451baed6c-config-out\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.239833    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed167127-4e44-4877-bf9b-dbb6a23a8b3f-kube-api-access-pb89m" (OuterVolumeSpecName: "kube-api-access-pb89m") pod "ed167127-4e44-4877-bf9b-dbb6a23a8b3f" (UID: "ed167127-4e44-4877-bf9b-dbb6a23a8b3f"). InnerVolumeSpecName "kube-api-access-pb89m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.262930    4730 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.263091    4730 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d") on node "crc"
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.338226    4730 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed167127-4e44-4877-bf9b-dbb6a23a8b3f-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.338564    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pb89m\" (UniqueName: \"kubernetes.io/projected/ed167127-4e44-4877-bf9b-dbb6a23a8b3f-kube-api-access-pb89m\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.338577    4730 reconciler_common.go:293] "Volume detached for volume \"pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.894077    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fdd3845a-3723-438f-aa58-606451baed6c","Type":"ContainerDied","Data":"1e180be81f67edf4314b8f53e9215c19609d113f31f1dc11bb8e1b622d9ae961"}
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.894172    4730 scope.go:117] "RemoveContainer" containerID="83d469de2778c173f20bbd31bfd4fc16492ac416000fbb48eb639e2a00a91feb"
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.894812    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.895964    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-d92d8" event={"ID":"ed167127-4e44-4877-bf9b-dbb6a23a8b3f","Type":"ContainerDied","Data":"bdc93b2eae763d308256566d1dc3d671bf388b3fb090367832183f2f953d3839"}
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.896006    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdc93b2eae763d308256566d1dc3d671bf388b3fb090367832183f2f953d3839"
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.896015    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-d92d8"
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.929779    4730 scope.go:117] "RemoveContainer" containerID="2e07902c18ce4590f197ddee6088f8273a8da0ef0fa191f28dc4116786b4c25f"
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.946537    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.952069    4730 scope.go:117] "RemoveContainer" containerID="ccc4d976b4160ab2263002a763830d5f5f68919c64d310c4f41b79be9631a6ea"
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.955981    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.973180    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 20 15:59:20 crc kubenswrapper[4730]: E0320 15:59:20.973525    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdd3845a-3723-438f-aa58-606451baed6c" containerName="prometheus"
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.973543    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd3845a-3723-438f-aa58-606451baed6c" containerName="prometheus"
Mar 20 15:59:20 crc kubenswrapper[4730]: E0320 15:59:20.973559    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdd3845a-3723-438f-aa58-606451baed6c" containerName="init-config-reloader"
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.973565    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd3845a-3723-438f-aa58-606451baed6c" containerName="init-config-reloader"
Mar 20 15:59:20 crc kubenswrapper[4730]: E0320 15:59:20.973577    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdd3845a-3723-438f-aa58-606451baed6c" containerName="thanos-sidecar"
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.973583    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd3845a-3723-438f-aa58-606451baed6c" containerName="thanos-sidecar"
Mar 20 15:59:20 crc kubenswrapper[4730]: E0320 15:59:20.973596    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdd3845a-3723-438f-aa58-606451baed6c" containerName="config-reloader"
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.973602    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd3845a-3723-438f-aa58-606451baed6c" containerName="config-reloader"
Mar 20 15:59:20 crc kubenswrapper[4730]: E0320 15:59:20.973612    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed167127-4e44-4877-bf9b-dbb6a23a8b3f" containerName="mariadb-account-create-update"
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.973617    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed167127-4e44-4877-bf9b-dbb6a23a8b3f" containerName="mariadb-account-create-update"
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.973787    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed167127-4e44-4877-bf9b-dbb6a23a8b3f" containerName="mariadb-account-create-update"
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.973802    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdd3845a-3723-438f-aa58-606451baed6c" containerName="config-reloader"
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.973811    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdd3845a-3723-438f-aa58-606451baed6c" containerName="thanos-sidecar"
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.973850    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdd3845a-3723-438f-aa58-606451baed6c" containerName="prometheus"
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.975268    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.978513    4730 scope.go:117] "RemoveContainer" containerID="e4820a88fffd97776afd3e8f20ce1473d0c4e99acb7cc9e56c9e53eaef07563a"
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.978715    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc"
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.978805    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.980159    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.980364    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2"
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.980529    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.980695    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.980887    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1"
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.980941    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-2q5k6"
Mar 20 15:59:20 crc kubenswrapper[4730]: I0320 15:59:20.990058    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.006202    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.049561    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/b9474555-d03c-4f34-8914-15b7654ec76e-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.049621    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-config\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.049669    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.049745    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.049778    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b9474555-d03c-4f34-8914-15b7654ec76e-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.049811    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.049860    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/b9474555-d03c-4f34-8914-15b7654ec76e-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.049886    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.049928    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.049987    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55vpw\" (UniqueName: \"kubernetes.io/projected/b9474555-d03c-4f34-8914-15b7654ec76e-kube-api-access-55vpw\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.050019    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.050046    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b9474555-d03c-4f34-8914-15b7654ec76e-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.050073    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b9474555-d03c-4f34-8914-15b7654ec76e-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.151516    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/b9474555-d03c-4f34-8914-15b7654ec76e-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.152218    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/b9474555-d03c-4f34-8914-15b7654ec76e-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.152332    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.152941    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.152994    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55vpw\" (UniqueName: \"kubernetes.io/projected/b9474555-d03c-4f34-8914-15b7654ec76e-kube-api-access-55vpw\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.153022    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.153043    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b9474555-d03c-4f34-8914-15b7654ec76e-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.153062    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b9474555-d03c-4f34-8914-15b7654ec76e-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.153085    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/b9474555-d03c-4f34-8914-15b7654ec76e-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.153113    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-config\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.153148    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.153188    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.153207    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b9474555-d03c-4f34-8914-15b7654ec76e-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.153281    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.153691    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b9474555-d03c-4f34-8914-15b7654ec76e-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.156908    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.157031    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/b9474555-d03c-4f34-8914-15b7654ec76e-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.161279    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.162121    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.162237    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-config\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.162350    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b9474555-d03c-4f34-8914-15b7654ec76e-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.162545    4730 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.162571    4730 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6c50c5c57c27fdb24da1fcbf3a7504c7bda45f4dc15a5678e0deb708aa433733/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.162777    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.172404    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55vpw\" (UniqueName: \"kubernetes.io/projected/b9474555-d03c-4f34-8914-15b7654ec76e-kube-api-access-55vpw\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.172862    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.173325    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b9474555-d03c-4f34-8914-15b7654ec76e-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.194183    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\") pod \"prometheus-metric-storage-0\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.349989    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:21 crc kubenswrapper[4730]: I0320 15:59:21.543180    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdd3845a-3723-438f-aa58-606451baed6c" path="/var/lib/kubelet/pods/fdd3845a-3723-438f-aa58-606451baed6c/volumes"
Mar 20 15:59:22 crc kubenswrapper[4730]: I0320 15:59:21.811959    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 20 15:59:22 crc kubenswrapper[4730]: I0320 15:59:21.910771    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b9474555-d03c-4f34-8914-15b7654ec76e","Type":"ContainerStarted","Data":"8f7081ac79f5f8ab5d11083740ef5a60bf4e5c0ec09313ba816c9290b7a2077b"}
Mar 20 15:59:22 crc kubenswrapper[4730]: I0320 15:59:21.915656    4730 generic.go:334] "Generic (PLEG): container finished" podID="167282ce-29fc-44db-9b0b-baf2c956f433" containerID="768be9518c03c37026c048245f752f8e9492e3f207a6cde3432392aa9859edc5" exitCode=0
Mar 20 15:59:22 crc kubenswrapper[4730]: I0320 15:59:21.915694    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-7d8lv" event={"ID":"167282ce-29fc-44db-9b0b-baf2c956f433","Type":"ContainerDied","Data":"768be9518c03c37026c048245f752f8e9492e3f207a6cde3432392aa9859edc5"}
Mar 20 15:59:22 crc kubenswrapper[4730]: I0320 15:59:22.923988    4730 generic.go:334] "Generic (PLEG): container finished" podID="37dd8777-c196-4db2-af7a-5560a939e02c" containerID="45a09c4f4bffe31b4f9cf83737f4a3331b9ba65b3e4bbf1a00d15070f2dd1fbb" exitCode=0
Mar 20 15:59:22 crc kubenswrapper[4730]: I0320 15:59:22.924118    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mkvv4" event={"ID":"37dd8777-c196-4db2-af7a-5560a939e02c","Type":"ContainerDied","Data":"45a09c4f4bffe31b4f9cf83737f4a3331b9ba65b3e4bbf1a00d15070f2dd1fbb"}
Mar 20 15:59:23 crc kubenswrapper[4730]: I0320 15:59:23.271995    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-7d8lv"
Mar 20 15:59:23 crc kubenswrapper[4730]: I0320 15:59:23.388530    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/167282ce-29fc-44db-9b0b-baf2c956f433-ring-data-devices\") pod \"167282ce-29fc-44db-9b0b-baf2c956f433\" (UID: \"167282ce-29fc-44db-9b0b-baf2c956f433\") "
Mar 20 15:59:23 crc kubenswrapper[4730]: I0320 15:59:23.388628    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/167282ce-29fc-44db-9b0b-baf2c956f433-scripts\") pod \"167282ce-29fc-44db-9b0b-baf2c956f433\" (UID: \"167282ce-29fc-44db-9b0b-baf2c956f433\") "
Mar 20 15:59:23 crc kubenswrapper[4730]: I0320 15:59:23.388659    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/167282ce-29fc-44db-9b0b-baf2c956f433-dispersionconf\") pod \"167282ce-29fc-44db-9b0b-baf2c956f433\" (UID: \"167282ce-29fc-44db-9b0b-baf2c956f433\") "
Mar 20 15:59:23 crc kubenswrapper[4730]: I0320 15:59:23.388723    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzrnr\" (UniqueName: \"kubernetes.io/projected/167282ce-29fc-44db-9b0b-baf2c956f433-kube-api-access-dzrnr\") pod \"167282ce-29fc-44db-9b0b-baf2c956f433\" (UID: \"167282ce-29fc-44db-9b0b-baf2c956f433\") "
Mar 20 15:59:23 crc kubenswrapper[4730]: I0320 15:59:23.388750    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/167282ce-29fc-44db-9b0b-baf2c956f433-etc-swift\") pod \"167282ce-29fc-44db-9b0b-baf2c956f433\" (UID: \"167282ce-29fc-44db-9b0b-baf2c956f433\") "
Mar 20 15:59:23 crc kubenswrapper[4730]: I0320 15:59:23.388776    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/167282ce-29fc-44db-9b0b-baf2c956f433-combined-ca-bundle\") pod \"167282ce-29fc-44db-9b0b-baf2c956f433\" (UID: \"167282ce-29fc-44db-9b0b-baf2c956f433\") "
Mar 20 15:59:23 crc kubenswrapper[4730]: I0320 15:59:23.388837    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/167282ce-29fc-44db-9b0b-baf2c956f433-swiftconf\") pod \"167282ce-29fc-44db-9b0b-baf2c956f433\" (UID: \"167282ce-29fc-44db-9b0b-baf2c956f433\") "
Mar 20 15:59:23 crc kubenswrapper[4730]: I0320 15:59:23.389736    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/167282ce-29fc-44db-9b0b-baf2c956f433-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "167282ce-29fc-44db-9b0b-baf2c956f433" (UID: "167282ce-29fc-44db-9b0b-baf2c956f433"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:59:23 crc kubenswrapper[4730]: I0320 15:59:23.389844    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/167282ce-29fc-44db-9b0b-baf2c956f433-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "167282ce-29fc-44db-9b0b-baf2c956f433" (UID: "167282ce-29fc-44db-9b0b-baf2c956f433"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:59:23 crc kubenswrapper[4730]: I0320 15:59:23.396011    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/167282ce-29fc-44db-9b0b-baf2c956f433-kube-api-access-dzrnr" (OuterVolumeSpecName: "kube-api-access-dzrnr") pod "167282ce-29fc-44db-9b0b-baf2c956f433" (UID: "167282ce-29fc-44db-9b0b-baf2c956f433"). InnerVolumeSpecName "kube-api-access-dzrnr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:59:23 crc kubenswrapper[4730]: I0320 15:59:23.401184    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/167282ce-29fc-44db-9b0b-baf2c956f433-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "167282ce-29fc-44db-9b0b-baf2c956f433" (UID: "167282ce-29fc-44db-9b0b-baf2c956f433"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:59:23 crc kubenswrapper[4730]: I0320 15:59:23.490941    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzrnr\" (UniqueName: \"kubernetes.io/projected/167282ce-29fc-44db-9b0b-baf2c956f433-kube-api-access-dzrnr\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:23 crc kubenswrapper[4730]: I0320 15:59:23.490977    4730 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/167282ce-29fc-44db-9b0b-baf2c956f433-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:23 crc kubenswrapper[4730]: I0320 15:59:23.490986    4730 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/167282ce-29fc-44db-9b0b-baf2c956f433-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:23 crc kubenswrapper[4730]: I0320 15:59:23.490995    4730 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/167282ce-29fc-44db-9b0b-baf2c956f433-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:23 crc kubenswrapper[4730]: I0320 15:59:23.517163    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/167282ce-29fc-44db-9b0b-baf2c956f433-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "167282ce-29fc-44db-9b0b-baf2c956f433" (UID: "167282ce-29fc-44db-9b0b-baf2c956f433"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:59:23 crc kubenswrapper[4730]: I0320 15:59:23.518352    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/167282ce-29fc-44db-9b0b-baf2c956f433-scripts" (OuterVolumeSpecName: "scripts") pod "167282ce-29fc-44db-9b0b-baf2c956f433" (UID: "167282ce-29fc-44db-9b0b-baf2c956f433"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:59:23 crc kubenswrapper[4730]: I0320 15:59:23.525769    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/167282ce-29fc-44db-9b0b-baf2c956f433-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "167282ce-29fc-44db-9b0b-baf2c956f433" (UID: "167282ce-29fc-44db-9b0b-baf2c956f433"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:59:23 crc kubenswrapper[4730]: I0320 15:59:23.593618    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/167282ce-29fc-44db-9b0b-baf2c956f433-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:23 crc kubenswrapper[4730]: I0320 15:59:23.593662    4730 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/167282ce-29fc-44db-9b0b-baf2c956f433-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:23 crc kubenswrapper[4730]: I0320 15:59:23.593671    4730 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/167282ce-29fc-44db-9b0b-baf2c956f433-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:23 crc kubenswrapper[4730]: I0320 15:59:23.933892    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-7d8lv" event={"ID":"167282ce-29fc-44db-9b0b-baf2c956f433","Type":"ContainerDied","Data":"c2bed297b15aeb66e5504095b19b123685071c689e915cfbc2281dd9c7ff81a2"}
Mar 20 15:59:23 crc kubenswrapper[4730]: I0320 15:59:23.933967    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2bed297b15aeb66e5504095b19b123685071c689e915cfbc2281dd9c7ff81a2"
Mar 20 15:59:23 crc kubenswrapper[4730]: I0320 15:59:23.933906    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-7d8lv"
Mar 20 15:59:24 crc kubenswrapper[4730]: I0320 15:59:24.341579    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-gtrnp" podUID="31651551-edb9-4793-a752-39fa60a85ee3" containerName="ovn-controller" probeResult="failure" output=<
Mar 20 15:59:24 crc kubenswrapper[4730]:         ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 20 15:59:24 crc kubenswrapper[4730]:  >
Mar 20 15:59:24 crc kubenswrapper[4730]: I0320 15:59:24.720755    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-mkvv4"
Mar 20 15:59:24 crc kubenswrapper[4730]: I0320 15:59:24.810508    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37dd8777-c196-4db2-af7a-5560a939e02c-config-data\") pod \"37dd8777-c196-4db2-af7a-5560a939e02c\" (UID: \"37dd8777-c196-4db2-af7a-5560a939e02c\") "
Mar 20 15:59:24 crc kubenswrapper[4730]: I0320 15:59:24.810580    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/37dd8777-c196-4db2-af7a-5560a939e02c-db-sync-config-data\") pod \"37dd8777-c196-4db2-af7a-5560a939e02c\" (UID: \"37dd8777-c196-4db2-af7a-5560a939e02c\") "
Mar 20 15:59:24 crc kubenswrapper[4730]: I0320 15:59:24.810643    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37dd8777-c196-4db2-af7a-5560a939e02c-combined-ca-bundle\") pod \"37dd8777-c196-4db2-af7a-5560a939e02c\" (UID: \"37dd8777-c196-4db2-af7a-5560a939e02c\") "
Mar 20 15:59:24 crc kubenswrapper[4730]: I0320 15:59:24.810800    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwpz2\" (UniqueName: \"kubernetes.io/projected/37dd8777-c196-4db2-af7a-5560a939e02c-kube-api-access-wwpz2\") pod \"37dd8777-c196-4db2-af7a-5560a939e02c\" (UID: \"37dd8777-c196-4db2-af7a-5560a939e02c\") "
Mar 20 15:59:24 crc kubenswrapper[4730]: I0320 15:59:24.817617    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37dd8777-c196-4db2-af7a-5560a939e02c-kube-api-access-wwpz2" (OuterVolumeSpecName: "kube-api-access-wwpz2") pod "37dd8777-c196-4db2-af7a-5560a939e02c" (UID: "37dd8777-c196-4db2-af7a-5560a939e02c"). InnerVolumeSpecName "kube-api-access-wwpz2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:59:24 crc kubenswrapper[4730]: I0320 15:59:24.827016    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37dd8777-c196-4db2-af7a-5560a939e02c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "37dd8777-c196-4db2-af7a-5560a939e02c" (UID: "37dd8777-c196-4db2-af7a-5560a939e02c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:59:24 crc kubenswrapper[4730]: I0320 15:59:24.840127    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37dd8777-c196-4db2-af7a-5560a939e02c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37dd8777-c196-4db2-af7a-5560a939e02c" (UID: "37dd8777-c196-4db2-af7a-5560a939e02c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:59:24 crc kubenswrapper[4730]: I0320 15:59:24.860820    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37dd8777-c196-4db2-af7a-5560a939e02c-config-data" (OuterVolumeSpecName: "config-data") pod "37dd8777-c196-4db2-af7a-5560a939e02c" (UID: "37dd8777-c196-4db2-af7a-5560a939e02c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:59:24 crc kubenswrapper[4730]: I0320 15:59:24.912644    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37dd8777-c196-4db2-af7a-5560a939e02c-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:24 crc kubenswrapper[4730]: I0320 15:59:24.912681    4730 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/37dd8777-c196-4db2-af7a-5560a939e02c-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:24 crc kubenswrapper[4730]: I0320 15:59:24.912694    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37dd8777-c196-4db2-af7a-5560a939e02c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:24 crc kubenswrapper[4730]: I0320 15:59:24.912708    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwpz2\" (UniqueName: \"kubernetes.io/projected/37dd8777-c196-4db2-af7a-5560a939e02c-kube-api-access-wwpz2\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:24 crc kubenswrapper[4730]: I0320 15:59:24.941715    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mkvv4" event={"ID":"37dd8777-c196-4db2-af7a-5560a939e02c","Type":"ContainerDied","Data":"83ca9233a49380ba616e8a33edba2e21a0e34d304c17191d6b978242a55feaa7"}
Mar 20 15:59:24 crc kubenswrapper[4730]: I0320 15:59:24.941752    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83ca9233a49380ba616e8a33edba2e21a0e34d304c17191d6b978242a55feaa7"
Mar 20 15:59:24 crc kubenswrapper[4730]: I0320 15:59:24.941804    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-mkvv4"
Mar 20 15:59:24 crc kubenswrapper[4730]: I0320 15:59:24.951579    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b9474555-d03c-4f34-8914-15b7654ec76e","Type":"ContainerStarted","Data":"c6f36cf8613ae0c9fc8870f685a85cb84e12a16bcd52f187f659c8895c86bf85"}
Mar 20 15:59:25 crc kubenswrapper[4730]: I0320 15:59:25.319397    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68f7d4448c-cvqk4"]
Mar 20 15:59:25 crc kubenswrapper[4730]: E0320 15:59:25.320001    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37dd8777-c196-4db2-af7a-5560a939e02c" containerName="glance-db-sync"
Mar 20 15:59:25 crc kubenswrapper[4730]: I0320 15:59:25.320023    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="37dd8777-c196-4db2-af7a-5560a939e02c" containerName="glance-db-sync"
Mar 20 15:59:25 crc kubenswrapper[4730]: E0320 15:59:25.320033    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="167282ce-29fc-44db-9b0b-baf2c956f433" containerName="swift-ring-rebalance"
Mar 20 15:59:25 crc kubenswrapper[4730]: I0320 15:59:25.320040    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="167282ce-29fc-44db-9b0b-baf2c956f433" containerName="swift-ring-rebalance"
Mar 20 15:59:25 crc kubenswrapper[4730]: I0320 15:59:25.320187    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="167282ce-29fc-44db-9b0b-baf2c956f433" containerName="swift-ring-rebalance"
Mar 20 15:59:25 crc kubenswrapper[4730]: I0320 15:59:25.320203    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="37dd8777-c196-4db2-af7a-5560a939e02c" containerName="glance-db-sync"
Mar 20 15:59:25 crc kubenswrapper[4730]: I0320 15:59:25.320979    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68f7d4448c-cvqk4"
Mar 20 15:59:25 crc kubenswrapper[4730]: I0320 15:59:25.343220    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68f7d4448c-cvqk4"]
Mar 20 15:59:25 crc kubenswrapper[4730]: I0320 15:59:25.424894    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ae0793d-af8a-4808-b632-9f8b22a4d0c0-config\") pod \"dnsmasq-dns-68f7d4448c-cvqk4\" (UID: \"3ae0793d-af8a-4808-b632-9f8b22a4d0c0\") " pod="openstack/dnsmasq-dns-68f7d4448c-cvqk4"
Mar 20 15:59:25 crc kubenswrapper[4730]: I0320 15:59:25.424973    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ae0793d-af8a-4808-b632-9f8b22a4d0c0-dns-svc\") pod \"dnsmasq-dns-68f7d4448c-cvqk4\" (UID: \"3ae0793d-af8a-4808-b632-9f8b22a4d0c0\") " pod="openstack/dnsmasq-dns-68f7d4448c-cvqk4"
Mar 20 15:59:25 crc kubenswrapper[4730]: I0320 15:59:25.425011    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hslv\" (UniqueName: \"kubernetes.io/projected/3ae0793d-af8a-4808-b632-9f8b22a4d0c0-kube-api-access-9hslv\") pod \"dnsmasq-dns-68f7d4448c-cvqk4\" (UID: \"3ae0793d-af8a-4808-b632-9f8b22a4d0c0\") " pod="openstack/dnsmasq-dns-68f7d4448c-cvqk4"
Mar 20 15:59:25 crc kubenswrapper[4730]: I0320 15:59:25.425035    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ae0793d-af8a-4808-b632-9f8b22a4d0c0-ovsdbserver-sb\") pod \"dnsmasq-dns-68f7d4448c-cvqk4\" (UID: \"3ae0793d-af8a-4808-b632-9f8b22a4d0c0\") " pod="openstack/dnsmasq-dns-68f7d4448c-cvqk4"
Mar 20 15:59:25 crc kubenswrapper[4730]: I0320 15:59:25.425146    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ae0793d-af8a-4808-b632-9f8b22a4d0c0-ovsdbserver-nb\") pod \"dnsmasq-dns-68f7d4448c-cvqk4\" (UID: \"3ae0793d-af8a-4808-b632-9f8b22a4d0c0\") " pod="openstack/dnsmasq-dns-68f7d4448c-cvqk4"
Mar 20 15:59:25 crc kubenswrapper[4730]: I0320 15:59:25.526951    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hslv\" (UniqueName: \"kubernetes.io/projected/3ae0793d-af8a-4808-b632-9f8b22a4d0c0-kube-api-access-9hslv\") pod \"dnsmasq-dns-68f7d4448c-cvqk4\" (UID: \"3ae0793d-af8a-4808-b632-9f8b22a4d0c0\") " pod="openstack/dnsmasq-dns-68f7d4448c-cvqk4"
Mar 20 15:59:25 crc kubenswrapper[4730]: I0320 15:59:25.526991    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ae0793d-af8a-4808-b632-9f8b22a4d0c0-ovsdbserver-sb\") pod \"dnsmasq-dns-68f7d4448c-cvqk4\" (UID: \"3ae0793d-af8a-4808-b632-9f8b22a4d0c0\") " pod="openstack/dnsmasq-dns-68f7d4448c-cvqk4"
Mar 20 15:59:25 crc kubenswrapper[4730]: I0320 15:59:25.527086    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ae0793d-af8a-4808-b632-9f8b22a4d0c0-ovsdbserver-nb\") pod \"dnsmasq-dns-68f7d4448c-cvqk4\" (UID: \"3ae0793d-af8a-4808-b632-9f8b22a4d0c0\") " pod="openstack/dnsmasq-dns-68f7d4448c-cvqk4"
Mar 20 15:59:25 crc kubenswrapper[4730]: I0320 15:59:25.527139    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ae0793d-af8a-4808-b632-9f8b22a4d0c0-config\") pod \"dnsmasq-dns-68f7d4448c-cvqk4\" (UID: \"3ae0793d-af8a-4808-b632-9f8b22a4d0c0\") " pod="openstack/dnsmasq-dns-68f7d4448c-cvqk4"
Mar 20 15:59:25 crc kubenswrapper[4730]: I0320 15:59:25.527177    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ae0793d-af8a-4808-b632-9f8b22a4d0c0-dns-svc\") pod \"dnsmasq-dns-68f7d4448c-cvqk4\" (UID: \"3ae0793d-af8a-4808-b632-9f8b22a4d0c0\") " pod="openstack/dnsmasq-dns-68f7d4448c-cvqk4"
Mar 20 15:59:25 crc kubenswrapper[4730]: I0320 15:59:25.528091    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ae0793d-af8a-4808-b632-9f8b22a4d0c0-dns-svc\") pod \"dnsmasq-dns-68f7d4448c-cvqk4\" (UID: \"3ae0793d-af8a-4808-b632-9f8b22a4d0c0\") " pod="openstack/dnsmasq-dns-68f7d4448c-cvqk4"
Mar 20 15:59:25 crc kubenswrapper[4730]: I0320 15:59:25.528104    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ae0793d-af8a-4808-b632-9f8b22a4d0c0-ovsdbserver-nb\") pod \"dnsmasq-dns-68f7d4448c-cvqk4\" (UID: \"3ae0793d-af8a-4808-b632-9f8b22a4d0c0\") " pod="openstack/dnsmasq-dns-68f7d4448c-cvqk4"
Mar 20 15:59:25 crc kubenswrapper[4730]: I0320 15:59:25.528688    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ae0793d-af8a-4808-b632-9f8b22a4d0c0-config\") pod \"dnsmasq-dns-68f7d4448c-cvqk4\" (UID: \"3ae0793d-af8a-4808-b632-9f8b22a4d0c0\") " pod="openstack/dnsmasq-dns-68f7d4448c-cvqk4"
Mar 20 15:59:25 crc kubenswrapper[4730]: I0320 15:59:25.528710    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ae0793d-af8a-4808-b632-9f8b22a4d0c0-ovsdbserver-sb\") pod \"dnsmasq-dns-68f7d4448c-cvqk4\" (UID: \"3ae0793d-af8a-4808-b632-9f8b22a4d0c0\") " pod="openstack/dnsmasq-dns-68f7d4448c-cvqk4"
Mar 20 15:59:25 crc kubenswrapper[4730]: I0320 15:59:25.547073    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hslv\" (UniqueName: \"kubernetes.io/projected/3ae0793d-af8a-4808-b632-9f8b22a4d0c0-kube-api-access-9hslv\") pod \"dnsmasq-dns-68f7d4448c-cvqk4\" (UID: \"3ae0793d-af8a-4808-b632-9f8b22a4d0c0\") " pod="openstack/dnsmasq-dns-68f7d4448c-cvqk4"
Mar 20 15:59:25 crc kubenswrapper[4730]: I0320 15:59:25.641645    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68f7d4448c-cvqk4"
Mar 20 15:59:26 crc kubenswrapper[4730]: I0320 15:59:26.086935    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68f7d4448c-cvqk4"]
Mar 20 15:59:26 crc kubenswrapper[4730]: W0320 15:59:26.090420    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ae0793d_af8a_4808_b632_9f8b22a4d0c0.slice/crio-12c589b54c6d611757ef6559c9af4693c0c76b7c0e6f07d2b8ae5108a83f1489 WatchSource:0}: Error finding container 12c589b54c6d611757ef6559c9af4693c0c76b7c0e6f07d2b8ae5108a83f1489: Status 404 returned error can't find the container with id 12c589b54c6d611757ef6559c9af4693c0c76b7c0e6f07d2b8ae5108a83f1489
Mar 20 15:59:26 crc kubenswrapper[4730]: I0320 15:59:26.966999    4730 generic.go:334] "Generic (PLEG): container finished" podID="3ae0793d-af8a-4808-b632-9f8b22a4d0c0" containerID="ed9c122c62fe6334e758d54aed5e4b1a868adadd18c7ec06bf5ae96e604dfa76" exitCode=0
Mar 20 15:59:26 crc kubenswrapper[4730]: I0320 15:59:26.967143    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68f7d4448c-cvqk4" event={"ID":"3ae0793d-af8a-4808-b632-9f8b22a4d0c0","Type":"ContainerDied","Data":"ed9c122c62fe6334e758d54aed5e4b1a868adadd18c7ec06bf5ae96e604dfa76"}
Mar 20 15:59:26 crc kubenswrapper[4730]: I0320 15:59:26.967436    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68f7d4448c-cvqk4" event={"ID":"3ae0793d-af8a-4808-b632-9f8b22a4d0c0","Type":"ContainerStarted","Data":"12c589b54c6d611757ef6559c9af4693c0c76b7c0e6f07d2b8ae5108a83f1489"}
Mar 20 15:59:27 crc kubenswrapper[4730]: I0320 15:59:27.976790    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68f7d4448c-cvqk4" event={"ID":"3ae0793d-af8a-4808-b632-9f8b22a4d0c0","Type":"ContainerStarted","Data":"cca89ebad14e458072774d3863e92e5ae8961f7af2fa95160875c04d8dd584a6"}
Mar 20 15:59:27 crc kubenswrapper[4730]: I0320 15:59:27.977191    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68f7d4448c-cvqk4"
Mar 20 15:59:27 crc kubenswrapper[4730]: I0320 15:59:27.999686    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68f7d4448c-cvqk4" podStartSLOduration=2.999666884 podStartE2EDuration="2.999666884s" podCreationTimestamp="2026-03-20 15:59:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:59:27.991221494 +0000 UTC m=+1227.204592863" watchObservedRunningTime="2026-03-20 15:59:27.999666884 +0000 UTC m=+1227.213038253"
Mar 20 15:59:28 crc kubenswrapper[4730]: I0320 15:59:28.587895    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c9def6e-27a0-4543-8d3c-07b3e4005b33-etc-swift\") pod \"swift-storage-0\" (UID: \"2c9def6e-27a0-4543-8d3c-07b3e4005b33\") " pod="openstack/swift-storage-0"
Mar 20 15:59:28 crc kubenswrapper[4730]: I0320 15:59:28.593522    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2c9def6e-27a0-4543-8d3c-07b3e4005b33-etc-swift\") pod \"swift-storage-0\" (UID: \"2c9def6e-27a0-4543-8d3c-07b3e4005b33\") " pod="openstack/swift-storage-0"
Mar 20 15:59:28 crc kubenswrapper[4730]: I0320 15:59:28.742163    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.132488    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-gtrnp" podUID="31651551-edb9-4793-a752-39fa60a85ee3" containerName="ovn-controller" probeResult="failure" output=<
Mar 20 15:59:29 crc kubenswrapper[4730]:         ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 20 15:59:29 crc kubenswrapper[4730]:  >
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.152223    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-cdd7f"
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.156419    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-cdd7f"
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.176266    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="dfd9111c-a9f4-4874-91fc-c0ef68ae09a3" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.109:5671: connect: connection refused"
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.356008    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.382661    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-gtrnp-config-kwcsn"]
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.383825    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gtrnp-config-kwcsn"
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.387723    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.412795    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4b45ada4-ae75-43d4-b589-0adca6844b9c-additional-scripts\") pod \"ovn-controller-gtrnp-config-kwcsn\" (UID: \"4b45ada4-ae75-43d4-b589-0adca6844b9c\") " pod="openstack/ovn-controller-gtrnp-config-kwcsn"
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.412839    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b45ada4-ae75-43d4-b589-0adca6844b9c-scripts\") pod \"ovn-controller-gtrnp-config-kwcsn\" (UID: \"4b45ada4-ae75-43d4-b589-0adca6844b9c\") " pod="openstack/ovn-controller-gtrnp-config-kwcsn"
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.412869    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82mh6\" (UniqueName: \"kubernetes.io/projected/4b45ada4-ae75-43d4-b589-0adca6844b9c-kube-api-access-82mh6\") pod \"ovn-controller-gtrnp-config-kwcsn\" (UID: \"4b45ada4-ae75-43d4-b589-0adca6844b9c\") " pod="openstack/ovn-controller-gtrnp-config-kwcsn"
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.412926    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4b45ada4-ae75-43d4-b589-0adca6844b9c-var-run\") pod \"ovn-controller-gtrnp-config-kwcsn\" (UID: \"4b45ada4-ae75-43d4-b589-0adca6844b9c\") " pod="openstack/ovn-controller-gtrnp-config-kwcsn"
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.412951    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4b45ada4-ae75-43d4-b589-0adca6844b9c-var-run-ovn\") pod \"ovn-controller-gtrnp-config-kwcsn\" (UID: \"4b45ada4-ae75-43d4-b589-0adca6844b9c\") " pod="openstack/ovn-controller-gtrnp-config-kwcsn"
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.412997    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4b45ada4-ae75-43d4-b589-0adca6844b9c-var-log-ovn\") pod \"ovn-controller-gtrnp-config-kwcsn\" (UID: \"4b45ada4-ae75-43d4-b589-0adca6844b9c\") " pod="openstack/ovn-controller-gtrnp-config-kwcsn"
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.457999    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-gtrnp-config-kwcsn"]
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.492423    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/notifications-rabbitmq-server-0" podUID="df9ca02d-e20f-4f55-ba14-92b91812afb6" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.110:5671: connect: connection refused"
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.514237    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82mh6\" (UniqueName: \"kubernetes.io/projected/4b45ada4-ae75-43d4-b589-0adca6844b9c-kube-api-access-82mh6\") pod \"ovn-controller-gtrnp-config-kwcsn\" (UID: \"4b45ada4-ae75-43d4-b589-0adca6844b9c\") " pod="openstack/ovn-controller-gtrnp-config-kwcsn"
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.514504    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4b45ada4-ae75-43d4-b589-0adca6844b9c-var-run\") pod \"ovn-controller-gtrnp-config-kwcsn\" (UID: \"4b45ada4-ae75-43d4-b589-0adca6844b9c\") " pod="openstack/ovn-controller-gtrnp-config-kwcsn"
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.514585    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4b45ada4-ae75-43d4-b589-0adca6844b9c-var-run-ovn\") pod \"ovn-controller-gtrnp-config-kwcsn\" (UID: \"4b45ada4-ae75-43d4-b589-0adca6844b9c\") " pod="openstack/ovn-controller-gtrnp-config-kwcsn"
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.514670    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4b45ada4-ae75-43d4-b589-0adca6844b9c-var-log-ovn\") pod \"ovn-controller-gtrnp-config-kwcsn\" (UID: \"4b45ada4-ae75-43d4-b589-0adca6844b9c\") " pod="openstack/ovn-controller-gtrnp-config-kwcsn"
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.514771    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4b45ada4-ae75-43d4-b589-0adca6844b9c-additional-scripts\") pod \"ovn-controller-gtrnp-config-kwcsn\" (UID: \"4b45ada4-ae75-43d4-b589-0adca6844b9c\") " pod="openstack/ovn-controller-gtrnp-config-kwcsn"
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.514834    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b45ada4-ae75-43d4-b589-0adca6844b9c-scripts\") pod \"ovn-controller-gtrnp-config-kwcsn\" (UID: \"4b45ada4-ae75-43d4-b589-0adca6844b9c\") " pod="openstack/ovn-controller-gtrnp-config-kwcsn"
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.514984    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4b45ada4-ae75-43d4-b589-0adca6844b9c-var-run\") pod \"ovn-controller-gtrnp-config-kwcsn\" (UID: \"4b45ada4-ae75-43d4-b589-0adca6844b9c\") " pod="openstack/ovn-controller-gtrnp-config-kwcsn"
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.515097    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4b45ada4-ae75-43d4-b589-0adca6844b9c-var-log-ovn\") pod \"ovn-controller-gtrnp-config-kwcsn\" (UID: \"4b45ada4-ae75-43d4-b589-0adca6844b9c\") " pod="openstack/ovn-controller-gtrnp-config-kwcsn"
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.515136    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4b45ada4-ae75-43d4-b589-0adca6844b9c-var-run-ovn\") pod \"ovn-controller-gtrnp-config-kwcsn\" (UID: \"4b45ada4-ae75-43d4-b589-0adca6844b9c\") " pod="openstack/ovn-controller-gtrnp-config-kwcsn"
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.515788    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4b45ada4-ae75-43d4-b589-0adca6844b9c-additional-scripts\") pod \"ovn-controller-gtrnp-config-kwcsn\" (UID: \"4b45ada4-ae75-43d4-b589-0adca6844b9c\") " pod="openstack/ovn-controller-gtrnp-config-kwcsn"
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.516887    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b45ada4-ae75-43d4-b589-0adca6844b9c-scripts\") pod \"ovn-controller-gtrnp-config-kwcsn\" (UID: \"4b45ada4-ae75-43d4-b589-0adca6844b9c\") " pod="openstack/ovn-controller-gtrnp-config-kwcsn"
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.535998    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82mh6\" (UniqueName: \"kubernetes.io/projected/4b45ada4-ae75-43d4-b589-0adca6844b9c-kube-api-access-82mh6\") pod \"ovn-controller-gtrnp-config-kwcsn\" (UID: \"4b45ada4-ae75-43d4-b589-0adca6844b9c\") " pod="openstack/ovn-controller-gtrnp-config-kwcsn"
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.719960    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gtrnp-config-kwcsn"
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.786242    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="8043f69c-832c-4afa-a9b9-211507664805" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.111:5671: connect: connection refused"
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.991787    4730 generic.go:334] "Generic (PLEG): container finished" podID="b9474555-d03c-4f34-8914-15b7654ec76e" containerID="c6f36cf8613ae0c9fc8870f685a85cb84e12a16bcd52f187f659c8895c86bf85" exitCode=0
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.991869    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b9474555-d03c-4f34-8914-15b7654ec76e","Type":"ContainerDied","Data":"c6f36cf8613ae0c9fc8870f685a85cb84e12a16bcd52f187f659c8895c86bf85"}
Mar 20 15:59:29 crc kubenswrapper[4730]: I0320 15:59:29.995047    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c9def6e-27a0-4543-8d3c-07b3e4005b33","Type":"ContainerStarted","Data":"e83fd0cac8ffe7d058e2c956c9899a6ff0ab5f5026ca0cc56a4a1252042611ea"}
Mar 20 15:59:30 crc kubenswrapper[4730]: I0320 15:59:30.612333    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-gtrnp-config-kwcsn"]
Mar 20 15:59:30 crc kubenswrapper[4730]: W0320 15:59:30.620945    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b45ada4_ae75_43d4_b589_0adca6844b9c.slice/crio-7de87b0142ae0b273ea772f996c36fe710c098df6bc0dc5943f169f3431a89b9 WatchSource:0}: Error finding container 7de87b0142ae0b273ea772f996c36fe710c098df6bc0dc5943f169f3431a89b9: Status 404 returned error can't find the container with id 7de87b0142ae0b273ea772f996c36fe710c098df6bc0dc5943f169f3431a89b9
Mar 20 15:59:31 crc kubenswrapper[4730]: I0320 15:59:31.700913    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b9474555-d03c-4f34-8914-15b7654ec76e","Type":"ContainerStarted","Data":"8be7e9c76c5955e48d7fb228ba7f64fc7c79c7a945b0f730c1eb3ac871d2f2ce"}
Mar 20 15:59:31 crc kubenswrapper[4730]: I0320 15:59:31.704760    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c9def6e-27a0-4543-8d3c-07b3e4005b33","Type":"ContainerStarted","Data":"43d0f186b9d31a94452b701ddd22187b3a7732670ec08b107b1eb534ab0742cc"}
Mar 20 15:59:31 crc kubenswrapper[4730]: I0320 15:59:31.704810    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c9def6e-27a0-4543-8d3c-07b3e4005b33","Type":"ContainerStarted","Data":"ef7ca60bdf842c5c81f4b4cf24250a34944d74feea21474cb6a6eb996f1aaa1c"}
Mar 20 15:59:31 crc kubenswrapper[4730]: I0320 15:59:31.704823    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c9def6e-27a0-4543-8d3c-07b3e4005b33","Type":"ContainerStarted","Data":"615e9add6050372e67d95a0542a96cc9945c562044da60e92bc777af8b054cce"}
Mar 20 15:59:31 crc kubenswrapper[4730]: I0320 15:59:31.704834    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c9def6e-27a0-4543-8d3c-07b3e4005b33","Type":"ContainerStarted","Data":"0e133301e01f9e38eca7cfa40a22fdcab5a40fda8cabacecf46480ca86818f89"}
Mar 20 15:59:31 crc kubenswrapper[4730]: I0320 15:59:31.706290    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gtrnp-config-kwcsn" event={"ID":"4b45ada4-ae75-43d4-b589-0adca6844b9c","Type":"ContainerStarted","Data":"7de87b0142ae0b273ea772f996c36fe710c098df6bc0dc5943f169f3431a89b9"}
Mar 20 15:59:32 crc kubenswrapper[4730]: I0320 15:59:32.721870    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b9474555-d03c-4f34-8914-15b7654ec76e","Type":"ContainerStarted","Data":"bf089e92ae421a8920cefe87896cf3bb8f1ad22d0fc9bc224fb423d5346400e7"}
Mar 20 15:59:32 crc kubenswrapper[4730]: I0320 15:59:32.722455    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b9474555-d03c-4f34-8914-15b7654ec76e","Type":"ContainerStarted","Data":"df8727da1c49b4db84c764524b2f4d737c0c03808dcebb6fb23caea5272a7aec"}
Mar 20 15:59:32 crc kubenswrapper[4730]: I0320 15:59:32.727769    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c9def6e-27a0-4543-8d3c-07b3e4005b33","Type":"ContainerStarted","Data":"e5cd07648ca70e7df2f81765c08dbeec89730f1bbd8fc11d3f13a6d3af2301c3"}
Mar 20 15:59:32 crc kubenswrapper[4730]: I0320 15:59:32.727812    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c9def6e-27a0-4543-8d3c-07b3e4005b33","Type":"ContainerStarted","Data":"52f2e369d5c1e38c4f44cbe410c75d284679eba0a6e2928d0a34a5744ca05c14"}
Mar 20 15:59:32 crc kubenswrapper[4730]: I0320 15:59:32.730010    4730 generic.go:334] "Generic (PLEG): container finished" podID="4b45ada4-ae75-43d4-b589-0adca6844b9c" containerID="e7513ea86a1e88bc7e61a8263e52b8513c6fca0f458503f246a5451604a802da" exitCode=0
Mar 20 15:59:32 crc kubenswrapper[4730]: I0320 15:59:32.730066    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gtrnp-config-kwcsn" event={"ID":"4b45ada4-ae75-43d4-b589-0adca6844b9c","Type":"ContainerDied","Data":"e7513ea86a1e88bc7e61a8263e52b8513c6fca0f458503f246a5451604a802da"}
Mar 20 15:59:32 crc kubenswrapper[4730]: I0320 15:59:32.762481    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=12.762464344 podStartE2EDuration="12.762464344s" podCreationTimestamp="2026-03-20 15:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:59:32.756094523 +0000 UTC m=+1231.969465912" watchObservedRunningTime="2026-03-20 15:59:32.762464344 +0000 UTC m=+1231.975835713"
Mar 20 15:59:33 crc kubenswrapper[4730]: I0320 15:59:33.740640    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c9def6e-27a0-4543-8d3c-07b3e4005b33","Type":"ContainerStarted","Data":"d0d4d4543419128c3541b234a2fa4a59fecbe52f28ed83b1a77213a046cbd529"}
Mar 20 15:59:33 crc kubenswrapper[4730]: I0320 15:59:33.740936    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c9def6e-27a0-4543-8d3c-07b3e4005b33","Type":"ContainerStarted","Data":"b29eb91143f55f74d98cef57d7abf643c81f5daa634b3f30490fa642bc7121e0"}
Mar 20 15:59:34 crc kubenswrapper[4730]: I0320 15:59:34.106453    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gtrnp-config-kwcsn"
Mar 20 15:59:34 crc kubenswrapper[4730]: I0320 15:59:34.149600    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-gtrnp"
Mar 20 15:59:34 crc kubenswrapper[4730]: I0320 15:59:34.186050    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4b45ada4-ae75-43d4-b589-0adca6844b9c-var-run\") pod \"4b45ada4-ae75-43d4-b589-0adca6844b9c\" (UID: \"4b45ada4-ae75-43d4-b589-0adca6844b9c\") "
Mar 20 15:59:34 crc kubenswrapper[4730]: I0320 15:59:34.186178    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4b45ada4-ae75-43d4-b589-0adca6844b9c-additional-scripts\") pod \"4b45ada4-ae75-43d4-b589-0adca6844b9c\" (UID: \"4b45ada4-ae75-43d4-b589-0adca6844b9c\") "
Mar 20 15:59:34 crc kubenswrapper[4730]: I0320 15:59:34.186180    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b45ada4-ae75-43d4-b589-0adca6844b9c-var-run" (OuterVolumeSpecName: "var-run") pod "4b45ada4-ae75-43d4-b589-0adca6844b9c" (UID: "4b45ada4-ae75-43d4-b589-0adca6844b9c"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 15:59:34 crc kubenswrapper[4730]: I0320 15:59:34.186216    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82mh6\" (UniqueName: \"kubernetes.io/projected/4b45ada4-ae75-43d4-b589-0adca6844b9c-kube-api-access-82mh6\") pod \"4b45ada4-ae75-43d4-b589-0adca6844b9c\" (UID: \"4b45ada4-ae75-43d4-b589-0adca6844b9c\") "
Mar 20 15:59:34 crc kubenswrapper[4730]: I0320 15:59:34.186332    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4b45ada4-ae75-43d4-b589-0adca6844b9c-var-log-ovn\") pod \"4b45ada4-ae75-43d4-b589-0adca6844b9c\" (UID: \"4b45ada4-ae75-43d4-b589-0adca6844b9c\") "
Mar 20 15:59:34 crc kubenswrapper[4730]: I0320 15:59:34.186421    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4b45ada4-ae75-43d4-b589-0adca6844b9c-var-run-ovn\") pod \"4b45ada4-ae75-43d4-b589-0adca6844b9c\" (UID: \"4b45ada4-ae75-43d4-b589-0adca6844b9c\") "
Mar 20 15:59:34 crc kubenswrapper[4730]: I0320 15:59:34.186444    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b45ada4-ae75-43d4-b589-0adca6844b9c-scripts\") pod \"4b45ada4-ae75-43d4-b589-0adca6844b9c\" (UID: \"4b45ada4-ae75-43d4-b589-0adca6844b9c\") "
Mar 20 15:59:34 crc kubenswrapper[4730]: I0320 15:59:34.186490    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b45ada4-ae75-43d4-b589-0adca6844b9c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "4b45ada4-ae75-43d4-b589-0adca6844b9c" (UID: "4b45ada4-ae75-43d4-b589-0adca6844b9c"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 15:59:34 crc kubenswrapper[4730]: I0320 15:59:34.186534    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b45ada4-ae75-43d4-b589-0adca6844b9c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "4b45ada4-ae75-43d4-b589-0adca6844b9c" (UID: "4b45ada4-ae75-43d4-b589-0adca6844b9c"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 15:59:34 crc kubenswrapper[4730]: I0320 15:59:34.186740    4730 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4b45ada4-ae75-43d4-b589-0adca6844b9c-var-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:34 crc kubenswrapper[4730]: I0320 15:59:34.186756    4730 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4b45ada4-ae75-43d4-b589-0adca6844b9c-var-run\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:34 crc kubenswrapper[4730]: I0320 15:59:34.186766    4730 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4b45ada4-ae75-43d4-b589-0adca6844b9c-var-log-ovn\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:34 crc kubenswrapper[4730]: I0320 15:59:34.186996    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b45ada4-ae75-43d4-b589-0adca6844b9c-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "4b45ada4-ae75-43d4-b589-0adca6844b9c" (UID: "4b45ada4-ae75-43d4-b589-0adca6844b9c"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:59:34 crc kubenswrapper[4730]: I0320 15:59:34.187261    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b45ada4-ae75-43d4-b589-0adca6844b9c-scripts" (OuterVolumeSpecName: "scripts") pod "4b45ada4-ae75-43d4-b589-0adca6844b9c" (UID: "4b45ada4-ae75-43d4-b589-0adca6844b9c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:59:34 crc kubenswrapper[4730]: I0320 15:59:34.191492    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b45ada4-ae75-43d4-b589-0adca6844b9c-kube-api-access-82mh6" (OuterVolumeSpecName: "kube-api-access-82mh6") pod "4b45ada4-ae75-43d4-b589-0adca6844b9c" (UID: "4b45ada4-ae75-43d4-b589-0adca6844b9c"). InnerVolumeSpecName "kube-api-access-82mh6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:59:34 crc kubenswrapper[4730]: I0320 15:59:34.288885    4730 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b45ada4-ae75-43d4-b589-0adca6844b9c-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:34 crc kubenswrapper[4730]: I0320 15:59:34.288923    4730 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4b45ada4-ae75-43d4-b589-0adca6844b9c-additional-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:34 crc kubenswrapper[4730]: I0320 15:59:34.288957    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82mh6\" (UniqueName: \"kubernetes.io/projected/4b45ada4-ae75-43d4-b589-0adca6844b9c-kube-api-access-82mh6\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:34 crc kubenswrapper[4730]: I0320 15:59:34.761409    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c9def6e-27a0-4543-8d3c-07b3e4005b33","Type":"ContainerStarted","Data":"e3c73940d22384d548f188be48d121dfb04490ec2e9d0d3beddf5a5c59cb93dc"}
Mar 20 15:59:34 crc kubenswrapper[4730]: I0320 15:59:34.761814    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c9def6e-27a0-4543-8d3c-07b3e4005b33","Type":"ContainerStarted","Data":"7d767175ee4bf8f5b9a3d59ca4ec9669048c7b2e864ff322c9c461b712e9559a"}
Mar 20 15:59:34 crc kubenswrapper[4730]: I0320 15:59:34.761829    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c9def6e-27a0-4543-8d3c-07b3e4005b33","Type":"ContainerStarted","Data":"fcd654aff2b2879b9299de147a59bc5a0c3867d284c41e4555e76fc0c05a01a9"}
Mar 20 15:59:34 crc kubenswrapper[4730]: I0320 15:59:34.761842    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c9def6e-27a0-4543-8d3c-07b3e4005b33","Type":"ContainerStarted","Data":"2e495f4dc74f748695dde7cba48269b5971295aa4d03e008732a13ba023705f5"}
Mar 20 15:59:34 crc kubenswrapper[4730]: I0320 15:59:34.761854    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c9def6e-27a0-4543-8d3c-07b3e4005b33","Type":"ContainerStarted","Data":"b48c30fbc95e109dfd2a125c4e8c7ac80a6868d24e6d83ffc2ae5d7d375f0921"}
Mar 20 15:59:34 crc kubenswrapper[4730]: I0320 15:59:34.764726    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gtrnp-config-kwcsn" event={"ID":"4b45ada4-ae75-43d4-b589-0adca6844b9c","Type":"ContainerDied","Data":"7de87b0142ae0b273ea772f996c36fe710c098df6bc0dc5943f169f3431a89b9"}
Mar 20 15:59:34 crc kubenswrapper[4730]: I0320 15:59:34.764772    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7de87b0142ae0b273ea772f996c36fe710c098df6bc0dc5943f169f3431a89b9"
Mar 20 15:59:34 crc kubenswrapper[4730]: I0320 15:59:34.764805    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gtrnp-config-kwcsn"
Mar 20 15:59:35 crc kubenswrapper[4730]: I0320 15:59:35.229291    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-gtrnp-config-kwcsn"]
Mar 20 15:59:35 crc kubenswrapper[4730]: I0320 15:59:35.236949    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-gtrnp-config-kwcsn"]
Mar 20 15:59:35 crc kubenswrapper[4730]: I0320 15:59:35.542687    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b45ada4-ae75-43d4-b589-0adca6844b9c" path="/var/lib/kubelet/pods/4b45ada4-ae75-43d4-b589-0adca6844b9c/volumes"
Mar 20 15:59:35 crc kubenswrapper[4730]: I0320 15:59:35.644459    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-68f7d4448c-cvqk4"
Mar 20 15:59:35 crc kubenswrapper[4730]: I0320 15:59:35.719384    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6868c44cb9-4zxm5"]
Mar 20 15:59:35 crc kubenswrapper[4730]: I0320 15:59:35.719875    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6868c44cb9-4zxm5" podUID="3806a27b-4a0f-439b-8660-d9ccd4bb0618" containerName="dnsmasq-dns" containerID="cri-o://1e16ce44f59bba62455cbdf7f59fe0150efea77ce27a7027dd9fa2ba8ec0f7d5" gracePeriod=10
Mar 20 15:59:35 crc kubenswrapper[4730]: I0320 15:59:35.781185    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c9def6e-27a0-4543-8d3c-07b3e4005b33","Type":"ContainerStarted","Data":"ff0f61ebd188c7c7fcc1a66f90e54fe081c59931d9d1d9c5e4b2e1d788b94e87"}
Mar 20 15:59:35 crc kubenswrapper[4730]: I0320 15:59:35.781240    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2c9def6e-27a0-4543-8d3c-07b3e4005b33","Type":"ContainerStarted","Data":"fd2c46769dd7001bd445b620fe9b18ea7ce61d1efb0a3f45063f2d9ca43eccbc"}
Mar 20 15:59:35 crc kubenswrapper[4730]: I0320 15:59:35.818290    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.572823783 podStartE2EDuration="40.818271314s" podCreationTimestamp="2026-03-20 15:58:55 +0000 UTC" firstStartedPulling="2026-03-20 15:59:29.357980786 +0000 UTC m=+1228.571352155" lastFinishedPulling="2026-03-20 15:59:33.603428317 +0000 UTC m=+1232.816799686" observedRunningTime="2026-03-20 15:59:35.813005816 +0000 UTC m=+1235.026377195" watchObservedRunningTime="2026-03-20 15:59:35.818271314 +0000 UTC m=+1235.031642683"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.093219    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59fc649cc7-tct2h"]
Mar 20 15:59:36 crc kubenswrapper[4730]: E0320 15:59:36.093662    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b45ada4-ae75-43d4-b589-0adca6844b9c" containerName="ovn-config"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.093685    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b45ada4-ae75-43d4-b589-0adca6844b9c" containerName="ovn-config"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.093887    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b45ada4-ae75-43d4-b589-0adca6844b9c" containerName="ovn-config"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.095303    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59fc649cc7-tct2h"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.102549    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.105214    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59fc649cc7-tct2h"]
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.245200    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37d31419-eada-4b93-bc20-bac232ced058-dns-swift-storage-0\") pod \"dnsmasq-dns-59fc649cc7-tct2h\" (UID: \"37d31419-eada-4b93-bc20-bac232ced058\") " pod="openstack/dnsmasq-dns-59fc649cc7-tct2h"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.245565    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37d31419-eada-4b93-bc20-bac232ced058-dns-svc\") pod \"dnsmasq-dns-59fc649cc7-tct2h\" (UID: \"37d31419-eada-4b93-bc20-bac232ced058\") " pod="openstack/dnsmasq-dns-59fc649cc7-tct2h"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.245595    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x9cc\" (UniqueName: \"kubernetes.io/projected/37d31419-eada-4b93-bc20-bac232ced058-kube-api-access-6x9cc\") pod \"dnsmasq-dns-59fc649cc7-tct2h\" (UID: \"37d31419-eada-4b93-bc20-bac232ced058\") " pod="openstack/dnsmasq-dns-59fc649cc7-tct2h"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.245632    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37d31419-eada-4b93-bc20-bac232ced058-ovsdbserver-nb\") pod \"dnsmasq-dns-59fc649cc7-tct2h\" (UID: \"37d31419-eada-4b93-bc20-bac232ced058\") " pod="openstack/dnsmasq-dns-59fc649cc7-tct2h"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.245656    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37d31419-eada-4b93-bc20-bac232ced058-config\") pod \"dnsmasq-dns-59fc649cc7-tct2h\" (UID: \"37d31419-eada-4b93-bc20-bac232ced058\") " pod="openstack/dnsmasq-dns-59fc649cc7-tct2h"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.245697    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37d31419-eada-4b93-bc20-bac232ced058-ovsdbserver-sb\") pod \"dnsmasq-dns-59fc649cc7-tct2h\" (UID: \"37d31419-eada-4b93-bc20-bac232ced058\") " pod="openstack/dnsmasq-dns-59fc649cc7-tct2h"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.261331    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6868c44cb9-4zxm5"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.347341    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37d31419-eada-4b93-bc20-bac232ced058-dns-svc\") pod \"dnsmasq-dns-59fc649cc7-tct2h\" (UID: \"37d31419-eada-4b93-bc20-bac232ced058\") " pod="openstack/dnsmasq-dns-59fc649cc7-tct2h"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.347417    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x9cc\" (UniqueName: \"kubernetes.io/projected/37d31419-eada-4b93-bc20-bac232ced058-kube-api-access-6x9cc\") pod \"dnsmasq-dns-59fc649cc7-tct2h\" (UID: \"37d31419-eada-4b93-bc20-bac232ced058\") " pod="openstack/dnsmasq-dns-59fc649cc7-tct2h"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.347471    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37d31419-eada-4b93-bc20-bac232ced058-ovsdbserver-nb\") pod \"dnsmasq-dns-59fc649cc7-tct2h\" (UID: \"37d31419-eada-4b93-bc20-bac232ced058\") " pod="openstack/dnsmasq-dns-59fc649cc7-tct2h"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.347507    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37d31419-eada-4b93-bc20-bac232ced058-config\") pod \"dnsmasq-dns-59fc649cc7-tct2h\" (UID: \"37d31419-eada-4b93-bc20-bac232ced058\") " pod="openstack/dnsmasq-dns-59fc649cc7-tct2h"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.347569    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37d31419-eada-4b93-bc20-bac232ced058-ovsdbserver-sb\") pod \"dnsmasq-dns-59fc649cc7-tct2h\" (UID: \"37d31419-eada-4b93-bc20-bac232ced058\") " pod="openstack/dnsmasq-dns-59fc649cc7-tct2h"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.347623    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37d31419-eada-4b93-bc20-bac232ced058-dns-swift-storage-0\") pod \"dnsmasq-dns-59fc649cc7-tct2h\" (UID: \"37d31419-eada-4b93-bc20-bac232ced058\") " pod="openstack/dnsmasq-dns-59fc649cc7-tct2h"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.348714    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37d31419-eada-4b93-bc20-bac232ced058-dns-svc\") pod \"dnsmasq-dns-59fc649cc7-tct2h\" (UID: \"37d31419-eada-4b93-bc20-bac232ced058\") " pod="openstack/dnsmasq-dns-59fc649cc7-tct2h"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.348759    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37d31419-eada-4b93-bc20-bac232ced058-ovsdbserver-nb\") pod \"dnsmasq-dns-59fc649cc7-tct2h\" (UID: \"37d31419-eada-4b93-bc20-bac232ced058\") " pod="openstack/dnsmasq-dns-59fc649cc7-tct2h"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.348940    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37d31419-eada-4b93-bc20-bac232ced058-dns-swift-storage-0\") pod \"dnsmasq-dns-59fc649cc7-tct2h\" (UID: \"37d31419-eada-4b93-bc20-bac232ced058\") " pod="openstack/dnsmasq-dns-59fc649cc7-tct2h"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.349129    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37d31419-eada-4b93-bc20-bac232ced058-ovsdbserver-sb\") pod \"dnsmasq-dns-59fc649cc7-tct2h\" (UID: \"37d31419-eada-4b93-bc20-bac232ced058\") " pod="openstack/dnsmasq-dns-59fc649cc7-tct2h"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.349672    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37d31419-eada-4b93-bc20-bac232ced058-config\") pod \"dnsmasq-dns-59fc649cc7-tct2h\" (UID: \"37d31419-eada-4b93-bc20-bac232ced058\") " pod="openstack/dnsmasq-dns-59fc649cc7-tct2h"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.350875    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.353028    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.360560    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.372119    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x9cc\" (UniqueName: \"kubernetes.io/projected/37d31419-eada-4b93-bc20-bac232ced058-kube-api-access-6x9cc\") pod \"dnsmasq-dns-59fc649cc7-tct2h\" (UID: \"37d31419-eada-4b93-bc20-bac232ced058\") " pod="openstack/dnsmasq-dns-59fc649cc7-tct2h"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.420648    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59fc649cc7-tct2h"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.448484    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt2hc\" (UniqueName: \"kubernetes.io/projected/3806a27b-4a0f-439b-8660-d9ccd4bb0618-kube-api-access-lt2hc\") pod \"3806a27b-4a0f-439b-8660-d9ccd4bb0618\" (UID: \"3806a27b-4a0f-439b-8660-d9ccd4bb0618\") "
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.448607    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3806a27b-4a0f-439b-8660-d9ccd4bb0618-config\") pod \"3806a27b-4a0f-439b-8660-d9ccd4bb0618\" (UID: \"3806a27b-4a0f-439b-8660-d9ccd4bb0618\") "
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.448653    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3806a27b-4a0f-439b-8660-d9ccd4bb0618-dns-svc\") pod \"3806a27b-4a0f-439b-8660-d9ccd4bb0618\" (UID: \"3806a27b-4a0f-439b-8660-d9ccd4bb0618\") "
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.448696    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3806a27b-4a0f-439b-8660-d9ccd4bb0618-ovsdbserver-sb\") pod \"3806a27b-4a0f-439b-8660-d9ccd4bb0618\" (UID: \"3806a27b-4a0f-439b-8660-d9ccd4bb0618\") "
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.448767    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3806a27b-4a0f-439b-8660-d9ccd4bb0618-ovsdbserver-nb\") pod \"3806a27b-4a0f-439b-8660-d9ccd4bb0618\" (UID: \"3806a27b-4a0f-439b-8660-d9ccd4bb0618\") "
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.452541    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3806a27b-4a0f-439b-8660-d9ccd4bb0618-kube-api-access-lt2hc" (OuterVolumeSpecName: "kube-api-access-lt2hc") pod "3806a27b-4a0f-439b-8660-d9ccd4bb0618" (UID: "3806a27b-4a0f-439b-8660-d9ccd4bb0618"). InnerVolumeSpecName "kube-api-access-lt2hc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.507938    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3806a27b-4a0f-439b-8660-d9ccd4bb0618-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3806a27b-4a0f-439b-8660-d9ccd4bb0618" (UID: "3806a27b-4a0f-439b-8660-d9ccd4bb0618"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.522406    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3806a27b-4a0f-439b-8660-d9ccd4bb0618-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3806a27b-4a0f-439b-8660-d9ccd4bb0618" (UID: "3806a27b-4a0f-439b-8660-d9ccd4bb0618"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.522509    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3806a27b-4a0f-439b-8660-d9ccd4bb0618-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3806a27b-4a0f-439b-8660-d9ccd4bb0618" (UID: "3806a27b-4a0f-439b-8660-d9ccd4bb0618"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.529934    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3806a27b-4a0f-439b-8660-d9ccd4bb0618-config" (OuterVolumeSpecName: "config") pod "3806a27b-4a0f-439b-8660-d9ccd4bb0618" (UID: "3806a27b-4a0f-439b-8660-d9ccd4bb0618"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.550108    4730 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3806a27b-4a0f-439b-8660-d9ccd4bb0618-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.550142    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt2hc\" (UniqueName: \"kubernetes.io/projected/3806a27b-4a0f-439b-8660-d9ccd4bb0618-kube-api-access-lt2hc\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.550153    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3806a27b-4a0f-439b-8660-d9ccd4bb0618-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.550161    4730 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3806a27b-4a0f-439b-8660-d9ccd4bb0618-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.550169    4730 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3806a27b-4a0f-439b-8660-d9ccd4bb0618-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.612342    4730 scope.go:117] "RemoveContainer" containerID="3950e99a8167c1c32630e01067078d701c75fdf49d8f8666a31a81f7a02ba1d9"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.793452    4730 generic.go:334] "Generic (PLEG): container finished" podID="3806a27b-4a0f-439b-8660-d9ccd4bb0618" containerID="1e16ce44f59bba62455cbdf7f59fe0150efea77ce27a7027dd9fa2ba8ec0f7d5" exitCode=0
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.793502    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6868c44cb9-4zxm5"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.793556    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6868c44cb9-4zxm5" event={"ID":"3806a27b-4a0f-439b-8660-d9ccd4bb0618","Type":"ContainerDied","Data":"1e16ce44f59bba62455cbdf7f59fe0150efea77ce27a7027dd9fa2ba8ec0f7d5"}
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.793580    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6868c44cb9-4zxm5" event={"ID":"3806a27b-4a0f-439b-8660-d9ccd4bb0618","Type":"ContainerDied","Data":"1d80c1a200a2b02176c3920aea1d4ac3bcd064cd7f985eb732053050a5d3edc7"}
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.793599    4730 scope.go:117] "RemoveContainer" containerID="1e16ce44f59bba62455cbdf7f59fe0150efea77ce27a7027dd9fa2ba8ec0f7d5"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.804128    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.827310    4730 scope.go:117] "RemoveContainer" containerID="065e2b37d1a5fb4fd864e13cde37d04579827706198a7e6a7ccea3b914128eb1"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.877050    4730 scope.go:117] "RemoveContainer" containerID="1e16ce44f59bba62455cbdf7f59fe0150efea77ce27a7027dd9fa2ba8ec0f7d5"
Mar 20 15:59:36 crc kubenswrapper[4730]: E0320 15:59:36.877927    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e16ce44f59bba62455cbdf7f59fe0150efea77ce27a7027dd9fa2ba8ec0f7d5\": container with ID starting with 1e16ce44f59bba62455cbdf7f59fe0150efea77ce27a7027dd9fa2ba8ec0f7d5 not found: ID does not exist" containerID="1e16ce44f59bba62455cbdf7f59fe0150efea77ce27a7027dd9fa2ba8ec0f7d5"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.877968    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e16ce44f59bba62455cbdf7f59fe0150efea77ce27a7027dd9fa2ba8ec0f7d5"} err="failed to get container status \"1e16ce44f59bba62455cbdf7f59fe0150efea77ce27a7027dd9fa2ba8ec0f7d5\": rpc error: code = NotFound desc = could not find container \"1e16ce44f59bba62455cbdf7f59fe0150efea77ce27a7027dd9fa2ba8ec0f7d5\": container with ID starting with 1e16ce44f59bba62455cbdf7f59fe0150efea77ce27a7027dd9fa2ba8ec0f7d5 not found: ID does not exist"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.877995    4730 scope.go:117] "RemoveContainer" containerID="065e2b37d1a5fb4fd864e13cde37d04579827706198a7e6a7ccea3b914128eb1"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.882524    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6868c44cb9-4zxm5"]
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.889339    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6868c44cb9-4zxm5"]
Mar 20 15:59:36 crc kubenswrapper[4730]: E0320 15:59:36.889559    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"065e2b37d1a5fb4fd864e13cde37d04579827706198a7e6a7ccea3b914128eb1\": container with ID starting with 065e2b37d1a5fb4fd864e13cde37d04579827706198a7e6a7ccea3b914128eb1 not found: ID does not exist" containerID="065e2b37d1a5fb4fd864e13cde37d04579827706198a7e6a7ccea3b914128eb1"
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.889591    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"065e2b37d1a5fb4fd864e13cde37d04579827706198a7e6a7ccea3b914128eb1"} err="failed to get container status \"065e2b37d1a5fb4fd864e13cde37d04579827706198a7e6a7ccea3b914128eb1\": rpc error: code = NotFound desc = could not find container \"065e2b37d1a5fb4fd864e13cde37d04579827706198a7e6a7ccea3b914128eb1\": container with ID starting with 065e2b37d1a5fb4fd864e13cde37d04579827706198a7e6a7ccea3b914128eb1 not found: ID does not exist"
Mar 20 15:59:36 crc kubenswrapper[4730]: W0320 15:59:36.893676    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37d31419_eada_4b93_bc20_bac232ced058.slice/crio-3bd944df856483bb32661f18779d6ead5d56c48a59d438569d6061308ace393b WatchSource:0}: Error finding container 3bd944df856483bb32661f18779d6ead5d56c48a59d438569d6061308ace393b: Status 404 returned error can't find the container with id 3bd944df856483bb32661f18779d6ead5d56c48a59d438569d6061308ace393b
Mar 20 15:59:36 crc kubenswrapper[4730]: I0320 15:59:36.897850    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59fc649cc7-tct2h"]
Mar 20 15:59:37 crc kubenswrapper[4730]: I0320 15:59:37.546109    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3806a27b-4a0f-439b-8660-d9ccd4bb0618" path="/var/lib/kubelet/pods/3806a27b-4a0f-439b-8660-d9ccd4bb0618/volumes"
Mar 20 15:59:37 crc kubenswrapper[4730]: I0320 15:59:37.803978    4730 generic.go:334] "Generic (PLEG): container finished" podID="37d31419-eada-4b93-bc20-bac232ced058" containerID="44eee6d0807ce3fdd8bb3db86c31803cf1f646803d40fa88d8701a25be8c2aaa" exitCode=0
Mar 20 15:59:37 crc kubenswrapper[4730]: I0320 15:59:37.804046    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59fc649cc7-tct2h" event={"ID":"37d31419-eada-4b93-bc20-bac232ced058","Type":"ContainerDied","Data":"44eee6d0807ce3fdd8bb3db86c31803cf1f646803d40fa88d8701a25be8c2aaa"}
Mar 20 15:59:37 crc kubenswrapper[4730]: I0320 15:59:37.804072    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59fc649cc7-tct2h" event={"ID":"37d31419-eada-4b93-bc20-bac232ced058","Type":"ContainerStarted","Data":"3bd944df856483bb32661f18779d6ead5d56c48a59d438569d6061308ace393b"}
Mar 20 15:59:38 crc kubenswrapper[4730]: I0320 15:59:38.818810    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59fc649cc7-tct2h" event={"ID":"37d31419-eada-4b93-bc20-bac232ced058","Type":"ContainerStarted","Data":"bb5b7167e8f80b19b6b3c7bf7988748aafee8cd2702715c1020def5b6b2b9fb6"}
Mar 20 15:59:38 crc kubenswrapper[4730]: I0320 15:59:38.859464    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59fc649cc7-tct2h" podStartSLOduration=2.859429356 podStartE2EDuration="2.859429356s" podCreationTimestamp="2026-03-20 15:59:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:59:38.848568063 +0000 UTC m=+1238.061939442" watchObservedRunningTime="2026-03-20 15:59:38.859429356 +0000 UTC m=+1238.072800735"
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.175446    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.488438    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/notifications-rabbitmq-server-0"
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.544429    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-87csx"]
Mar 20 15:59:39 crc kubenswrapper[4730]: E0320 15:59:39.545000    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3806a27b-4a0f-439b-8660-d9ccd4bb0618" containerName="init"
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.545074    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="3806a27b-4a0f-439b-8660-d9ccd4bb0618" containerName="init"
Mar 20 15:59:39 crc kubenswrapper[4730]: E0320 15:59:39.545154    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3806a27b-4a0f-439b-8660-d9ccd4bb0618" containerName="dnsmasq-dns"
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.545211    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="3806a27b-4a0f-439b-8660-d9ccd4bb0618" containerName="dnsmasq-dns"
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.545449    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="3806a27b-4a0f-439b-8660-d9ccd4bb0618" containerName="dnsmasq-dns"
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.546098    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-87csx"
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.603485    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-87csx"]
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.665051    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-9f59-account-create-update-vmg5j"]
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.667113    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9f59-account-create-update-vmg5j"
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.667617    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6118ed31-b8d7-4a7c-8769-69d996d26915-operator-scripts\") pod \"cinder-db-create-87csx\" (UID: \"6118ed31-b8d7-4a7c-8769-69d996d26915\") " pod="openstack/cinder-db-create-87csx"
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.667662    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc4nn\" (UniqueName: \"kubernetes.io/projected/6118ed31-b8d7-4a7c-8769-69d996d26915-kube-api-access-jc4nn\") pod \"cinder-db-create-87csx\" (UID: \"6118ed31-b8d7-4a7c-8769-69d996d26915\") " pod="openstack/cinder-db-create-87csx"
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.674552    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.679930    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9f59-account-create-update-vmg5j"]
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.769350    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmbg2\" (UniqueName: \"kubernetes.io/projected/44a72513-75fb-4b7e-912b-d28fa63d050a-kube-api-access-cmbg2\") pod \"cinder-9f59-account-create-update-vmg5j\" (UID: \"44a72513-75fb-4b7e-912b-d28fa63d050a\") " pod="openstack/cinder-9f59-account-create-update-vmg5j"
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.769434    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6118ed31-b8d7-4a7c-8769-69d996d26915-operator-scripts\") pod \"cinder-db-create-87csx\" (UID: \"6118ed31-b8d7-4a7c-8769-69d996d26915\") " pod="openstack/cinder-db-create-87csx"
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.769454    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc4nn\" (UniqueName: \"kubernetes.io/projected/6118ed31-b8d7-4a7c-8769-69d996d26915-kube-api-access-jc4nn\") pod \"cinder-db-create-87csx\" (UID: \"6118ed31-b8d7-4a7c-8769-69d996d26915\") " pod="openstack/cinder-db-create-87csx"
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.769487    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44a72513-75fb-4b7e-912b-d28fa63d050a-operator-scripts\") pod \"cinder-9f59-account-create-update-vmg5j\" (UID: \"44a72513-75fb-4b7e-912b-d28fa63d050a\") " pod="openstack/cinder-9f59-account-create-update-vmg5j"
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.770165    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6118ed31-b8d7-4a7c-8769-69d996d26915-operator-scripts\") pod \"cinder-db-create-87csx\" (UID: \"6118ed31-b8d7-4a7c-8769-69d996d26915\") " pod="openstack/cinder-db-create-87csx"
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.784410    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.788728    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc4nn\" (UniqueName: \"kubernetes.io/projected/6118ed31-b8d7-4a7c-8769-69d996d26915-kube-api-access-jc4nn\") pod \"cinder-db-create-87csx\" (UID: \"6118ed31-b8d7-4a7c-8769-69d996d26915\") " pod="openstack/cinder-db-create-87csx"
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.851912    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59fc649cc7-tct2h"
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.872029    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-9q2kz"]
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.873276    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmbg2\" (UniqueName: \"kubernetes.io/projected/44a72513-75fb-4b7e-912b-d28fa63d050a-kube-api-access-cmbg2\") pod \"cinder-9f59-account-create-update-vmg5j\" (UID: \"44a72513-75fb-4b7e-912b-d28fa63d050a\") " pod="openstack/cinder-9f59-account-create-update-vmg5j"
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.873374    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44a72513-75fb-4b7e-912b-d28fa63d050a-operator-scripts\") pod \"cinder-9f59-account-create-update-vmg5j\" (UID: \"44a72513-75fb-4b7e-912b-d28fa63d050a\") " pod="openstack/cinder-9f59-account-create-update-vmg5j"
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.873392    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-9q2kz"
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.874141    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44a72513-75fb-4b7e-912b-d28fa63d050a-operator-scripts\") pod \"cinder-9f59-account-create-update-vmg5j\" (UID: \"44a72513-75fb-4b7e-912b-d28fa63d050a\") " pod="openstack/cinder-9f59-account-create-update-vmg5j"
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.884670    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-9q2kz"]
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.913426    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-3959-account-create-update-qxd89"]
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.914466    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3959-account-create-update-qxd89"
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.916511    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-87csx"
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.917976    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmbg2\" (UniqueName: \"kubernetes.io/projected/44a72513-75fb-4b7e-912b-d28fa63d050a-kube-api-access-cmbg2\") pod \"cinder-9f59-account-create-update-vmg5j\" (UID: \"44a72513-75fb-4b7e-912b-d28fa63d050a\") " pod="openstack/cinder-9f59-account-create-update-vmg5j"
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.919715    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.961364    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-3959-account-create-update-qxd89"]
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.976199    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad93c0a8-34d6-4fee-985c-7c7307f00c0c-operator-scripts\") pod \"barbican-db-create-9q2kz\" (UID: \"ad93c0a8-34d6-4fee-985c-7c7307f00c0c\") " pod="openstack/barbican-db-create-9q2kz"
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.976322    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnfnf\" (UniqueName: \"kubernetes.io/projected/ad93c0a8-34d6-4fee-985c-7c7307f00c0c-kube-api-access-gnfnf\") pod \"barbican-db-create-9q2kz\" (UID: \"ad93c0a8-34d6-4fee-985c-7c7307f00c0c\") " pod="openstack/barbican-db-create-9q2kz"
Mar 20 15:59:39 crc kubenswrapper[4730]: I0320 15:59:39.983537    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9f59-account-create-update-vmg5j"
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.041473    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-rb4pw"]
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.077095    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-rb4pw"]
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.077185    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-rb4pw"
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.077819    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad93c0a8-34d6-4fee-985c-7c7307f00c0c-operator-scripts\") pod \"barbican-db-create-9q2kz\" (UID: \"ad93c0a8-34d6-4fee-985c-7c7307f00c0c\") " pod="openstack/barbican-db-create-9q2kz"
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.078102    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2tsx\" (UniqueName: \"kubernetes.io/projected/e01c2575-5301-494a-bf47-9a6053de9c64-kube-api-access-z2tsx\") pod \"barbican-3959-account-create-update-qxd89\" (UID: \"e01c2575-5301-494a-bf47-9a6053de9c64\") " pod="openstack/barbican-3959-account-create-update-qxd89"
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.078131    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnfnf\" (UniqueName: \"kubernetes.io/projected/ad93c0a8-34d6-4fee-985c-7c7307f00c0c-kube-api-access-gnfnf\") pod \"barbican-db-create-9q2kz\" (UID: \"ad93c0a8-34d6-4fee-985c-7c7307f00c0c\") " pod="openstack/barbican-db-create-9q2kz"
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.078149    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e01c2575-5301-494a-bf47-9a6053de9c64-operator-scripts\") pod \"barbican-3959-account-create-update-qxd89\" (UID: \"e01c2575-5301-494a-bf47-9a6053de9c64\") " pod="openstack/barbican-3959-account-create-update-qxd89"
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.078576    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad93c0a8-34d6-4fee-985c-7c7307f00c0c-operator-scripts\") pod \"barbican-db-create-9q2kz\" (UID: \"ad93c0a8-34d6-4fee-985c-7c7307f00c0c\") " pod="openstack/barbican-db-create-9q2kz"
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.079764    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.079822    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.080008    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.087043    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pjvk4"
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.119169    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnfnf\" (UniqueName: \"kubernetes.io/projected/ad93c0a8-34d6-4fee-985c-7c7307f00c0c-kube-api-access-gnfnf\") pod \"barbican-db-create-9q2kz\" (UID: \"ad93c0a8-34d6-4fee-985c-7c7307f00c0c\") " pod="openstack/barbican-db-create-9q2kz"
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.179757    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e01c2575-5301-494a-bf47-9a6053de9c64-operator-scripts\") pod \"barbican-3959-account-create-update-qxd89\" (UID: \"e01c2575-5301-494a-bf47-9a6053de9c64\") " pod="openstack/barbican-3959-account-create-update-qxd89"
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.179898    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92a7eed8-de7c-4816-8bd9-e922ace376ad-config-data\") pod \"keystone-db-sync-rb4pw\" (UID: \"92a7eed8-de7c-4816-8bd9-e922ace376ad\") " pod="openstack/keystone-db-sync-rb4pw"
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.179931    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4hmq\" (UniqueName: \"kubernetes.io/projected/92a7eed8-de7c-4816-8bd9-e922ace376ad-kube-api-access-h4hmq\") pod \"keystone-db-sync-rb4pw\" (UID: \"92a7eed8-de7c-4816-8bd9-e922ace376ad\") " pod="openstack/keystone-db-sync-rb4pw"
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.179980    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92a7eed8-de7c-4816-8bd9-e922ace376ad-combined-ca-bundle\") pod \"keystone-db-sync-rb4pw\" (UID: \"92a7eed8-de7c-4816-8bd9-e922ace376ad\") " pod="openstack/keystone-db-sync-rb4pw"
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.180002    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2tsx\" (UniqueName: \"kubernetes.io/projected/e01c2575-5301-494a-bf47-9a6053de9c64-kube-api-access-z2tsx\") pod \"barbican-3959-account-create-update-qxd89\" (UID: \"e01c2575-5301-494a-bf47-9a6053de9c64\") " pod="openstack/barbican-3959-account-create-update-qxd89"
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.182293    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e01c2575-5301-494a-bf47-9a6053de9c64-operator-scripts\") pod \"barbican-3959-account-create-update-qxd89\" (UID: \"e01c2575-5301-494a-bf47-9a6053de9c64\") " pod="openstack/barbican-3959-account-create-update-qxd89"
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.199587    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-9q2kz"
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.201793    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2tsx\" (UniqueName: \"kubernetes.io/projected/e01c2575-5301-494a-bf47-9a6053de9c64-kube-api-access-z2tsx\") pod \"barbican-3959-account-create-update-qxd89\" (UID: \"e01c2575-5301-494a-bf47-9a6053de9c64\") " pod="openstack/barbican-3959-account-create-update-qxd89"
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.282953    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92a7eed8-de7c-4816-8bd9-e922ace376ad-config-data\") pod \"keystone-db-sync-rb4pw\" (UID: \"92a7eed8-de7c-4816-8bd9-e922ace376ad\") " pod="openstack/keystone-db-sync-rb4pw"
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.282997    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4hmq\" (UniqueName: \"kubernetes.io/projected/92a7eed8-de7c-4816-8bd9-e922ace376ad-kube-api-access-h4hmq\") pod \"keystone-db-sync-rb4pw\" (UID: \"92a7eed8-de7c-4816-8bd9-e922ace376ad\") " pod="openstack/keystone-db-sync-rb4pw"
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.283046    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92a7eed8-de7c-4816-8bd9-e922ace376ad-combined-ca-bundle\") pod \"keystone-db-sync-rb4pw\" (UID: \"92a7eed8-de7c-4816-8bd9-e922ace376ad\") " pod="openstack/keystone-db-sync-rb4pw"
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.290108    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92a7eed8-de7c-4816-8bd9-e922ace376ad-combined-ca-bundle\") pod \"keystone-db-sync-rb4pw\" (UID: \"92a7eed8-de7c-4816-8bd9-e922ace376ad\") " pod="openstack/keystone-db-sync-rb4pw"
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.291957    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92a7eed8-de7c-4816-8bd9-e922ace376ad-config-data\") pod \"keystone-db-sync-rb4pw\" (UID: \"92a7eed8-de7c-4816-8bd9-e922ace376ad\") " pod="openstack/keystone-db-sync-rb4pw"
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.307610    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4hmq\" (UniqueName: \"kubernetes.io/projected/92a7eed8-de7c-4816-8bd9-e922ace376ad-kube-api-access-h4hmq\") pod \"keystone-db-sync-rb4pw\" (UID: \"92a7eed8-de7c-4816-8bd9-e922ace376ad\") " pod="openstack/keystone-db-sync-rb4pw"
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.373678    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3959-account-create-update-qxd89"
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.393437    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-rb4pw"
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.504096    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-87csx"]
Mar 20 15:59:40 crc kubenswrapper[4730]: W0320 15:59:40.518706    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6118ed31_b8d7_4a7c_8769_69d996d26915.slice/crio-6d472469b6e16e310262d75432013a7a3a34ad4fdb01866ebdfaf12a061b5465 WatchSource:0}: Error finding container 6d472469b6e16e310262d75432013a7a3a34ad4fdb01866ebdfaf12a061b5465: Status 404 returned error can't find the container with id 6d472469b6e16e310262d75432013a7a3a34ad4fdb01866ebdfaf12a061b5465
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.585690    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9f59-account-create-update-vmg5j"]
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.668763    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-9q2kz"]
Mar 20 15:59:40 crc kubenswrapper[4730]: W0320 15:59:40.674223    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad93c0a8_34d6_4fee_985c_7c7307f00c0c.slice/crio-6c62958dfa715bbaad2cb18642675288a24bc6d6b2b58edbfb494fdc8d1bb7c5 WatchSource:0}: Error finding container 6c62958dfa715bbaad2cb18642675288a24bc6d6b2b58edbfb494fdc8d1bb7c5: Status 404 returned error can't find the container with id 6c62958dfa715bbaad2cb18642675288a24bc6d6b2b58edbfb494fdc8d1bb7c5
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.859172    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-87csx" event={"ID":"6118ed31-b8d7-4a7c-8769-69d996d26915","Type":"ContainerStarted","Data":"a18801b5a50e28a1d043f07d02846b12496eaa787cc63d296052b7f86700e382"}
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.859214    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-87csx" event={"ID":"6118ed31-b8d7-4a7c-8769-69d996d26915","Type":"ContainerStarted","Data":"6d472469b6e16e310262d75432013a7a3a34ad4fdb01866ebdfaf12a061b5465"}
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.861809    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9f59-account-create-update-vmg5j" event={"ID":"44a72513-75fb-4b7e-912b-d28fa63d050a","Type":"ContainerStarted","Data":"ecddce73fd871590be8e4104469454a63bf36c8d5e335fcb1236e7e17748fcf3"}
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.861857    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9f59-account-create-update-vmg5j" event={"ID":"44a72513-75fb-4b7e-912b-d28fa63d050a","Type":"ContainerStarted","Data":"6a094396f0bb1056f88ceb040f7dab6620e9ea44941063752d5aa89e4f67a92a"}
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.864460    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-9q2kz" event={"ID":"ad93c0a8-34d6-4fee-985c-7c7307f00c0c","Type":"ContainerStarted","Data":"619c70ff24e78ebd6137bc20c79ee2dc5949bf1cca622b03e9fc4227379e48f4"}
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.864489    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-9q2kz" event={"ID":"ad93c0a8-34d6-4fee-985c-7c7307f00c0c","Type":"ContainerStarted","Data":"6c62958dfa715bbaad2cb18642675288a24bc6d6b2b58edbfb494fdc8d1bb7c5"}
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.871389    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-3959-account-create-update-qxd89"]
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.895982    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-rb4pw"]
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.903566    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-87csx" podStartSLOduration=1.903547689 podStartE2EDuration="1.903547689s" podCreationTimestamp="2026-03-20 15:59:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:59:40.884871811 +0000 UTC m=+1240.098243190" watchObservedRunningTime="2026-03-20 15:59:40.903547689 +0000 UTC m=+1240.116919058"
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.928488    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-9q2kz" podStartSLOduration=1.9284704860000002 podStartE2EDuration="1.928470486s" podCreationTimestamp="2026-03-20 15:59:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:59:40.901563165 +0000 UTC m=+1240.114934534" watchObservedRunningTime="2026-03-20 15:59:40.928470486 +0000 UTC m=+1240.141841855"
Mar 20 15:59:40 crc kubenswrapper[4730]: I0320 15:59:40.938142    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-9f59-account-create-update-vmg5j" podStartSLOduration=1.938123612 podStartE2EDuration="1.938123612s" podCreationTimestamp="2026-03-20 15:59:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:59:40.915400684 +0000 UTC m=+1240.128772053" watchObservedRunningTime="2026-03-20 15:59:40.938123612 +0000 UTC m=+1240.151494981"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.296935    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-ns6b5"]
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.299377    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-ns6b5"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.303596    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.309379    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-h4zsn"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.315682    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-ns6b5"]
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.372187    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-qpb6s"]
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.373443    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qpb6s"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.380781    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-qpb6s"]
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.403816    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pz9h\" (UniqueName: \"kubernetes.io/projected/9577f66b-a45e-4d51-9d87-4ae757819182-kube-api-access-9pz9h\") pod \"watcher-db-sync-ns6b5\" (UID: \"9577f66b-a45e-4d51-9d87-4ae757819182\") " pod="openstack/watcher-db-sync-ns6b5"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.403890    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9577f66b-a45e-4d51-9d87-4ae757819182-combined-ca-bundle\") pod \"watcher-db-sync-ns6b5\" (UID: \"9577f66b-a45e-4d51-9d87-4ae757819182\") " pod="openstack/watcher-db-sync-ns6b5"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.403922    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9577f66b-a45e-4d51-9d87-4ae757819182-db-sync-config-data\") pod \"watcher-db-sync-ns6b5\" (UID: \"9577f66b-a45e-4d51-9d87-4ae757819182\") " pod="openstack/watcher-db-sync-ns6b5"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.403961    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9577f66b-a45e-4d51-9d87-4ae757819182-config-data\") pod \"watcher-db-sync-ns6b5\" (UID: \"9577f66b-a45e-4d51-9d87-4ae757819182\") " pod="openstack/watcher-db-sync-ns6b5"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.466542    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-c423-account-create-update-dcjc2"]
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.467573    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c423-account-create-update-dcjc2"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.469728    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.486387    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c423-account-create-update-dcjc2"]
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.505810    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pz9h\" (UniqueName: \"kubernetes.io/projected/9577f66b-a45e-4d51-9d87-4ae757819182-kube-api-access-9pz9h\") pod \"watcher-db-sync-ns6b5\" (UID: \"9577f66b-a45e-4d51-9d87-4ae757819182\") " pod="openstack/watcher-db-sync-ns6b5"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.506020    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h52h\" (UniqueName: \"kubernetes.io/projected/1c94c6d8-4c40-455a-a536-7c64e3838986-kube-api-access-8h52h\") pod \"neutron-db-create-qpb6s\" (UID: \"1c94c6d8-4c40-455a-a536-7c64e3838986\") " pod="openstack/neutron-db-create-qpb6s"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.506107    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c94c6d8-4c40-455a-a536-7c64e3838986-operator-scripts\") pod \"neutron-db-create-qpb6s\" (UID: \"1c94c6d8-4c40-455a-a536-7c64e3838986\") " pod="openstack/neutron-db-create-qpb6s"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.506149    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9577f66b-a45e-4d51-9d87-4ae757819182-combined-ca-bundle\") pod \"watcher-db-sync-ns6b5\" (UID: \"9577f66b-a45e-4d51-9d87-4ae757819182\") " pod="openstack/watcher-db-sync-ns6b5"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.506179    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9577f66b-a45e-4d51-9d87-4ae757819182-db-sync-config-data\") pod \"watcher-db-sync-ns6b5\" (UID: \"9577f66b-a45e-4d51-9d87-4ae757819182\") " pod="openstack/watcher-db-sync-ns6b5"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.506236    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9577f66b-a45e-4d51-9d87-4ae757819182-config-data\") pod \"watcher-db-sync-ns6b5\" (UID: \"9577f66b-a45e-4d51-9d87-4ae757819182\") " pod="openstack/watcher-db-sync-ns6b5"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.512294    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9577f66b-a45e-4d51-9d87-4ae757819182-config-data\") pod \"watcher-db-sync-ns6b5\" (UID: \"9577f66b-a45e-4d51-9d87-4ae757819182\") " pod="openstack/watcher-db-sync-ns6b5"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.512534    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9577f66b-a45e-4d51-9d87-4ae757819182-combined-ca-bundle\") pod \"watcher-db-sync-ns6b5\" (UID: \"9577f66b-a45e-4d51-9d87-4ae757819182\") " pod="openstack/watcher-db-sync-ns6b5"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.527866    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9577f66b-a45e-4d51-9d87-4ae757819182-db-sync-config-data\") pod \"watcher-db-sync-ns6b5\" (UID: \"9577f66b-a45e-4d51-9d87-4ae757819182\") " pod="openstack/watcher-db-sync-ns6b5"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.528595    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pz9h\" (UniqueName: \"kubernetes.io/projected/9577f66b-a45e-4d51-9d87-4ae757819182-kube-api-access-9pz9h\") pod \"watcher-db-sync-ns6b5\" (UID: \"9577f66b-a45e-4d51-9d87-4ae757819182\") " pod="openstack/watcher-db-sync-ns6b5"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.616362    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06e5575b-c67a-46fe-8502-efc341523de2-operator-scripts\") pod \"neutron-c423-account-create-update-dcjc2\" (UID: \"06e5575b-c67a-46fe-8502-efc341523de2\") " pod="openstack/neutron-c423-account-create-update-dcjc2"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.616464    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h52h\" (UniqueName: \"kubernetes.io/projected/1c94c6d8-4c40-455a-a536-7c64e3838986-kube-api-access-8h52h\") pod \"neutron-db-create-qpb6s\" (UID: \"1c94c6d8-4c40-455a-a536-7c64e3838986\") " pod="openstack/neutron-db-create-qpb6s"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.616517    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c94c6d8-4c40-455a-a536-7c64e3838986-operator-scripts\") pod \"neutron-db-create-qpb6s\" (UID: \"1c94c6d8-4c40-455a-a536-7c64e3838986\") " pod="openstack/neutron-db-create-qpb6s"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.616749    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r4t9\" (UniqueName: \"kubernetes.io/projected/06e5575b-c67a-46fe-8502-efc341523de2-kube-api-access-7r4t9\") pod \"neutron-c423-account-create-update-dcjc2\" (UID: \"06e5575b-c67a-46fe-8502-efc341523de2\") " pod="openstack/neutron-c423-account-create-update-dcjc2"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.625331    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c94c6d8-4c40-455a-a536-7c64e3838986-operator-scripts\") pod \"neutron-db-create-qpb6s\" (UID: \"1c94c6d8-4c40-455a-a536-7c64e3838986\") " pod="openstack/neutron-db-create-qpb6s"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.626033    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-ns6b5"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.650913    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h52h\" (UniqueName: \"kubernetes.io/projected/1c94c6d8-4c40-455a-a536-7c64e3838986-kube-api-access-8h52h\") pod \"neutron-db-create-qpb6s\" (UID: \"1c94c6d8-4c40-455a-a536-7c64e3838986\") " pod="openstack/neutron-db-create-qpb6s"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.695331    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qpb6s"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.718333    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r4t9\" (UniqueName: \"kubernetes.io/projected/06e5575b-c67a-46fe-8502-efc341523de2-kube-api-access-7r4t9\") pod \"neutron-c423-account-create-update-dcjc2\" (UID: \"06e5575b-c67a-46fe-8502-efc341523de2\") " pod="openstack/neutron-c423-account-create-update-dcjc2"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.718453    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06e5575b-c67a-46fe-8502-efc341523de2-operator-scripts\") pod \"neutron-c423-account-create-update-dcjc2\" (UID: \"06e5575b-c67a-46fe-8502-efc341523de2\") " pod="openstack/neutron-c423-account-create-update-dcjc2"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.719218    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06e5575b-c67a-46fe-8502-efc341523de2-operator-scripts\") pod \"neutron-c423-account-create-update-dcjc2\" (UID: \"06e5575b-c67a-46fe-8502-efc341523de2\") " pod="openstack/neutron-c423-account-create-update-dcjc2"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.740838    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r4t9\" (UniqueName: \"kubernetes.io/projected/06e5575b-c67a-46fe-8502-efc341523de2-kube-api-access-7r4t9\") pod \"neutron-c423-account-create-update-dcjc2\" (UID: \"06e5575b-c67a-46fe-8502-efc341523de2\") " pod="openstack/neutron-c423-account-create-update-dcjc2"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.784798    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c423-account-create-update-dcjc2"
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.878646    4730 generic.go:334] "Generic (PLEG): container finished" podID="e01c2575-5301-494a-bf47-9a6053de9c64" containerID="c35a1209cb7b3066725c4f8438840ab79db396745f3b69d5ee16580ca7ae88eb" exitCode=0
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.879119    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3959-account-create-update-qxd89" event={"ID":"e01c2575-5301-494a-bf47-9a6053de9c64","Type":"ContainerDied","Data":"c35a1209cb7b3066725c4f8438840ab79db396745f3b69d5ee16580ca7ae88eb"}
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.879164    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3959-account-create-update-qxd89" event={"ID":"e01c2575-5301-494a-bf47-9a6053de9c64","Type":"ContainerStarted","Data":"88bb1e1b8804c326245e327cd045f9a813fc551a35052713e8b42dc063c0a8fa"}
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.881738    4730 generic.go:334] "Generic (PLEG): container finished" podID="ad93c0a8-34d6-4fee-985c-7c7307f00c0c" containerID="619c70ff24e78ebd6137bc20c79ee2dc5949bf1cca622b03e9fc4227379e48f4" exitCode=0
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.881794    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-9q2kz" event={"ID":"ad93c0a8-34d6-4fee-985c-7c7307f00c0c","Type":"ContainerDied","Data":"619c70ff24e78ebd6137bc20c79ee2dc5949bf1cca622b03e9fc4227379e48f4"}
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.888483    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rb4pw" event={"ID":"92a7eed8-de7c-4816-8bd9-e922ace376ad","Type":"ContainerStarted","Data":"15e39e3baf091f230b52a236e8e09de020ef95a8e3541076e0af482275abf9c1"}
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.915514    4730 generic.go:334] "Generic (PLEG): container finished" podID="6118ed31-b8d7-4a7c-8769-69d996d26915" containerID="a18801b5a50e28a1d043f07d02846b12496eaa787cc63d296052b7f86700e382" exitCode=0
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.915598    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-87csx" event={"ID":"6118ed31-b8d7-4a7c-8769-69d996d26915","Type":"ContainerDied","Data":"a18801b5a50e28a1d043f07d02846b12496eaa787cc63d296052b7f86700e382"}
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.917994    4730 generic.go:334] "Generic (PLEG): container finished" podID="44a72513-75fb-4b7e-912b-d28fa63d050a" containerID="ecddce73fd871590be8e4104469454a63bf36c8d5e335fcb1236e7e17748fcf3" exitCode=0
Mar 20 15:59:41 crc kubenswrapper[4730]: I0320 15:59:41.918029    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9f59-account-create-update-vmg5j" event={"ID":"44a72513-75fb-4b7e-912b-d28fa63d050a","Type":"ContainerDied","Data":"ecddce73fd871590be8e4104469454a63bf36c8d5e335fcb1236e7e17748fcf3"}
Mar 20 15:59:42 crc kubenswrapper[4730]: I0320 15:59:42.099502    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-ns6b5"]
Mar 20 15:59:42 crc kubenswrapper[4730]: I0320 15:59:42.289226    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-qpb6s"]
Mar 20 15:59:42 crc kubenswrapper[4730]: W0320 15:59:42.319980    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c94c6d8_4c40_455a_a536_7c64e3838986.slice/crio-4c5e1f649837c47339a572132afe7d468b5386426786fd3de46f1f95322e65e5 WatchSource:0}: Error finding container 4c5e1f649837c47339a572132afe7d468b5386426786fd3de46f1f95322e65e5: Status 404 returned error can't find the container with id 4c5e1f649837c47339a572132afe7d468b5386426786fd3de46f1f95322e65e5
Mar 20 15:59:42 crc kubenswrapper[4730]: I0320 15:59:42.385256    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c423-account-create-update-dcjc2"]
Mar 20 15:59:42 crc kubenswrapper[4730]: I0320 15:59:42.932427    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c423-account-create-update-dcjc2" event={"ID":"06e5575b-c67a-46fe-8502-efc341523de2","Type":"ContainerStarted","Data":"99fba5e2cadd379521ca79369b155ec13b031c591917c4f1be4fc608956b6dda"}
Mar 20 15:59:42 crc kubenswrapper[4730]: I0320 15:59:42.932780    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c423-account-create-update-dcjc2" event={"ID":"06e5575b-c67a-46fe-8502-efc341523de2","Type":"ContainerStarted","Data":"2876d90345315b2a0809c889345800e43703e0183ea1a884fd91389c2ca8ac2a"}
Mar 20 15:59:42 crc kubenswrapper[4730]: I0320 15:59:42.934585    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qpb6s" event={"ID":"1c94c6d8-4c40-455a-a536-7c64e3838986","Type":"ContainerStarted","Data":"40e0babc7b2f63017ce242ba014b4798c26ae0c66070098c86ad2de5a7400e6c"}
Mar 20 15:59:42 crc kubenswrapper[4730]: I0320 15:59:42.934623    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qpb6s" event={"ID":"1c94c6d8-4c40-455a-a536-7c64e3838986","Type":"ContainerStarted","Data":"4c5e1f649837c47339a572132afe7d468b5386426786fd3de46f1f95322e65e5"}
Mar 20 15:59:42 crc kubenswrapper[4730]: I0320 15:59:42.936241    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-ns6b5" event={"ID":"9577f66b-a45e-4d51-9d87-4ae757819182","Type":"ContainerStarted","Data":"2a09ead92eb5e9f3f31718e3636e13dfb29ffdd6eee5420ea19a1b73c6cb9b82"}
Mar 20 15:59:42 crc kubenswrapper[4730]: I0320 15:59:42.952800    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-c423-account-create-update-dcjc2" podStartSLOduration=1.952767946 podStartE2EDuration="1.952767946s" podCreationTimestamp="2026-03-20 15:59:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:59:42.949010732 +0000 UTC m=+1242.162382101" watchObservedRunningTime="2026-03-20 15:59:42.952767946 +0000 UTC m=+1242.166139315"
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.392830    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-9q2kz"
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.563089    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad93c0a8-34d6-4fee-985c-7c7307f00c0c-operator-scripts\") pod \"ad93c0a8-34d6-4fee-985c-7c7307f00c0c\" (UID: \"ad93c0a8-34d6-4fee-985c-7c7307f00c0c\") "
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.563462    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnfnf\" (UniqueName: \"kubernetes.io/projected/ad93c0a8-34d6-4fee-985c-7c7307f00c0c-kube-api-access-gnfnf\") pod \"ad93c0a8-34d6-4fee-985c-7c7307f00c0c\" (UID: \"ad93c0a8-34d6-4fee-985c-7c7307f00c0c\") "
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.563693    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad93c0a8-34d6-4fee-985c-7c7307f00c0c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ad93c0a8-34d6-4fee-985c-7c7307f00c0c" (UID: "ad93c0a8-34d6-4fee-985c-7c7307f00c0c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.563995    4730 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad93c0a8-34d6-4fee-985c-7c7307f00c0c-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.569833    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad93c0a8-34d6-4fee-985c-7c7307f00c0c-kube-api-access-gnfnf" (OuterVolumeSpecName: "kube-api-access-gnfnf") pod "ad93c0a8-34d6-4fee-985c-7c7307f00c0c" (UID: "ad93c0a8-34d6-4fee-985c-7c7307f00c0c"). InnerVolumeSpecName "kube-api-access-gnfnf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.589223    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-87csx"
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.599749    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9f59-account-create-update-vmg5j"
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.617288    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3959-account-create-update-qxd89"
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.665087    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6118ed31-b8d7-4a7c-8769-69d996d26915-operator-scripts\") pod \"6118ed31-b8d7-4a7c-8769-69d996d26915\" (UID: \"6118ed31-b8d7-4a7c-8769-69d996d26915\") "
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.665189    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc4nn\" (UniqueName: \"kubernetes.io/projected/6118ed31-b8d7-4a7c-8769-69d996d26915-kube-api-access-jc4nn\") pod \"6118ed31-b8d7-4a7c-8769-69d996d26915\" (UID: \"6118ed31-b8d7-4a7c-8769-69d996d26915\") "
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.665635    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6118ed31-b8d7-4a7c-8769-69d996d26915-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6118ed31-b8d7-4a7c-8769-69d996d26915" (UID: "6118ed31-b8d7-4a7c-8769-69d996d26915"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.665829    4730 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6118ed31-b8d7-4a7c-8769-69d996d26915-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.665848    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnfnf\" (UniqueName: \"kubernetes.io/projected/ad93c0a8-34d6-4fee-985c-7c7307f00c0c-kube-api-access-gnfnf\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.668649    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6118ed31-b8d7-4a7c-8769-69d996d26915-kube-api-access-jc4nn" (OuterVolumeSpecName: "kube-api-access-jc4nn") pod "6118ed31-b8d7-4a7c-8769-69d996d26915" (UID: "6118ed31-b8d7-4a7c-8769-69d996d26915"). InnerVolumeSpecName "kube-api-access-jc4nn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.767310    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44a72513-75fb-4b7e-912b-d28fa63d050a-operator-scripts\") pod \"44a72513-75fb-4b7e-912b-d28fa63d050a\" (UID: \"44a72513-75fb-4b7e-912b-d28fa63d050a\") "
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.767412    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2tsx\" (UniqueName: \"kubernetes.io/projected/e01c2575-5301-494a-bf47-9a6053de9c64-kube-api-access-z2tsx\") pod \"e01c2575-5301-494a-bf47-9a6053de9c64\" (UID: \"e01c2575-5301-494a-bf47-9a6053de9c64\") "
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.767444    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e01c2575-5301-494a-bf47-9a6053de9c64-operator-scripts\") pod \"e01c2575-5301-494a-bf47-9a6053de9c64\" (UID: \"e01c2575-5301-494a-bf47-9a6053de9c64\") "
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.767468    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmbg2\" (UniqueName: \"kubernetes.io/projected/44a72513-75fb-4b7e-912b-d28fa63d050a-kube-api-access-cmbg2\") pod \"44a72513-75fb-4b7e-912b-d28fa63d050a\" (UID: \"44a72513-75fb-4b7e-912b-d28fa63d050a\") "
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.767786    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44a72513-75fb-4b7e-912b-d28fa63d050a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "44a72513-75fb-4b7e-912b-d28fa63d050a" (UID: "44a72513-75fb-4b7e-912b-d28fa63d050a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.768103    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e01c2575-5301-494a-bf47-9a6053de9c64-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e01c2575-5301-494a-bf47-9a6053de9c64" (UID: "e01c2575-5301-494a-bf47-9a6053de9c64"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.768279    4730 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e01c2575-5301-494a-bf47-9a6053de9c64-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.768293    4730 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44a72513-75fb-4b7e-912b-d28fa63d050a-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.768303    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc4nn\" (UniqueName: \"kubernetes.io/projected/6118ed31-b8d7-4a7c-8769-69d996d26915-kube-api-access-jc4nn\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.770459    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44a72513-75fb-4b7e-912b-d28fa63d050a-kube-api-access-cmbg2" (OuterVolumeSpecName: "kube-api-access-cmbg2") pod "44a72513-75fb-4b7e-912b-d28fa63d050a" (UID: "44a72513-75fb-4b7e-912b-d28fa63d050a"). InnerVolumeSpecName "kube-api-access-cmbg2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.770829    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e01c2575-5301-494a-bf47-9a6053de9c64-kube-api-access-z2tsx" (OuterVolumeSpecName: "kube-api-access-z2tsx") pod "e01c2575-5301-494a-bf47-9a6053de9c64" (UID: "e01c2575-5301-494a-bf47-9a6053de9c64"). InnerVolumeSpecName "kube-api-access-z2tsx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.869964    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2tsx\" (UniqueName: \"kubernetes.io/projected/e01c2575-5301-494a-bf47-9a6053de9c64-kube-api-access-z2tsx\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.869996    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmbg2\" (UniqueName: \"kubernetes.io/projected/44a72513-75fb-4b7e-912b-d28fa63d050a-kube-api-access-cmbg2\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.956154    4730 generic.go:334] "Generic (PLEG): container finished" podID="06e5575b-c67a-46fe-8502-efc341523de2" containerID="99fba5e2cadd379521ca79369b155ec13b031c591917c4f1be4fc608956b6dda" exitCode=0
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.956226    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c423-account-create-update-dcjc2" event={"ID":"06e5575b-c67a-46fe-8502-efc341523de2","Type":"ContainerDied","Data":"99fba5e2cadd379521ca79369b155ec13b031c591917c4f1be4fc608956b6dda"}
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.960800    4730 generic.go:334] "Generic (PLEG): container finished" podID="1c94c6d8-4c40-455a-a536-7c64e3838986" containerID="40e0babc7b2f63017ce242ba014b4798c26ae0c66070098c86ad2de5a7400e6c" exitCode=0
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.960867    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qpb6s" event={"ID":"1c94c6d8-4c40-455a-a536-7c64e3838986","Type":"ContainerDied","Data":"40e0babc7b2f63017ce242ba014b4798c26ae0c66070098c86ad2de5a7400e6c"}
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.967366    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-87csx" event={"ID":"6118ed31-b8d7-4a7c-8769-69d996d26915","Type":"ContainerDied","Data":"6d472469b6e16e310262d75432013a7a3a34ad4fdb01866ebdfaf12a061b5465"}
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.967425    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d472469b6e16e310262d75432013a7a3a34ad4fdb01866ebdfaf12a061b5465"
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.967374    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-87csx"
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.979929    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9f59-account-create-update-vmg5j" event={"ID":"44a72513-75fb-4b7e-912b-d28fa63d050a","Type":"ContainerDied","Data":"6a094396f0bb1056f88ceb040f7dab6620e9ea44941063752d5aa89e4f67a92a"}
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.979968    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a094396f0bb1056f88ceb040f7dab6620e9ea44941063752d5aa89e4f67a92a"
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.980029    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9f59-account-create-update-vmg5j"
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.999658    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3959-account-create-update-qxd89"
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.999651    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3959-account-create-update-qxd89" event={"ID":"e01c2575-5301-494a-bf47-9a6053de9c64","Type":"ContainerDied","Data":"88bb1e1b8804c326245e327cd045f9a813fc551a35052713e8b42dc063c0a8fa"}
Mar 20 15:59:43 crc kubenswrapper[4730]: I0320 15:59:43.999715    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88bb1e1b8804c326245e327cd045f9a813fc551a35052713e8b42dc063c0a8fa"
Mar 20 15:59:44 crc kubenswrapper[4730]: I0320 15:59:44.004227    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-9q2kz" event={"ID":"ad93c0a8-34d6-4fee-985c-7c7307f00c0c","Type":"ContainerDied","Data":"6c62958dfa715bbaad2cb18642675288a24bc6d6b2b58edbfb494fdc8d1bb7c5"}
Mar 20 15:59:44 crc kubenswrapper[4730]: I0320 15:59:44.004269    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c62958dfa715bbaad2cb18642675288a24bc6d6b2b58edbfb494fdc8d1bb7c5"
Mar 20 15:59:44 crc kubenswrapper[4730]: I0320 15:59:44.004346    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-9q2kz"
Mar 20 15:59:44 crc kubenswrapper[4730]: I0320 15:59:44.326290    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qpb6s"
Mar 20 15:59:44 crc kubenswrapper[4730]: I0320 15:59:44.480154    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c94c6d8-4c40-455a-a536-7c64e3838986-operator-scripts\") pod \"1c94c6d8-4c40-455a-a536-7c64e3838986\" (UID: \"1c94c6d8-4c40-455a-a536-7c64e3838986\") "
Mar 20 15:59:44 crc kubenswrapper[4730]: I0320 15:59:44.480204    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8h52h\" (UniqueName: \"kubernetes.io/projected/1c94c6d8-4c40-455a-a536-7c64e3838986-kube-api-access-8h52h\") pod \"1c94c6d8-4c40-455a-a536-7c64e3838986\" (UID: \"1c94c6d8-4c40-455a-a536-7c64e3838986\") "
Mar 20 15:59:44 crc kubenswrapper[4730]: I0320 15:59:44.480885    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c94c6d8-4c40-455a-a536-7c64e3838986-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1c94c6d8-4c40-455a-a536-7c64e3838986" (UID: "1c94c6d8-4c40-455a-a536-7c64e3838986"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:59:44 crc kubenswrapper[4730]: I0320 15:59:44.484888    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c94c6d8-4c40-455a-a536-7c64e3838986-kube-api-access-8h52h" (OuterVolumeSpecName: "kube-api-access-8h52h") pod "1c94c6d8-4c40-455a-a536-7c64e3838986" (UID: "1c94c6d8-4c40-455a-a536-7c64e3838986"). InnerVolumeSpecName "kube-api-access-8h52h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:59:44 crc kubenswrapper[4730]: I0320 15:59:44.581984    4730 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c94c6d8-4c40-455a-a536-7c64e3838986-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:44 crc kubenswrapper[4730]: I0320 15:59:44.582025    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8h52h\" (UniqueName: \"kubernetes.io/projected/1c94c6d8-4c40-455a-a536-7c64e3838986-kube-api-access-8h52h\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:45 crc kubenswrapper[4730]: I0320 15:59:45.021632    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qpb6s" event={"ID":"1c94c6d8-4c40-455a-a536-7c64e3838986","Type":"ContainerDied","Data":"4c5e1f649837c47339a572132afe7d468b5386426786fd3de46f1f95322e65e5"}
Mar 20 15:59:45 crc kubenswrapper[4730]: I0320 15:59:45.021691    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c5e1f649837c47339a572132afe7d468b5386426786fd3de46f1f95322e65e5"
Mar 20 15:59:45 crc kubenswrapper[4730]: I0320 15:59:45.021736    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qpb6s"
Mar 20 15:59:46 crc kubenswrapper[4730]: I0320 15:59:46.422409    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59fc649cc7-tct2h"
Mar 20 15:59:46 crc kubenswrapper[4730]: I0320 15:59:46.473809    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68f7d4448c-cvqk4"]
Mar 20 15:59:46 crc kubenswrapper[4730]: I0320 15:59:46.474065    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68f7d4448c-cvqk4" podUID="3ae0793d-af8a-4808-b632-9f8b22a4d0c0" containerName="dnsmasq-dns" containerID="cri-o://cca89ebad14e458072774d3863e92e5ae8961f7af2fa95160875c04d8dd584a6" gracePeriod=10
Mar 20 15:59:47 crc kubenswrapper[4730]: I0320 15:59:47.042051    4730 generic.go:334] "Generic (PLEG): container finished" podID="3ae0793d-af8a-4808-b632-9f8b22a4d0c0" containerID="cca89ebad14e458072774d3863e92e5ae8961f7af2fa95160875c04d8dd584a6" exitCode=0
Mar 20 15:59:47 crc kubenswrapper[4730]: I0320 15:59:47.042114    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68f7d4448c-cvqk4" event={"ID":"3ae0793d-af8a-4808-b632-9f8b22a4d0c0","Type":"ContainerDied","Data":"cca89ebad14e458072774d3863e92e5ae8961f7af2fa95160875c04d8dd584a6"}
Mar 20 15:59:47 crc kubenswrapper[4730]: I0320 15:59:47.933687    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c423-account-create-update-dcjc2"
Mar 20 15:59:48 crc kubenswrapper[4730]: I0320 15:59:48.036447    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7r4t9\" (UniqueName: \"kubernetes.io/projected/06e5575b-c67a-46fe-8502-efc341523de2-kube-api-access-7r4t9\") pod \"06e5575b-c67a-46fe-8502-efc341523de2\" (UID: \"06e5575b-c67a-46fe-8502-efc341523de2\") "
Mar 20 15:59:48 crc kubenswrapper[4730]: I0320 15:59:48.036505    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06e5575b-c67a-46fe-8502-efc341523de2-operator-scripts\") pod \"06e5575b-c67a-46fe-8502-efc341523de2\" (UID: \"06e5575b-c67a-46fe-8502-efc341523de2\") "
Mar 20 15:59:48 crc kubenswrapper[4730]: I0320 15:59:48.037100    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06e5575b-c67a-46fe-8502-efc341523de2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "06e5575b-c67a-46fe-8502-efc341523de2" (UID: "06e5575b-c67a-46fe-8502-efc341523de2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:59:48 crc kubenswrapper[4730]: I0320 15:59:48.037224    4730 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06e5575b-c67a-46fe-8502-efc341523de2-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:48 crc kubenswrapper[4730]: I0320 15:59:48.043737    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06e5575b-c67a-46fe-8502-efc341523de2-kube-api-access-7r4t9" (OuterVolumeSpecName: "kube-api-access-7r4t9") pod "06e5575b-c67a-46fe-8502-efc341523de2" (UID: "06e5575b-c67a-46fe-8502-efc341523de2"). InnerVolumeSpecName "kube-api-access-7r4t9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:59:48 crc kubenswrapper[4730]: I0320 15:59:48.053855    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c423-account-create-update-dcjc2" event={"ID":"06e5575b-c67a-46fe-8502-efc341523de2","Type":"ContainerDied","Data":"2876d90345315b2a0809c889345800e43703e0183ea1a884fd91389c2ca8ac2a"}
Mar 20 15:59:48 crc kubenswrapper[4730]: I0320 15:59:48.053988    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2876d90345315b2a0809c889345800e43703e0183ea1a884fd91389c2ca8ac2a"
Mar 20 15:59:48 crc kubenswrapper[4730]: I0320 15:59:48.054112    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c423-account-create-update-dcjc2"
Mar 20 15:59:48 crc kubenswrapper[4730]: I0320 15:59:48.138603    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7r4t9\" (UniqueName: \"kubernetes.io/projected/06e5575b-c67a-46fe-8502-efc341523de2-kube-api-access-7r4t9\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:51 crc kubenswrapper[4730]: I0320 15:59:51.287777    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68f7d4448c-cvqk4"
Mar 20 15:59:51 crc kubenswrapper[4730]: I0320 15:59:51.398469    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ae0793d-af8a-4808-b632-9f8b22a4d0c0-config\") pod \"3ae0793d-af8a-4808-b632-9f8b22a4d0c0\" (UID: \"3ae0793d-af8a-4808-b632-9f8b22a4d0c0\") "
Mar 20 15:59:51 crc kubenswrapper[4730]: I0320 15:59:51.398548    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ae0793d-af8a-4808-b632-9f8b22a4d0c0-dns-svc\") pod \"3ae0793d-af8a-4808-b632-9f8b22a4d0c0\" (UID: \"3ae0793d-af8a-4808-b632-9f8b22a4d0c0\") "
Mar 20 15:59:51 crc kubenswrapper[4730]: I0320 15:59:51.398586    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ae0793d-af8a-4808-b632-9f8b22a4d0c0-ovsdbserver-sb\") pod \"3ae0793d-af8a-4808-b632-9f8b22a4d0c0\" (UID: \"3ae0793d-af8a-4808-b632-9f8b22a4d0c0\") "
Mar 20 15:59:51 crc kubenswrapper[4730]: I0320 15:59:51.398609    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ae0793d-af8a-4808-b632-9f8b22a4d0c0-ovsdbserver-nb\") pod \"3ae0793d-af8a-4808-b632-9f8b22a4d0c0\" (UID: \"3ae0793d-af8a-4808-b632-9f8b22a4d0c0\") "
Mar 20 15:59:51 crc kubenswrapper[4730]: I0320 15:59:51.399346    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hslv\" (UniqueName: \"kubernetes.io/projected/3ae0793d-af8a-4808-b632-9f8b22a4d0c0-kube-api-access-9hslv\") pod \"3ae0793d-af8a-4808-b632-9f8b22a4d0c0\" (UID: \"3ae0793d-af8a-4808-b632-9f8b22a4d0c0\") "
Mar 20 15:59:51 crc kubenswrapper[4730]: I0320 15:59:51.403794    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ae0793d-af8a-4808-b632-9f8b22a4d0c0-kube-api-access-9hslv" (OuterVolumeSpecName: "kube-api-access-9hslv") pod "3ae0793d-af8a-4808-b632-9f8b22a4d0c0" (UID: "3ae0793d-af8a-4808-b632-9f8b22a4d0c0"). InnerVolumeSpecName "kube-api-access-9hslv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:59:51 crc kubenswrapper[4730]: I0320 15:59:51.438832    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ae0793d-af8a-4808-b632-9f8b22a4d0c0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3ae0793d-af8a-4808-b632-9f8b22a4d0c0" (UID: "3ae0793d-af8a-4808-b632-9f8b22a4d0c0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:59:51 crc kubenswrapper[4730]: I0320 15:59:51.441623    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ae0793d-af8a-4808-b632-9f8b22a4d0c0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3ae0793d-af8a-4808-b632-9f8b22a4d0c0" (UID: "3ae0793d-af8a-4808-b632-9f8b22a4d0c0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:59:51 crc kubenswrapper[4730]: I0320 15:59:51.443163    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ae0793d-af8a-4808-b632-9f8b22a4d0c0-config" (OuterVolumeSpecName: "config") pod "3ae0793d-af8a-4808-b632-9f8b22a4d0c0" (UID: "3ae0793d-af8a-4808-b632-9f8b22a4d0c0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:59:51 crc kubenswrapper[4730]: I0320 15:59:51.444418    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ae0793d-af8a-4808-b632-9f8b22a4d0c0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3ae0793d-af8a-4808-b632-9f8b22a4d0c0" (UID: "3ae0793d-af8a-4808-b632-9f8b22a4d0c0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:59:51 crc kubenswrapper[4730]: I0320 15:59:51.501396    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ae0793d-af8a-4808-b632-9f8b22a4d0c0-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:51 crc kubenswrapper[4730]: I0320 15:59:51.501430    4730 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ae0793d-af8a-4808-b632-9f8b22a4d0c0-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:51 crc kubenswrapper[4730]: I0320 15:59:51.501444    4730 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ae0793d-af8a-4808-b632-9f8b22a4d0c0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:51 crc kubenswrapper[4730]: I0320 15:59:51.501454    4730 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ae0793d-af8a-4808-b632-9f8b22a4d0c0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:51 crc kubenswrapper[4730]: I0320 15:59:51.501465    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hslv\" (UniqueName: \"kubernetes.io/projected/3ae0793d-af8a-4808-b632-9f8b22a4d0c0-kube-api-access-9hslv\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:52 crc kubenswrapper[4730]: I0320 15:59:52.105948    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68f7d4448c-cvqk4" event={"ID":"3ae0793d-af8a-4808-b632-9f8b22a4d0c0","Type":"ContainerDied","Data":"12c589b54c6d611757ef6559c9af4693c0c76b7c0e6f07d2b8ae5108a83f1489"}
Mar 20 15:59:52 crc kubenswrapper[4730]: I0320 15:59:52.106015    4730 scope.go:117] "RemoveContainer" containerID="cca89ebad14e458072774d3863e92e5ae8961f7af2fa95160875c04d8dd584a6"
Mar 20 15:59:52 crc kubenswrapper[4730]: I0320 15:59:52.106178    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68f7d4448c-cvqk4"
Mar 20 15:59:52 crc kubenswrapper[4730]: I0320 15:59:52.114507    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-ns6b5" event={"ID":"9577f66b-a45e-4d51-9d87-4ae757819182","Type":"ContainerStarted","Data":"32fe76fbff47bfdd3ed0a42b1fb587052917346b2dd9af6a6803fc8251d250e7"}
Mar 20 15:59:52 crc kubenswrapper[4730]: I0320 15:59:52.117210    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rb4pw" event={"ID":"92a7eed8-de7c-4816-8bd9-e922ace376ad","Type":"ContainerStarted","Data":"29ed2b28b91aee9b1496fc9ae566fd345c663655b5eba7831621c42547aa8e83"}
Mar 20 15:59:52 crc kubenswrapper[4730]: I0320 15:59:52.146284    4730 scope.go:117] "RemoveContainer" containerID="ed9c122c62fe6334e758d54aed5e4b1a868adadd18c7ec06bf5ae96e604dfa76"
Mar 20 15:59:52 crc kubenswrapper[4730]: I0320 15:59:52.186681    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-ns6b5" podStartSLOduration=1.482184874 podStartE2EDuration="11.186660089s" podCreationTimestamp="2026-03-20 15:59:41 +0000 UTC" firstStartedPulling="2026-03-20 15:59:42.122381947 +0000 UTC m=+1241.335753316" lastFinishedPulling="2026-03-20 15:59:51.826857152 +0000 UTC m=+1251.040228531" observedRunningTime="2026-03-20 15:59:52.143568835 +0000 UTC m=+1251.356940204" watchObservedRunningTime="2026-03-20 15:59:52.186660089 +0000 UTC m=+1251.400031458"
Mar 20 15:59:52 crc kubenswrapper[4730]: I0320 15:59:52.200047    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68f7d4448c-cvqk4"]
Mar 20 15:59:52 crc kubenswrapper[4730]: I0320 15:59:52.207406    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68f7d4448c-cvqk4"]
Mar 20 15:59:52 crc kubenswrapper[4730]: I0320 15:59:52.209802    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-rb4pw" podStartSLOduration=1.310520081 podStartE2EDuration="12.209784336s" podCreationTimestamp="2026-03-20 15:59:40 +0000 UTC" firstStartedPulling="2026-03-20 15:59:40.906806152 +0000 UTC m=+1240.120177551" lastFinishedPulling="2026-03-20 15:59:51.806070417 +0000 UTC m=+1251.019441806" observedRunningTime="2026-03-20 15:59:52.196766935 +0000 UTC m=+1251.410138304" watchObservedRunningTime="2026-03-20 15:59:52.209784336 +0000 UTC m=+1251.423155705"
Mar 20 15:59:53 crc kubenswrapper[4730]: I0320 15:59:53.547558    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ae0793d-af8a-4808-b632-9f8b22a4d0c0" path="/var/lib/kubelet/pods/3ae0793d-af8a-4808-b632-9f8b22a4d0c0/volumes"
Mar 20 15:59:55 crc kubenswrapper[4730]: I0320 15:59:55.642571    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-68f7d4448c-cvqk4" podUID="3ae0793d-af8a-4808-b632-9f8b22a4d0c0" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.140:5353: i/o timeout"
Mar 20 15:59:56 crc kubenswrapper[4730]: I0320 15:59:56.158460    4730 generic.go:334] "Generic (PLEG): container finished" podID="9577f66b-a45e-4d51-9d87-4ae757819182" containerID="32fe76fbff47bfdd3ed0a42b1fb587052917346b2dd9af6a6803fc8251d250e7" exitCode=0
Mar 20 15:59:56 crc kubenswrapper[4730]: I0320 15:59:56.158522    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-ns6b5" event={"ID":"9577f66b-a45e-4d51-9d87-4ae757819182","Type":"ContainerDied","Data":"32fe76fbff47bfdd3ed0a42b1fb587052917346b2dd9af6a6803fc8251d250e7"}
Mar 20 15:59:57 crc kubenswrapper[4730]: I0320 15:59:57.168024    4730 generic.go:334] "Generic (PLEG): container finished" podID="92a7eed8-de7c-4816-8bd9-e922ace376ad" containerID="29ed2b28b91aee9b1496fc9ae566fd345c663655b5eba7831621c42547aa8e83" exitCode=0
Mar 20 15:59:57 crc kubenswrapper[4730]: I0320 15:59:57.168151    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rb4pw" event={"ID":"92a7eed8-de7c-4816-8bd9-e922ace376ad","Type":"ContainerDied","Data":"29ed2b28b91aee9b1496fc9ae566fd345c663655b5eba7831621c42547aa8e83"}
Mar 20 15:59:57 crc kubenswrapper[4730]: I0320 15:59:57.480969    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-ns6b5"
Mar 20 15:59:57 crc kubenswrapper[4730]: I0320 15:59:57.608891    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pz9h\" (UniqueName: \"kubernetes.io/projected/9577f66b-a45e-4d51-9d87-4ae757819182-kube-api-access-9pz9h\") pod \"9577f66b-a45e-4d51-9d87-4ae757819182\" (UID: \"9577f66b-a45e-4d51-9d87-4ae757819182\") "
Mar 20 15:59:57 crc kubenswrapper[4730]: I0320 15:59:57.608990    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9577f66b-a45e-4d51-9d87-4ae757819182-config-data\") pod \"9577f66b-a45e-4d51-9d87-4ae757819182\" (UID: \"9577f66b-a45e-4d51-9d87-4ae757819182\") "
Mar 20 15:59:57 crc kubenswrapper[4730]: I0320 15:59:57.609106    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9577f66b-a45e-4d51-9d87-4ae757819182-db-sync-config-data\") pod \"9577f66b-a45e-4d51-9d87-4ae757819182\" (UID: \"9577f66b-a45e-4d51-9d87-4ae757819182\") "
Mar 20 15:59:57 crc kubenswrapper[4730]: I0320 15:59:57.609208    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9577f66b-a45e-4d51-9d87-4ae757819182-combined-ca-bundle\") pod \"9577f66b-a45e-4d51-9d87-4ae757819182\" (UID: \"9577f66b-a45e-4d51-9d87-4ae757819182\") "
Mar 20 15:59:57 crc kubenswrapper[4730]: I0320 15:59:57.615142    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9577f66b-a45e-4d51-9d87-4ae757819182-kube-api-access-9pz9h" (OuterVolumeSpecName: "kube-api-access-9pz9h") pod "9577f66b-a45e-4d51-9d87-4ae757819182" (UID: "9577f66b-a45e-4d51-9d87-4ae757819182"). InnerVolumeSpecName "kube-api-access-9pz9h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:59:57 crc kubenswrapper[4730]: I0320 15:59:57.616497    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9577f66b-a45e-4d51-9d87-4ae757819182-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9577f66b-a45e-4d51-9d87-4ae757819182" (UID: "9577f66b-a45e-4d51-9d87-4ae757819182"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:59:57 crc kubenswrapper[4730]: I0320 15:59:57.642146    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9577f66b-a45e-4d51-9d87-4ae757819182-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9577f66b-a45e-4d51-9d87-4ae757819182" (UID: "9577f66b-a45e-4d51-9d87-4ae757819182"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:59:57 crc kubenswrapper[4730]: I0320 15:59:57.668023    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9577f66b-a45e-4d51-9d87-4ae757819182-config-data" (OuterVolumeSpecName: "config-data") pod "9577f66b-a45e-4d51-9d87-4ae757819182" (UID: "9577f66b-a45e-4d51-9d87-4ae757819182"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:59:57 crc kubenswrapper[4730]: I0320 15:59:57.712991    4730 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9577f66b-a45e-4d51-9d87-4ae757819182-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:57 crc kubenswrapper[4730]: I0320 15:59:57.713033    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9577f66b-a45e-4d51-9d87-4ae757819182-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:57 crc kubenswrapper[4730]: I0320 15:59:57.713047    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pz9h\" (UniqueName: \"kubernetes.io/projected/9577f66b-a45e-4d51-9d87-4ae757819182-kube-api-access-9pz9h\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:57 crc kubenswrapper[4730]: I0320 15:59:57.713061    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9577f66b-a45e-4d51-9d87-4ae757819182-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:58 crc kubenswrapper[4730]: I0320 15:59:58.182897    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-ns6b5"
Mar 20 15:59:58 crc kubenswrapper[4730]: I0320 15:59:58.182859    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-ns6b5" event={"ID":"9577f66b-a45e-4d51-9d87-4ae757819182","Type":"ContainerDied","Data":"2a09ead92eb5e9f3f31718e3636e13dfb29ffdd6eee5420ea19a1b73c6cb9b82"}
Mar 20 15:59:58 crc kubenswrapper[4730]: I0320 15:59:58.182999    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a09ead92eb5e9f3f31718e3636e13dfb29ffdd6eee5420ea19a1b73c6cb9b82"
Mar 20 15:59:58 crc kubenswrapper[4730]: I0320 15:59:58.609719    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-rb4pw"
Mar 20 15:59:58 crc kubenswrapper[4730]: I0320 15:59:58.756795    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92a7eed8-de7c-4816-8bd9-e922ace376ad-config-data\") pod \"92a7eed8-de7c-4816-8bd9-e922ace376ad\" (UID: \"92a7eed8-de7c-4816-8bd9-e922ace376ad\") "
Mar 20 15:59:58 crc kubenswrapper[4730]: I0320 15:59:58.757059    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92a7eed8-de7c-4816-8bd9-e922ace376ad-combined-ca-bundle\") pod \"92a7eed8-de7c-4816-8bd9-e922ace376ad\" (UID: \"92a7eed8-de7c-4816-8bd9-e922ace376ad\") "
Mar 20 15:59:58 crc kubenswrapper[4730]: I0320 15:59:58.757538    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4hmq\" (UniqueName: \"kubernetes.io/projected/92a7eed8-de7c-4816-8bd9-e922ace376ad-kube-api-access-h4hmq\") pod \"92a7eed8-de7c-4816-8bd9-e922ace376ad\" (UID: \"92a7eed8-de7c-4816-8bd9-e922ace376ad\") "
Mar 20 15:59:58 crc kubenswrapper[4730]: I0320 15:59:58.761961    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92a7eed8-de7c-4816-8bd9-e922ace376ad-kube-api-access-h4hmq" (OuterVolumeSpecName: "kube-api-access-h4hmq") pod "92a7eed8-de7c-4816-8bd9-e922ace376ad" (UID: "92a7eed8-de7c-4816-8bd9-e922ace376ad"). InnerVolumeSpecName "kube-api-access-h4hmq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:59:58 crc kubenswrapper[4730]: I0320 15:59:58.782699    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a7eed8-de7c-4816-8bd9-e922ace376ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92a7eed8-de7c-4816-8bd9-e922ace376ad" (UID: "92a7eed8-de7c-4816-8bd9-e922ace376ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:59:58 crc kubenswrapper[4730]: I0320 15:59:58.796782    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a7eed8-de7c-4816-8bd9-e922ace376ad-config-data" (OuterVolumeSpecName: "config-data") pod "92a7eed8-de7c-4816-8bd9-e922ace376ad" (UID: "92a7eed8-de7c-4816-8bd9-e922ace376ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:59:58 crc kubenswrapper[4730]: I0320 15:59:58.859827    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92a7eed8-de7c-4816-8bd9-e922ace376ad-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:58 crc kubenswrapper[4730]: I0320 15:59:58.859858    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92a7eed8-de7c-4816-8bd9-e922ace376ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:58 crc kubenswrapper[4730]: I0320 15:59:58.859870    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4hmq\" (UniqueName: \"kubernetes.io/projected/92a7eed8-de7c-4816-8bd9-e922ace376ad-kube-api-access-h4hmq\") on node \"crc\" DevicePath \"\""
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.191896    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rb4pw" event={"ID":"92a7eed8-de7c-4816-8bd9-e922ace376ad","Type":"ContainerDied","Data":"15e39e3baf091f230b52a236e8e09de020ef95a8e3541076e0af482275abf9c1"}
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.191950    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-rb4pw"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.191959    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15e39e3baf091f230b52a236e8e09de020ef95a8e3541076e0af482275abf9c1"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.424396    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-rmtsq"]
Mar 20 15:59:59 crc kubenswrapper[4730]: E0320 15:59:59.425181    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae0793d-af8a-4808-b632-9f8b22a4d0c0" containerName="init"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.425210    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae0793d-af8a-4808-b632-9f8b22a4d0c0" containerName="init"
Mar 20 15:59:59 crc kubenswrapper[4730]: E0320 15:59:59.425239    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a72513-75fb-4b7e-912b-d28fa63d050a" containerName="mariadb-account-create-update"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.425268    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a72513-75fb-4b7e-912b-d28fa63d050a" containerName="mariadb-account-create-update"
Mar 20 15:59:59 crc kubenswrapper[4730]: E0320 15:59:59.425285    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a7eed8-de7c-4816-8bd9-e922ace376ad" containerName="keystone-db-sync"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.425293    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a7eed8-de7c-4816-8bd9-e922ace376ad" containerName="keystone-db-sync"
Mar 20 15:59:59 crc kubenswrapper[4730]: E0320 15:59:59.425311    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae0793d-af8a-4808-b632-9f8b22a4d0c0" containerName="dnsmasq-dns"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.425319    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae0793d-af8a-4808-b632-9f8b22a4d0c0" containerName="dnsmasq-dns"
Mar 20 15:59:59 crc kubenswrapper[4730]: E0320 15:59:59.425343    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06e5575b-c67a-46fe-8502-efc341523de2" containerName="mariadb-account-create-update"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.425352    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="06e5575b-c67a-46fe-8502-efc341523de2" containerName="mariadb-account-create-update"
Mar 20 15:59:59 crc kubenswrapper[4730]: E0320 15:59:59.425363    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c94c6d8-4c40-455a-a536-7c64e3838986" containerName="mariadb-database-create"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.425371    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c94c6d8-4c40-455a-a536-7c64e3838986" containerName="mariadb-database-create"
Mar 20 15:59:59 crc kubenswrapper[4730]: E0320 15:59:59.425381    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e01c2575-5301-494a-bf47-9a6053de9c64" containerName="mariadb-account-create-update"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.425389    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="e01c2575-5301-494a-bf47-9a6053de9c64" containerName="mariadb-account-create-update"
Mar 20 15:59:59 crc kubenswrapper[4730]: E0320 15:59:59.425404    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad93c0a8-34d6-4fee-985c-7c7307f00c0c" containerName="mariadb-database-create"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.425412    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad93c0a8-34d6-4fee-985c-7c7307f00c0c" containerName="mariadb-database-create"
Mar 20 15:59:59 crc kubenswrapper[4730]: E0320 15:59:59.425423    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9577f66b-a45e-4d51-9d87-4ae757819182" containerName="watcher-db-sync"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.425431    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="9577f66b-a45e-4d51-9d87-4ae757819182" containerName="watcher-db-sync"
Mar 20 15:59:59 crc kubenswrapper[4730]: E0320 15:59:59.425446    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6118ed31-b8d7-4a7c-8769-69d996d26915" containerName="mariadb-database-create"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.425455    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="6118ed31-b8d7-4a7c-8769-69d996d26915" containerName="mariadb-database-create"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.425654    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="06e5575b-c67a-46fe-8502-efc341523de2" containerName="mariadb-account-create-update"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.425684    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="92a7eed8-de7c-4816-8bd9-e922ace376ad" containerName="keystone-db-sync"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.425715    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="44a72513-75fb-4b7e-912b-d28fa63d050a" containerName="mariadb-account-create-update"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.425742    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="9577f66b-a45e-4d51-9d87-4ae757819182" containerName="watcher-db-sync"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.425754    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="e01c2575-5301-494a-bf47-9a6053de9c64" containerName="mariadb-account-create-update"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.425774    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="6118ed31-b8d7-4a7c-8769-69d996d26915" containerName="mariadb-database-create"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.425794    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ae0793d-af8a-4808-b632-9f8b22a4d0c0" containerName="dnsmasq-dns"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.425806    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad93c0a8-34d6-4fee-985c-7c7307f00c0c" containerName="mariadb-database-create"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.425825    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c94c6d8-4c40-455a-a536-7c64e3838986" containerName="mariadb-database-create"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.426630    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rmtsq"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.430449    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.430524    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.430619    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pjvk4"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.432856    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.435338    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-758cdcd5c9-9m42d"]
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.437189    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-758cdcd5c9-9m42d"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.438761    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.446951    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-758cdcd5c9-9m42d"]
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.573564    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3349c2c-6f29-425d-9d25-b4f23821cfcc-scripts\") pod \"keystone-bootstrap-rmtsq\" (UID: \"e3349c2c-6f29-425d-9d25-b4f23821cfcc\") " pod="openstack/keystone-bootstrap-rmtsq"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.573619    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c857c34a-0efc-4ebe-8f42-e562e88de7a4-ovsdbserver-nb\") pod \"dnsmasq-dns-758cdcd5c9-9m42d\" (UID: \"c857c34a-0efc-4ebe-8f42-e562e88de7a4\") " pod="openstack/dnsmasq-dns-758cdcd5c9-9m42d"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.573660    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4d8h\" (UniqueName: \"kubernetes.io/projected/c857c34a-0efc-4ebe-8f42-e562e88de7a4-kube-api-access-q4d8h\") pod \"dnsmasq-dns-758cdcd5c9-9m42d\" (UID: \"c857c34a-0efc-4ebe-8f42-e562e88de7a4\") " pod="openstack/dnsmasq-dns-758cdcd5c9-9m42d"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.573703    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3349c2c-6f29-425d-9d25-b4f23821cfcc-config-data\") pod \"keystone-bootstrap-rmtsq\" (UID: \"e3349c2c-6f29-425d-9d25-b4f23821cfcc\") " pod="openstack/keystone-bootstrap-rmtsq"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.573723    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj56l\" (UniqueName: \"kubernetes.io/projected/e3349c2c-6f29-425d-9d25-b4f23821cfcc-kube-api-access-qj56l\") pod \"keystone-bootstrap-rmtsq\" (UID: \"e3349c2c-6f29-425d-9d25-b4f23821cfcc\") " pod="openstack/keystone-bootstrap-rmtsq"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.573775    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3349c2c-6f29-425d-9d25-b4f23821cfcc-combined-ca-bundle\") pod \"keystone-bootstrap-rmtsq\" (UID: \"e3349c2c-6f29-425d-9d25-b4f23821cfcc\") " pod="openstack/keystone-bootstrap-rmtsq"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.573798    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c857c34a-0efc-4ebe-8f42-e562e88de7a4-config\") pod \"dnsmasq-dns-758cdcd5c9-9m42d\" (UID: \"c857c34a-0efc-4ebe-8f42-e562e88de7a4\") " pod="openstack/dnsmasq-dns-758cdcd5c9-9m42d"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.573816    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c857c34a-0efc-4ebe-8f42-e562e88de7a4-ovsdbserver-sb\") pod \"dnsmasq-dns-758cdcd5c9-9m42d\" (UID: \"c857c34a-0efc-4ebe-8f42-e562e88de7a4\") " pod="openstack/dnsmasq-dns-758cdcd5c9-9m42d"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.573837    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c857c34a-0efc-4ebe-8f42-e562e88de7a4-dns-swift-storage-0\") pod \"dnsmasq-dns-758cdcd5c9-9m42d\" (UID: \"c857c34a-0efc-4ebe-8f42-e562e88de7a4\") " pod="openstack/dnsmasq-dns-758cdcd5c9-9m42d"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.573870    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e3349c2c-6f29-425d-9d25-b4f23821cfcc-credential-keys\") pod \"keystone-bootstrap-rmtsq\" (UID: \"e3349c2c-6f29-425d-9d25-b4f23821cfcc\") " pod="openstack/keystone-bootstrap-rmtsq"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.573912    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e3349c2c-6f29-425d-9d25-b4f23821cfcc-fernet-keys\") pod \"keystone-bootstrap-rmtsq\" (UID: \"e3349c2c-6f29-425d-9d25-b4f23821cfcc\") " pod="openstack/keystone-bootstrap-rmtsq"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.573967    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c857c34a-0efc-4ebe-8f42-e562e88de7a4-dns-svc\") pod \"dnsmasq-dns-758cdcd5c9-9m42d\" (UID: \"c857c34a-0efc-4ebe-8f42-e562e88de7a4\") " pod="openstack/dnsmasq-dns-758cdcd5c9-9m42d"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.597631    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rmtsq"]
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.624548    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"]
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.625727    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.632139    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-h4zsn"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.632719    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.638331    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"]
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.639458    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.645465    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.653977    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"]
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.675811    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3349c2c-6f29-425d-9d25-b4f23821cfcc-config-data\") pod \"keystone-bootstrap-rmtsq\" (UID: \"e3349c2c-6f29-425d-9d25-b4f23821cfcc\") " pod="openstack/keystone-bootstrap-rmtsq"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.675865    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj56l\" (UniqueName: \"kubernetes.io/projected/e3349c2c-6f29-425d-9d25-b4f23821cfcc-kube-api-access-qj56l\") pod \"keystone-bootstrap-rmtsq\" (UID: \"e3349c2c-6f29-425d-9d25-b4f23821cfcc\") " pod="openstack/keystone-bootstrap-rmtsq"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.675936    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3349c2c-6f29-425d-9d25-b4f23821cfcc-combined-ca-bundle\") pod \"keystone-bootstrap-rmtsq\" (UID: \"e3349c2c-6f29-425d-9d25-b4f23821cfcc\") " pod="openstack/keystone-bootstrap-rmtsq"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.675971    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c857c34a-0efc-4ebe-8f42-e562e88de7a4-config\") pod \"dnsmasq-dns-758cdcd5c9-9m42d\" (UID: \"c857c34a-0efc-4ebe-8f42-e562e88de7a4\") " pod="openstack/dnsmasq-dns-758cdcd5c9-9m42d"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.675994    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c857c34a-0efc-4ebe-8f42-e562e88de7a4-ovsdbserver-sb\") pod \"dnsmasq-dns-758cdcd5c9-9m42d\" (UID: \"c857c34a-0efc-4ebe-8f42-e562e88de7a4\") " pod="openstack/dnsmasq-dns-758cdcd5c9-9m42d"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.676017    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c857c34a-0efc-4ebe-8f42-e562e88de7a4-dns-swift-storage-0\") pod \"dnsmasq-dns-758cdcd5c9-9m42d\" (UID: \"c857c34a-0efc-4ebe-8f42-e562e88de7a4\") " pod="openstack/dnsmasq-dns-758cdcd5c9-9m42d"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.676049    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e3349c2c-6f29-425d-9d25-b4f23821cfcc-credential-keys\") pod \"keystone-bootstrap-rmtsq\" (UID: \"e3349c2c-6f29-425d-9d25-b4f23821cfcc\") " pod="openstack/keystone-bootstrap-rmtsq"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.676095    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e3349c2c-6f29-425d-9d25-b4f23821cfcc-fernet-keys\") pod \"keystone-bootstrap-rmtsq\" (UID: \"e3349c2c-6f29-425d-9d25-b4f23821cfcc\") " pod="openstack/keystone-bootstrap-rmtsq"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.676167    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c857c34a-0efc-4ebe-8f42-e562e88de7a4-dns-svc\") pod \"dnsmasq-dns-758cdcd5c9-9m42d\" (UID: \"c857c34a-0efc-4ebe-8f42-e562e88de7a4\") " pod="openstack/dnsmasq-dns-758cdcd5c9-9m42d"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.676261    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3349c2c-6f29-425d-9d25-b4f23821cfcc-scripts\") pod \"keystone-bootstrap-rmtsq\" (UID: \"e3349c2c-6f29-425d-9d25-b4f23821cfcc\") " pod="openstack/keystone-bootstrap-rmtsq"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.676292    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c857c34a-0efc-4ebe-8f42-e562e88de7a4-ovsdbserver-nb\") pod \"dnsmasq-dns-758cdcd5c9-9m42d\" (UID: \"c857c34a-0efc-4ebe-8f42-e562e88de7a4\") " pod="openstack/dnsmasq-dns-758cdcd5c9-9m42d"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.676333    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4d8h\" (UniqueName: \"kubernetes.io/projected/c857c34a-0efc-4ebe-8f42-e562e88de7a4-kube-api-access-q4d8h\") pod \"dnsmasq-dns-758cdcd5c9-9m42d\" (UID: \"c857c34a-0efc-4ebe-8f42-e562e88de7a4\") " pod="openstack/dnsmasq-dns-758cdcd5c9-9m42d"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.679306    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"]
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.681329    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c857c34a-0efc-4ebe-8f42-e562e88de7a4-config\") pod \"dnsmasq-dns-758cdcd5c9-9m42d\" (UID: \"c857c34a-0efc-4ebe-8f42-e562e88de7a4\") " pod="openstack/dnsmasq-dns-758cdcd5c9-9m42d"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.683143    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c857c34a-0efc-4ebe-8f42-e562e88de7a4-dns-swift-storage-0\") pod \"dnsmasq-dns-758cdcd5c9-9m42d\" (UID: \"c857c34a-0efc-4ebe-8f42-e562e88de7a4\") " pod="openstack/dnsmasq-dns-758cdcd5c9-9m42d"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.688839    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c857c34a-0efc-4ebe-8f42-e562e88de7a4-dns-svc\") pod \"dnsmasq-dns-758cdcd5c9-9m42d\" (UID: \"c857c34a-0efc-4ebe-8f42-e562e88de7a4\") " pod="openstack/dnsmasq-dns-758cdcd5c9-9m42d"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.693560    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c857c34a-0efc-4ebe-8f42-e562e88de7a4-ovsdbserver-sb\") pod \"dnsmasq-dns-758cdcd5c9-9m42d\" (UID: \"c857c34a-0efc-4ebe-8f42-e562e88de7a4\") " pod="openstack/dnsmasq-dns-758cdcd5c9-9m42d"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.694301    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c857c34a-0efc-4ebe-8f42-e562e88de7a4-ovsdbserver-nb\") pod \"dnsmasq-dns-758cdcd5c9-9m42d\" (UID: \"c857c34a-0efc-4ebe-8f42-e562e88de7a4\") " pod="openstack/dnsmasq-dns-758cdcd5c9-9m42d"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.695044    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e3349c2c-6f29-425d-9d25-b4f23821cfcc-credential-keys\") pod \"keystone-bootstrap-rmtsq\" (UID: \"e3349c2c-6f29-425d-9d25-b4f23821cfcc\") " pod="openstack/keystone-bootstrap-rmtsq"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.700941    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3349c2c-6f29-425d-9d25-b4f23821cfcc-combined-ca-bundle\") pod \"keystone-bootstrap-rmtsq\" (UID: \"e3349c2c-6f29-425d-9d25-b4f23821cfcc\") " pod="openstack/keystone-bootstrap-rmtsq"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.712619    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"]
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.714063    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.722600    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3349c2c-6f29-425d-9d25-b4f23821cfcc-scripts\") pod \"keystone-bootstrap-rmtsq\" (UID: \"e3349c2c-6f29-425d-9d25-b4f23821cfcc\") " pod="openstack/keystone-bootstrap-rmtsq"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.727851    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e3349c2c-6f29-425d-9d25-b4f23821cfcc-fernet-keys\") pod \"keystone-bootstrap-rmtsq\" (UID: \"e3349c2c-6f29-425d-9d25-b4f23821cfcc\") " pod="openstack/keystone-bootstrap-rmtsq"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.728380    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3349c2c-6f29-425d-9d25-b4f23821cfcc-config-data\") pod \"keystone-bootstrap-rmtsq\" (UID: \"e3349c2c-6f29-425d-9d25-b4f23821cfcc\") " pod="openstack/keystone-bootstrap-rmtsq"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.728725    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.730740    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4d8h\" (UniqueName: \"kubernetes.io/projected/c857c34a-0efc-4ebe-8f42-e562e88de7a4-kube-api-access-q4d8h\") pod \"dnsmasq-dns-758cdcd5c9-9m42d\" (UID: \"c857c34a-0efc-4ebe-8f42-e562e88de7a4\") " pod="openstack/dnsmasq-dns-758cdcd5c9-9m42d"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.736440    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.738812    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj56l\" (UniqueName: \"kubernetes.io/projected/e3349c2c-6f29-425d-9d25-b4f23821cfcc-kube-api-access-qj56l\") pod \"keystone-bootstrap-rmtsq\" (UID: \"e3349c2c-6f29-425d-9d25-b4f23821cfcc\") " pod="openstack/keystone-bootstrap-rmtsq"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.764669    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rmtsq"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.779341    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f653a7b-251e-4eb2-92cd-74e23ac4dba5-logs\") pod \"watcher-applier-0\" (UID: \"9f653a7b-251e-4eb2-92cd-74e23ac4dba5\") " pod="openstack/watcher-applier-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.779384    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z2tj\" (UniqueName: \"kubernetes.io/projected/3f6c808e-d523-48bd-8ec2-28b625834317-kube-api-access-2z2tj\") pod \"watcher-decision-engine-0\" (UID: \"3f6c808e-d523-48bd-8ec2-28b625834317\") " pod="openstack/watcher-decision-engine-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.779405    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f653a7b-251e-4eb2-92cd-74e23ac4dba5-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"9f653a7b-251e-4eb2-92cd-74e23ac4dba5\") " pod="openstack/watcher-applier-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.779442    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3f6c808e-d523-48bd-8ec2-28b625834317-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"3f6c808e-d523-48bd-8ec2-28b625834317\") " pod="openstack/watcher-decision-engine-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.779463    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f6c808e-d523-48bd-8ec2-28b625834317-logs\") pod \"watcher-decision-engine-0\" (UID: \"3f6c808e-d523-48bd-8ec2-28b625834317\") " pod="openstack/watcher-decision-engine-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.779490    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52bmh\" (UniqueName: \"kubernetes.io/projected/9f653a7b-251e-4eb2-92cd-74e23ac4dba5-kube-api-access-52bmh\") pod \"watcher-applier-0\" (UID: \"9f653a7b-251e-4eb2-92cd-74e23ac4dba5\") " pod="openstack/watcher-applier-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.779524    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f6c808e-d523-48bd-8ec2-28b625834317-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"3f6c808e-d523-48bd-8ec2-28b625834317\") " pod="openstack/watcher-decision-engine-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.779571    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f6c808e-d523-48bd-8ec2-28b625834317-config-data\") pod \"watcher-decision-engine-0\" (UID: \"3f6c808e-d523-48bd-8ec2-28b625834317\") " pod="openstack/watcher-decision-engine-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.779625    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f653a7b-251e-4eb2-92cd-74e23ac4dba5-config-data\") pod \"watcher-applier-0\" (UID: \"9f653a7b-251e-4eb2-92cd-74e23ac4dba5\") " pod="openstack/watcher-applier-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.780106    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-758cdcd5c9-9m42d"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.821157    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-z9mtx"]
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.822441    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-z9mtx"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.832879    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.833180    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-c2wgv"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.833346    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.846616    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-z9mtx"]
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.884614    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/468631ad-821b-469e-a166-1d32d370e5fa-config-data\") pod \"watcher-api-0\" (UID: \"468631ad-821b-469e-a166-1d32d370e5fa\") " pod="openstack/watcher-api-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.884688    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnvh7\" (UniqueName: \"kubernetes.io/projected/468631ad-821b-469e-a166-1d32d370e5fa-kube-api-access-jnvh7\") pod \"watcher-api-0\" (UID: \"468631ad-821b-469e-a166-1d32d370e5fa\") " pod="openstack/watcher-api-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.884749    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f6c808e-d523-48bd-8ec2-28b625834317-config-data\") pod \"watcher-decision-engine-0\" (UID: \"3f6c808e-d523-48bd-8ec2-28b625834317\") " pod="openstack/watcher-decision-engine-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.884809    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/468631ad-821b-469e-a166-1d32d370e5fa-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"468631ad-821b-469e-a166-1d32d370e5fa\") " pod="openstack/watcher-api-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.884845    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/468631ad-821b-469e-a166-1d32d370e5fa-logs\") pod \"watcher-api-0\" (UID: \"468631ad-821b-469e-a166-1d32d370e5fa\") " pod="openstack/watcher-api-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.884874    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f653a7b-251e-4eb2-92cd-74e23ac4dba5-config-data\") pod \"watcher-applier-0\" (UID: \"9f653a7b-251e-4eb2-92cd-74e23ac4dba5\") " pod="openstack/watcher-applier-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.884905    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f653a7b-251e-4eb2-92cd-74e23ac4dba5-logs\") pod \"watcher-applier-0\" (UID: \"9f653a7b-251e-4eb2-92cd-74e23ac4dba5\") " pod="openstack/watcher-applier-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.884930    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z2tj\" (UniqueName: \"kubernetes.io/projected/3f6c808e-d523-48bd-8ec2-28b625834317-kube-api-access-2z2tj\") pod \"watcher-decision-engine-0\" (UID: \"3f6c808e-d523-48bd-8ec2-28b625834317\") " pod="openstack/watcher-decision-engine-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.884956    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f653a7b-251e-4eb2-92cd-74e23ac4dba5-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"9f653a7b-251e-4eb2-92cd-74e23ac4dba5\") " pod="openstack/watcher-applier-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.884986    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/468631ad-821b-469e-a166-1d32d370e5fa-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"468631ad-821b-469e-a166-1d32d370e5fa\") " pod="openstack/watcher-api-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.885026    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3f6c808e-d523-48bd-8ec2-28b625834317-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"3f6c808e-d523-48bd-8ec2-28b625834317\") " pod="openstack/watcher-decision-engine-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.885061    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f6c808e-d523-48bd-8ec2-28b625834317-logs\") pod \"watcher-decision-engine-0\" (UID: \"3f6c808e-d523-48bd-8ec2-28b625834317\") " pod="openstack/watcher-decision-engine-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.885093    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52bmh\" (UniqueName: \"kubernetes.io/projected/9f653a7b-251e-4eb2-92cd-74e23ac4dba5-kube-api-access-52bmh\") pod \"watcher-applier-0\" (UID: \"9f653a7b-251e-4eb2-92cd-74e23ac4dba5\") " pod="openstack/watcher-applier-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.885163    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f6c808e-d523-48bd-8ec2-28b625834317-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"3f6c808e-d523-48bd-8ec2-28b625834317\") " pod="openstack/watcher-decision-engine-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.886560    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f653a7b-251e-4eb2-92cd-74e23ac4dba5-logs\") pod \"watcher-applier-0\" (UID: \"9f653a7b-251e-4eb2-92cd-74e23ac4dba5\") " pod="openstack/watcher-applier-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.889218    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f6c808e-d523-48bd-8ec2-28b625834317-logs\") pod \"watcher-decision-engine-0\" (UID: \"3f6c808e-d523-48bd-8ec2-28b625834317\") " pod="openstack/watcher-decision-engine-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.906911    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f653a7b-251e-4eb2-92cd-74e23ac4dba5-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"9f653a7b-251e-4eb2-92cd-74e23ac4dba5\") " pod="openstack/watcher-applier-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.908462    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f6c808e-d523-48bd-8ec2-28b625834317-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"3f6c808e-d523-48bd-8ec2-28b625834317\") " pod="openstack/watcher-decision-engine-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.914121    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3f6c808e-d523-48bd-8ec2-28b625834317-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"3f6c808e-d523-48bd-8ec2-28b625834317\") " pod="openstack/watcher-decision-engine-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.914951    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f653a7b-251e-4eb2-92cd-74e23ac4dba5-config-data\") pod \"watcher-applier-0\" (UID: \"9f653a7b-251e-4eb2-92cd-74e23ac4dba5\") " pod="openstack/watcher-applier-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.919271    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z2tj\" (UniqueName: \"kubernetes.io/projected/3f6c808e-d523-48bd-8ec2-28b625834317-kube-api-access-2z2tj\") pod \"watcher-decision-engine-0\" (UID: \"3f6c808e-d523-48bd-8ec2-28b625834317\") " pod="openstack/watcher-decision-engine-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.920712    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f6c808e-d523-48bd-8ec2-28b625834317-config-data\") pod \"watcher-decision-engine-0\" (UID: \"3f6c808e-d523-48bd-8ec2-28b625834317\") " pod="openstack/watcher-decision-engine-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.932233    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52bmh\" (UniqueName: \"kubernetes.io/projected/9f653a7b-251e-4eb2-92cd-74e23ac4dba5-kube-api-access-52bmh\") pod \"watcher-applier-0\" (UID: \"9f653a7b-251e-4eb2-92cd-74e23ac4dba5\") " pod="openstack/watcher-applier-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.954338    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.963467    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.965505    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.973580    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.973655    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.988153    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.989161    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7sfc\" (UniqueName: \"kubernetes.io/projected/09f27249-61fb-4e13-9eb9-9b804f256d81-kube-api-access-t7sfc\") pod \"neutron-db-sync-z9mtx\" (UID: \"09f27249-61fb-4e13-9eb9-9b804f256d81\") " pod="openstack/neutron-db-sync-z9mtx"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.989201    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/468631ad-821b-469e-a166-1d32d370e5fa-config-data\") pod \"watcher-api-0\" (UID: \"468631ad-821b-469e-a166-1d32d370e5fa\") " pod="openstack/watcher-api-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.989229    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnvh7\" (UniqueName: \"kubernetes.io/projected/468631ad-821b-469e-a166-1d32d370e5fa-kube-api-access-jnvh7\") pod \"watcher-api-0\" (UID: \"468631ad-821b-469e-a166-1d32d370e5fa\") " pod="openstack/watcher-api-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.989275    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/09f27249-61fb-4e13-9eb9-9b804f256d81-config\") pod \"neutron-db-sync-z9mtx\" (UID: \"09f27249-61fb-4e13-9eb9-9b804f256d81\") " pod="openstack/neutron-db-sync-z9mtx"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.989316    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/468631ad-821b-469e-a166-1d32d370e5fa-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"468631ad-821b-469e-a166-1d32d370e5fa\") " pod="openstack/watcher-api-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.989339    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/468631ad-821b-469e-a166-1d32d370e5fa-logs\") pod \"watcher-api-0\" (UID: \"468631ad-821b-469e-a166-1d32d370e5fa\") " pod="openstack/watcher-api-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.989371    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/468631ad-821b-469e-a166-1d32d370e5fa-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"468631ad-821b-469e-a166-1d32d370e5fa\") " pod="openstack/watcher-api-0"
Mar 20 15:59:59 crc kubenswrapper[4730]: I0320 15:59:59.989400    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09f27249-61fb-4e13-9eb9-9b804f256d81-combined-ca-bundle\") pod \"neutron-db-sync-z9mtx\" (UID: \"09f27249-61fb-4e13-9eb9-9b804f256d81\") " pod="openstack/neutron-db-sync-z9mtx"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.000493    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/468631ad-821b-469e-a166-1d32d370e5fa-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"468631ad-821b-469e-a166-1d32d370e5fa\") " pod="openstack/watcher-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.000756    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/468631ad-821b-469e-a166-1d32d370e5fa-logs\") pod \"watcher-api-0\" (UID: \"468631ad-821b-469e-a166-1d32d370e5fa\") " pod="openstack/watcher-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.003760    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.004652    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/468631ad-821b-469e-a166-1d32d370e5fa-config-data\") pod \"watcher-api-0\" (UID: \"468631ad-821b-469e-a166-1d32d370e5fa\") " pod="openstack/watcher-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.007383    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-hbplf"]
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.008761    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-hbplf"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.018892    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/468631ad-821b-469e-a166-1d32d370e5fa-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"468631ad-821b-469e-a166-1d32d370e5fa\") " pod="openstack/watcher-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.019346    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.019539    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-bg7s8"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.019623    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.034753    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnvh7\" (UniqueName: \"kubernetes.io/projected/468631ad-821b-469e-a166-1d32d370e5fa-kube-api-access-jnvh7\") pod \"watcher-api-0\" (UID: \"468631ad-821b-469e-a166-1d32d370e5fa\") " pod="openstack/watcher-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.060557    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-hbplf"]
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.082066    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-tz6x7"]
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.083108    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-tz6x7"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.084911    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-wcvgq"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.086406    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.091423    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/223c97f9-0680-47b8-bc2e-1c914296d29e-run-httpd\") pod \"ceilometer-0\" (UID: \"223c97f9-0680-47b8-bc2e-1c914296d29e\") " pod="openstack/ceilometer-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.091461    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/223c97f9-0680-47b8-bc2e-1c914296d29e-scripts\") pod \"ceilometer-0\" (UID: \"223c97f9-0680-47b8-bc2e-1c914296d29e\") " pod="openstack/ceilometer-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.091490    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/223c97f9-0680-47b8-bc2e-1c914296d29e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"223c97f9-0680-47b8-bc2e-1c914296d29e\") " pod="openstack/ceilometer-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.091523    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-scripts\") pod \"cinder-db-sync-hbplf\" (UID: \"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd\") " pod="openstack/cinder-db-sync-hbplf"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.091552    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/223c97f9-0680-47b8-bc2e-1c914296d29e-log-httpd\") pod \"ceilometer-0\" (UID: \"223c97f9-0680-47b8-bc2e-1c914296d29e\") " pod="openstack/ceilometer-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.091587    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24qfr\" (UniqueName: \"kubernetes.io/projected/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-kube-api-access-24qfr\") pod \"cinder-db-sync-hbplf\" (UID: \"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd\") " pod="openstack/cinder-db-sync-hbplf"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.091604    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-db-sync-config-data\") pod \"cinder-db-sync-hbplf\" (UID: \"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd\") " pod="openstack/cinder-db-sync-hbplf"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.091625    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/223c97f9-0680-47b8-bc2e-1c914296d29e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"223c97f9-0680-47b8-bc2e-1c914296d29e\") " pod="openstack/ceilometer-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.091646    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09f27249-61fb-4e13-9eb9-9b804f256d81-combined-ca-bundle\") pod \"neutron-db-sync-z9mtx\" (UID: \"09f27249-61fb-4e13-9eb9-9b804f256d81\") " pod="openstack/neutron-db-sync-z9mtx"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.091671    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-combined-ca-bundle\") pod \"cinder-db-sync-hbplf\" (UID: \"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd\") " pod="openstack/cinder-db-sync-hbplf"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.091689    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-etc-machine-id\") pod \"cinder-db-sync-hbplf\" (UID: \"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd\") " pod="openstack/cinder-db-sync-hbplf"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.091711    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7sfc\" (UniqueName: \"kubernetes.io/projected/09f27249-61fb-4e13-9eb9-9b804f256d81-kube-api-access-t7sfc\") pod \"neutron-db-sync-z9mtx\" (UID: \"09f27249-61fb-4e13-9eb9-9b804f256d81\") " pod="openstack/neutron-db-sync-z9mtx"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.091749    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-config-data\") pod \"cinder-db-sync-hbplf\" (UID: \"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd\") " pod="openstack/cinder-db-sync-hbplf"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.091771    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/223c97f9-0680-47b8-bc2e-1c914296d29e-config-data\") pod \"ceilometer-0\" (UID: \"223c97f9-0680-47b8-bc2e-1c914296d29e\") " pod="openstack/ceilometer-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.091787    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np4pc\" (UniqueName: \"kubernetes.io/projected/223c97f9-0680-47b8-bc2e-1c914296d29e-kube-api-access-np4pc\") pod \"ceilometer-0\" (UID: \"223c97f9-0680-47b8-bc2e-1c914296d29e\") " pod="openstack/ceilometer-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.091811    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/09f27249-61fb-4e13-9eb9-9b804f256d81-config\") pod \"neutron-db-sync-z9mtx\" (UID: \"09f27249-61fb-4e13-9eb9-9b804f256d81\") " pod="openstack/neutron-db-sync-z9mtx"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.094664    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-tz6x7"]
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.097820    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/09f27249-61fb-4e13-9eb9-9b804f256d81-config\") pod \"neutron-db-sync-z9mtx\" (UID: \"09f27249-61fb-4e13-9eb9-9b804f256d81\") " pod="openstack/neutron-db-sync-z9mtx"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.101116    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09f27249-61fb-4e13-9eb9-9b804f256d81-combined-ca-bundle\") pod \"neutron-db-sync-z9mtx\" (UID: \"09f27249-61fb-4e13-9eb9-9b804f256d81\") " pod="openstack/neutron-db-sync-z9mtx"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.113268    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-758cdcd5c9-9m42d"]
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.146215    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7sfc\" (UniqueName: \"kubernetes.io/projected/09f27249-61fb-4e13-9eb9-9b804f256d81-kube-api-access-t7sfc\") pod \"neutron-db-sync-z9mtx\" (UID: \"09f27249-61fb-4e13-9eb9-9b804f256d81\") " pod="openstack/neutron-db-sync-z9mtx"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.162978    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-x2t9r"]
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.164123    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-x2t9r"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.166486    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-wvjmt"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.166963    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.167136    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.185676    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77878fc4cf-8s7hs"]
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.187044    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77878fc4cf-8s7hs"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.195673    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48fc8af0-e30f-4f3f-88d3-8b054c6359ef-combined-ca-bundle\") pod \"barbican-db-sync-tz6x7\" (UID: \"48fc8af0-e30f-4f3f-88d3-8b054c6359ef\") " pod="openstack/barbican-db-sync-tz6x7"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.195738    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-config-data\") pod \"cinder-db-sync-hbplf\" (UID: \"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd\") " pod="openstack/cinder-db-sync-hbplf"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.195772    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/223c97f9-0680-47b8-bc2e-1c914296d29e-config-data\") pod \"ceilometer-0\" (UID: \"223c97f9-0680-47b8-bc2e-1c914296d29e\") " pod="openstack/ceilometer-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.195790    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np4pc\" (UniqueName: \"kubernetes.io/projected/223c97f9-0680-47b8-bc2e-1c914296d29e-kube-api-access-np4pc\") pod \"ceilometer-0\" (UID: \"223c97f9-0680-47b8-bc2e-1c914296d29e\") " pod="openstack/ceilometer-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.195943    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/223c97f9-0680-47b8-bc2e-1c914296d29e-run-httpd\") pod \"ceilometer-0\" (UID: \"223c97f9-0680-47b8-bc2e-1c914296d29e\") " pod="openstack/ceilometer-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.195981    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/223c97f9-0680-47b8-bc2e-1c914296d29e-scripts\") pod \"ceilometer-0\" (UID: \"223c97f9-0680-47b8-bc2e-1c914296d29e\") " pod="openstack/ceilometer-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.196002    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bggsr\" (UniqueName: \"kubernetes.io/projected/48fc8af0-e30f-4f3f-88d3-8b054c6359ef-kube-api-access-bggsr\") pod \"barbican-db-sync-tz6x7\" (UID: \"48fc8af0-e30f-4f3f-88d3-8b054c6359ef\") " pod="openstack/barbican-db-sync-tz6x7"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.196028    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/223c97f9-0680-47b8-bc2e-1c914296d29e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"223c97f9-0680-47b8-bc2e-1c914296d29e\") " pod="openstack/ceilometer-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.196770    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-scripts\") pod \"cinder-db-sync-hbplf\" (UID: \"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd\") " pod="openstack/cinder-db-sync-hbplf"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.196822    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/223c97f9-0680-47b8-bc2e-1c914296d29e-log-httpd\") pod \"ceilometer-0\" (UID: \"223c97f9-0680-47b8-bc2e-1c914296d29e\") " pod="openstack/ceilometer-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.197114    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24qfr\" (UniqueName: \"kubernetes.io/projected/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-kube-api-access-24qfr\") pod \"cinder-db-sync-hbplf\" (UID: \"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd\") " pod="openstack/cinder-db-sync-hbplf"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.197139    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-db-sync-config-data\") pod \"cinder-db-sync-hbplf\" (UID: \"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd\") " pod="openstack/cinder-db-sync-hbplf"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.197169    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/223c97f9-0680-47b8-bc2e-1c914296d29e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"223c97f9-0680-47b8-bc2e-1c914296d29e\") " pod="openstack/ceilometer-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.197212    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-combined-ca-bundle\") pod \"cinder-db-sync-hbplf\" (UID: \"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd\") " pod="openstack/cinder-db-sync-hbplf"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.197231    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/48fc8af0-e30f-4f3f-88d3-8b054c6359ef-db-sync-config-data\") pod \"barbican-db-sync-tz6x7\" (UID: \"48fc8af0-e30f-4f3f-88d3-8b054c6359ef\") " pod="openstack/barbican-db-sync-tz6x7"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.197265    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-etc-machine-id\") pod \"cinder-db-sync-hbplf\" (UID: \"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd\") " pod="openstack/cinder-db-sync-hbplf"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.197370    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-etc-machine-id\") pod \"cinder-db-sync-hbplf\" (UID: \"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd\") " pod="openstack/cinder-db-sync-hbplf"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.197704    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/223c97f9-0680-47b8-bc2e-1c914296d29e-run-httpd\") pod \"ceilometer-0\" (UID: \"223c97f9-0680-47b8-bc2e-1c914296d29e\") " pod="openstack/ceilometer-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.200499    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/223c97f9-0680-47b8-bc2e-1c914296d29e-scripts\") pod \"ceilometer-0\" (UID: \"223c97f9-0680-47b8-bc2e-1c914296d29e\") " pod="openstack/ceilometer-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.203969    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/223c97f9-0680-47b8-bc2e-1c914296d29e-log-httpd\") pod \"ceilometer-0\" (UID: \"223c97f9-0680-47b8-bc2e-1c914296d29e\") " pod="openstack/ceilometer-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.206050    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/223c97f9-0680-47b8-bc2e-1c914296d29e-config-data\") pod \"ceilometer-0\" (UID: \"223c97f9-0680-47b8-bc2e-1c914296d29e\") " pod="openstack/ceilometer-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.206871    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-config-data\") pod \"cinder-db-sync-hbplf\" (UID: \"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd\") " pod="openstack/cinder-db-sync-hbplf"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.207603    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-scripts\") pod \"cinder-db-sync-hbplf\" (UID: \"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd\") " pod="openstack/cinder-db-sync-hbplf"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.208124    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-combined-ca-bundle\") pod \"cinder-db-sync-hbplf\" (UID: \"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd\") " pod="openstack/cinder-db-sync-hbplf"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.208485    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-db-sync-config-data\") pod \"cinder-db-sync-hbplf\" (UID: \"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd\") " pod="openstack/cinder-db-sync-hbplf"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.211413    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-x2t9r"]
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.221985    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/223c97f9-0680-47b8-bc2e-1c914296d29e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"223c97f9-0680-47b8-bc2e-1c914296d29e\") " pod="openstack/ceilometer-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.224560    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77878fc4cf-8s7hs"]
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.227232    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np4pc\" (UniqueName: \"kubernetes.io/projected/223c97f9-0680-47b8-bc2e-1c914296d29e-kube-api-access-np4pc\") pod \"ceilometer-0\" (UID: \"223c97f9-0680-47b8-bc2e-1c914296d29e\") " pod="openstack/ceilometer-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.227558    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/223c97f9-0680-47b8-bc2e-1c914296d29e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"223c97f9-0680-47b8-bc2e-1c914296d29e\") " pod="openstack/ceilometer-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.231020    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24qfr\" (UniqueName: \"kubernetes.io/projected/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-kube-api-access-24qfr\") pod \"cinder-db-sync-hbplf\" (UID: \"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd\") " pod="openstack/cinder-db-sync-hbplf"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.270884    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.307437    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567040-2zl4f"]
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.310092    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bggsr\" (UniqueName: \"kubernetes.io/projected/48fc8af0-e30f-4f3f-88d3-8b054c6359ef-kube-api-access-bggsr\") pod \"barbican-db-sync-tz6x7\" (UID: \"48fc8af0-e30f-4f3f-88d3-8b054c6359ef\") " pod="openstack/barbican-db-sync-tz6x7"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.310236    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-config-data\") pod \"placement-db-sync-x2t9r\" (UID: \"a05675d7-cd2f-4810-862b-cb0d2d13cbdd\") " pod="openstack/placement-db-sync-x2t9r"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.310331    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82ffcdbb-cebb-443a-a8af-3c3543bea13d-ovsdbserver-sb\") pod \"dnsmasq-dns-77878fc4cf-8s7hs\" (UID: \"82ffcdbb-cebb-443a-a8af-3c3543bea13d\") " pod="openstack/dnsmasq-dns-77878fc4cf-8s7hs"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.310399    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-combined-ca-bundle\") pod \"placement-db-sync-x2t9r\" (UID: \"a05675d7-cd2f-4810-862b-cb0d2d13cbdd\") " pod="openstack/placement-db-sync-x2t9r"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.310479    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-scripts\") pod \"placement-db-sync-x2t9r\" (UID: \"a05675d7-cd2f-4810-862b-cb0d2d13cbdd\") " pod="openstack/placement-db-sync-x2t9r"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.310516    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/48fc8af0-e30f-4f3f-88d3-8b054c6359ef-db-sync-config-data\") pod \"barbican-db-sync-tz6x7\" (UID: \"48fc8af0-e30f-4f3f-88d3-8b054c6359ef\") " pod="openstack/barbican-db-sync-tz6x7"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.310585    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5dq8\" (UniqueName: \"kubernetes.io/projected/82ffcdbb-cebb-443a-a8af-3c3543bea13d-kube-api-access-m5dq8\") pod \"dnsmasq-dns-77878fc4cf-8s7hs\" (UID: \"82ffcdbb-cebb-443a-a8af-3c3543bea13d\") " pod="openstack/dnsmasq-dns-77878fc4cf-8s7hs"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.310627    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82ffcdbb-cebb-443a-a8af-3c3543bea13d-config\") pod \"dnsmasq-dns-77878fc4cf-8s7hs\" (UID: \"82ffcdbb-cebb-443a-a8af-3c3543bea13d\") " pod="openstack/dnsmasq-dns-77878fc4cf-8s7hs"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.310661    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82ffcdbb-cebb-443a-a8af-3c3543bea13d-dns-svc\") pod \"dnsmasq-dns-77878fc4cf-8s7hs\" (UID: \"82ffcdbb-cebb-443a-a8af-3c3543bea13d\") " pod="openstack/dnsmasq-dns-77878fc4cf-8s7hs"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.310691    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82ffcdbb-cebb-443a-a8af-3c3543bea13d-dns-swift-storage-0\") pod \"dnsmasq-dns-77878fc4cf-8s7hs\" (UID: \"82ffcdbb-cebb-443a-a8af-3c3543bea13d\") " pod="openstack/dnsmasq-dns-77878fc4cf-8s7hs"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.310743    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48fc8af0-e30f-4f3f-88d3-8b054c6359ef-combined-ca-bundle\") pod \"barbican-db-sync-tz6x7\" (UID: \"48fc8af0-e30f-4f3f-88d3-8b054c6359ef\") " pod="openstack/barbican-db-sync-tz6x7"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.310808    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82ffcdbb-cebb-443a-a8af-3c3543bea13d-ovsdbserver-nb\") pod \"dnsmasq-dns-77878fc4cf-8s7hs\" (UID: \"82ffcdbb-cebb-443a-a8af-3c3543bea13d\") " pod="openstack/dnsmasq-dns-77878fc4cf-8s7hs"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.310848    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-logs\") pod \"placement-db-sync-x2t9r\" (UID: \"a05675d7-cd2f-4810-862b-cb0d2d13cbdd\") " pod="openstack/placement-db-sync-x2t9r"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.310898    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxgg6\" (UniqueName: \"kubernetes.io/projected/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-kube-api-access-xxgg6\") pod \"placement-db-sync-x2t9r\" (UID: \"a05675d7-cd2f-4810-862b-cb0d2d13cbdd\") " pod="openstack/placement-db-sync-x2t9r"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.316012    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567040-2zl4f"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.317767    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48fc8af0-e30f-4f3f-88d3-8b054c6359ef-combined-ca-bundle\") pod \"barbican-db-sync-tz6x7\" (UID: \"48fc8af0-e30f-4f3f-88d3-8b054c6359ef\") " pod="openstack/barbican-db-sync-tz6x7"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.323595    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.323840    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.324028    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.329969    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567040-cz69b"]
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.329974    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/48fc8af0-e30f-4f3f-88d3-8b054c6359ef-db-sync-config-data\") pod \"barbican-db-sync-tz6x7\" (UID: \"48fc8af0-e30f-4f3f-88d3-8b054c6359ef\") " pod="openstack/barbican-db-sync-tz6x7"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.333080    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-cz69b"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.337876    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.338169    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.338645    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-z9mtx"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.339103    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bggsr\" (UniqueName: \"kubernetes.io/projected/48fc8af0-e30f-4f3f-88d3-8b054c6359ef-kube-api-access-bggsr\") pod \"barbican-db-sync-tz6x7\" (UID: \"48fc8af0-e30f-4f3f-88d3-8b054c6359ef\") " pod="openstack/barbican-db-sync-tz6x7"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.355132    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567040-2zl4f"]
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.359756    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.386647    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567040-cz69b"]
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.412161    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-hbplf"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.412678    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5dq8\" (UniqueName: \"kubernetes.io/projected/82ffcdbb-cebb-443a-a8af-3c3543bea13d-kube-api-access-m5dq8\") pod \"dnsmasq-dns-77878fc4cf-8s7hs\" (UID: \"82ffcdbb-cebb-443a-a8af-3c3543bea13d\") " pod="openstack/dnsmasq-dns-77878fc4cf-8s7hs"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.412994    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82ffcdbb-cebb-443a-a8af-3c3543bea13d-config\") pod \"dnsmasq-dns-77878fc4cf-8s7hs\" (UID: \"82ffcdbb-cebb-443a-a8af-3c3543bea13d\") " pod="openstack/dnsmasq-dns-77878fc4cf-8s7hs"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.413018    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82ffcdbb-cebb-443a-a8af-3c3543bea13d-dns-svc\") pod \"dnsmasq-dns-77878fc4cf-8s7hs\" (UID: \"82ffcdbb-cebb-443a-a8af-3c3543bea13d\") " pod="openstack/dnsmasq-dns-77878fc4cf-8s7hs"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.413778    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82ffcdbb-cebb-443a-a8af-3c3543bea13d-config\") pod \"dnsmasq-dns-77878fc4cf-8s7hs\" (UID: \"82ffcdbb-cebb-443a-a8af-3c3543bea13d\") " pod="openstack/dnsmasq-dns-77878fc4cf-8s7hs"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.413842    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82ffcdbb-cebb-443a-a8af-3c3543bea13d-dns-swift-storage-0\") pod \"dnsmasq-dns-77878fc4cf-8s7hs\" (UID: \"82ffcdbb-cebb-443a-a8af-3c3543bea13d\") " pod="openstack/dnsmasq-dns-77878fc4cf-8s7hs"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.413983    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82ffcdbb-cebb-443a-a8af-3c3543bea13d-dns-svc\") pod \"dnsmasq-dns-77878fc4cf-8s7hs\" (UID: \"82ffcdbb-cebb-443a-a8af-3c3543bea13d\") " pod="openstack/dnsmasq-dns-77878fc4cf-8s7hs"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.414901    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82ffcdbb-cebb-443a-a8af-3c3543bea13d-dns-swift-storage-0\") pod \"dnsmasq-dns-77878fc4cf-8s7hs\" (UID: \"82ffcdbb-cebb-443a-a8af-3c3543bea13d\") " pod="openstack/dnsmasq-dns-77878fc4cf-8s7hs"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.414986    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82ffcdbb-cebb-443a-a8af-3c3543bea13d-ovsdbserver-nb\") pod \"dnsmasq-dns-77878fc4cf-8s7hs\" (UID: \"82ffcdbb-cebb-443a-a8af-3c3543bea13d\") " pod="openstack/dnsmasq-dns-77878fc4cf-8s7hs"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.415190    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-logs\") pod \"placement-db-sync-x2t9r\" (UID: \"a05675d7-cd2f-4810-862b-cb0d2d13cbdd\") " pod="openstack/placement-db-sync-x2t9r"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.415321    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxgg6\" (UniqueName: \"kubernetes.io/projected/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-kube-api-access-xxgg6\") pod \"placement-db-sync-x2t9r\" (UID: \"a05675d7-cd2f-4810-862b-cb0d2d13cbdd\") " pod="openstack/placement-db-sync-x2t9r"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.415375    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/672cfda1-2ec8-41fe-b3dc-eabe4e60726d-config-volume\") pod \"collect-profiles-29567040-cz69b\" (UID: \"672cfda1-2ec8-41fe-b3dc-eabe4e60726d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-cz69b"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.415510    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-config-data\") pod \"placement-db-sync-x2t9r\" (UID: \"a05675d7-cd2f-4810-862b-cb0d2d13cbdd\") " pod="openstack/placement-db-sync-x2t9r"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.415669    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82ffcdbb-cebb-443a-a8af-3c3543bea13d-ovsdbserver-sb\") pod \"dnsmasq-dns-77878fc4cf-8s7hs\" (UID: \"82ffcdbb-cebb-443a-a8af-3c3543bea13d\") " pod="openstack/dnsmasq-dns-77878fc4cf-8s7hs"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.415715    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxpm7\" (UniqueName: \"kubernetes.io/projected/672cfda1-2ec8-41fe-b3dc-eabe4e60726d-kube-api-access-wxpm7\") pod \"collect-profiles-29567040-cz69b\" (UID: \"672cfda1-2ec8-41fe-b3dc-eabe4e60726d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-cz69b"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.415786    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-combined-ca-bundle\") pod \"placement-db-sync-x2t9r\" (UID: \"a05675d7-cd2f-4810-862b-cb0d2d13cbdd\") " pod="openstack/placement-db-sync-x2t9r"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.415922    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-459cr\" (UniqueName: \"kubernetes.io/projected/97b63a10-b572-4a37-a2a4-079852aa2d3d-kube-api-access-459cr\") pod \"auto-csr-approver-29567040-2zl4f\" (UID: \"97b63a10-b572-4a37-a2a4-079852aa2d3d\") " pod="openshift-infra/auto-csr-approver-29567040-2zl4f"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.415992    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-scripts\") pod \"placement-db-sync-x2t9r\" (UID: \"a05675d7-cd2f-4810-862b-cb0d2d13cbdd\") " pod="openstack/placement-db-sync-x2t9r"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.416041    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/672cfda1-2ec8-41fe-b3dc-eabe4e60726d-secret-volume\") pod \"collect-profiles-29567040-cz69b\" (UID: \"672cfda1-2ec8-41fe-b3dc-eabe4e60726d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-cz69b"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.417217    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82ffcdbb-cebb-443a-a8af-3c3543bea13d-ovsdbserver-nb\") pod \"dnsmasq-dns-77878fc4cf-8s7hs\" (UID: \"82ffcdbb-cebb-443a-a8af-3c3543bea13d\") " pod="openstack/dnsmasq-dns-77878fc4cf-8s7hs"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.417722    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82ffcdbb-cebb-443a-a8af-3c3543bea13d-ovsdbserver-sb\") pod \"dnsmasq-dns-77878fc4cf-8s7hs\" (UID: \"82ffcdbb-cebb-443a-a8af-3c3543bea13d\") " pod="openstack/dnsmasq-dns-77878fc4cf-8s7hs"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.420077    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-config-data\") pod \"placement-db-sync-x2t9r\" (UID: \"a05675d7-cd2f-4810-862b-cb0d2d13cbdd\") " pod="openstack/placement-db-sync-x2t9r"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.420433    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-logs\") pod \"placement-db-sync-x2t9r\" (UID: \"a05675d7-cd2f-4810-862b-cb0d2d13cbdd\") " pod="openstack/placement-db-sync-x2t9r"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.420575    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-combined-ca-bundle\") pod \"placement-db-sync-x2t9r\" (UID: \"a05675d7-cd2f-4810-862b-cb0d2d13cbdd\") " pod="openstack/placement-db-sync-x2t9r"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.426599    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-scripts\") pod \"placement-db-sync-x2t9r\" (UID: \"a05675d7-cd2f-4810-862b-cb0d2d13cbdd\") " pod="openstack/placement-db-sync-x2t9r"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.431116    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5dq8\" (UniqueName: \"kubernetes.io/projected/82ffcdbb-cebb-443a-a8af-3c3543bea13d-kube-api-access-m5dq8\") pod \"dnsmasq-dns-77878fc4cf-8s7hs\" (UID: \"82ffcdbb-cebb-443a-a8af-3c3543bea13d\") " pod="openstack/dnsmasq-dns-77878fc4cf-8s7hs"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.438192    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-tz6x7"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.441148    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxgg6\" (UniqueName: \"kubernetes.io/projected/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-kube-api-access-xxgg6\") pod \"placement-db-sync-x2t9r\" (UID: \"a05675d7-cd2f-4810-862b-cb0d2d13cbdd\") " pod="openstack/placement-db-sync-x2t9r"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.482053    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rmtsq"]
Mar 20 16:00:00 crc kubenswrapper[4730]: W0320 16:00:00.494319    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3349c2c_6f29_425d_9d25_b4f23821cfcc.slice/crio-707ec669900453b1a2293614ef0436d415043c661d842387b86130234173b70a WatchSource:0}: Error finding container 707ec669900453b1a2293614ef0436d415043c661d842387b86130234173b70a: Status 404 returned error can't find the container with id 707ec669900453b1a2293614ef0436d415043c661d842387b86130234173b70a
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.501262    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-x2t9r"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.517833    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/672cfda1-2ec8-41fe-b3dc-eabe4e60726d-config-volume\") pod \"collect-profiles-29567040-cz69b\" (UID: \"672cfda1-2ec8-41fe-b3dc-eabe4e60726d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-cz69b"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.517998    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxpm7\" (UniqueName: \"kubernetes.io/projected/672cfda1-2ec8-41fe-b3dc-eabe4e60726d-kube-api-access-wxpm7\") pod \"collect-profiles-29567040-cz69b\" (UID: \"672cfda1-2ec8-41fe-b3dc-eabe4e60726d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-cz69b"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.518041    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-459cr\" (UniqueName: \"kubernetes.io/projected/97b63a10-b572-4a37-a2a4-079852aa2d3d-kube-api-access-459cr\") pod \"auto-csr-approver-29567040-2zl4f\" (UID: \"97b63a10-b572-4a37-a2a4-079852aa2d3d\") " pod="openshift-infra/auto-csr-approver-29567040-2zl4f"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.518089    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/672cfda1-2ec8-41fe-b3dc-eabe4e60726d-secret-volume\") pod \"collect-profiles-29567040-cz69b\" (UID: \"672cfda1-2ec8-41fe-b3dc-eabe4e60726d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-cz69b"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.520033    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/672cfda1-2ec8-41fe-b3dc-eabe4e60726d-config-volume\") pod \"collect-profiles-29567040-cz69b\" (UID: \"672cfda1-2ec8-41fe-b3dc-eabe4e60726d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-cz69b"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.525747    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/672cfda1-2ec8-41fe-b3dc-eabe4e60726d-secret-volume\") pod \"collect-profiles-29567040-cz69b\" (UID: \"672cfda1-2ec8-41fe-b3dc-eabe4e60726d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-cz69b"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.538101    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77878fc4cf-8s7hs"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.538208    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxpm7\" (UniqueName: \"kubernetes.io/projected/672cfda1-2ec8-41fe-b3dc-eabe4e60726d-kube-api-access-wxpm7\") pod \"collect-profiles-29567040-cz69b\" (UID: \"672cfda1-2ec8-41fe-b3dc-eabe4e60726d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-cz69b"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.556637    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-459cr\" (UniqueName: \"kubernetes.io/projected/97b63a10-b572-4a37-a2a4-079852aa2d3d-kube-api-access-459cr\") pod \"auto-csr-approver-29567040-2zl4f\" (UID: \"97b63a10-b572-4a37-a2a4-079852aa2d3d\") " pod="openshift-infra/auto-csr-approver-29567040-2zl4f"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.618305    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.620064    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.634518    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-758cdcd5c9-9m42d"]
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.640898    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-b8g88"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.641055    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.641163    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.641430    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.647042    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.679944    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567040-2zl4f"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.686870    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-cz69b"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.713569    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"]
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.733730    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4458311-d050-4887-b4c7-6df6c993d66e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.733826    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.733861    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c4458311-d050-4887-b4c7-6df6c993d66e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.733904    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgxjd\" (UniqueName: \"kubernetes.io/projected/c4458311-d050-4887-b4c7-6df6c993d66e-kube-api-access-jgxjd\") pod \"glance-default-internal-api-0\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.733960    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4458311-d050-4887-b4c7-6df6c993d66e-logs\") pod \"glance-default-internal-api-0\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.733988    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4458311-d050-4887-b4c7-6df6c993d66e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.734107    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4458311-d050-4887-b4c7-6df6c993d66e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.734141    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4458311-d050-4887-b4c7-6df6c993d66e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.760847    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"]
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.771876    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.775643    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.780575    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.780912    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.799211    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.835349    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b821c271-d46c-4a68-a6ea-438e616c4d47-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.835390    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b821c271-d46c-4a68-a6ea-438e616c4d47-scripts\") pod \"glance-default-external-api-0\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.835434    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b821c271-d46c-4a68-a6ea-438e616c4d47-config-data\") pod \"glance-default-external-api-0\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.835454    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4t8c\" (UniqueName: \"kubernetes.io/projected/b821c271-d46c-4a68-a6ea-438e616c4d47-kube-api-access-g4t8c\") pod \"glance-default-external-api-0\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.835479    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b821c271-d46c-4a68-a6ea-438e616c4d47-logs\") pod \"glance-default-external-api-0\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.835506    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4458311-d050-4887-b4c7-6df6c993d66e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.835525    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4458311-d050-4887-b4c7-6df6c993d66e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.835552    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b821c271-d46c-4a68-a6ea-438e616c4d47-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.835570    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b821c271-d46c-4a68-a6ea-438e616c4d47-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.835604    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4458311-d050-4887-b4c7-6df6c993d66e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.835628    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.835647    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c4458311-d050-4887-b4c7-6df6c993d66e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.835897    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgxjd\" (UniqueName: \"kubernetes.io/projected/c4458311-d050-4887-b4c7-6df6c993d66e-kube-api-access-jgxjd\") pod \"glance-default-internal-api-0\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.835936    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4458311-d050-4887-b4c7-6df6c993d66e-logs\") pod \"glance-default-internal-api-0\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.835965    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4458311-d050-4887-b4c7-6df6c993d66e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.835983    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.843491    4730 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.848013    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c4458311-d050-4887-b4c7-6df6c993d66e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.848360    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4458311-d050-4887-b4c7-6df6c993d66e-logs\") pod \"glance-default-internal-api-0\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.857220    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4458311-d050-4887-b4c7-6df6c993d66e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.865290    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4458311-d050-4887-b4c7-6df6c993d66e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.870296    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4458311-d050-4887-b4c7-6df6c993d66e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.871535    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4458311-d050-4887-b4c7-6df6c993d66e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.876324    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgxjd\" (UniqueName: \"kubernetes.io/projected/c4458311-d050-4887-b4c7-6df6c993d66e-kube-api-access-jgxjd\") pod \"glance-default-internal-api-0\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.907665    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.914553    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: W0320 16:00:00.940319    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod468631ad_821b_469e_a166_1d32d370e5fa.slice/crio-61de70dac9ca369e87e9de4023e0b6d3ab234abdc4d63ca022c46c3f926b57cd WatchSource:0}: Error finding container 61de70dac9ca369e87e9de4023e0b6d3ab234abdc4d63ca022c46c3f926b57cd: Status 404 returned error can't find the container with id 61de70dac9ca369e87e9de4023e0b6d3ab234abdc4d63ca022c46c3f926b57cd
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.945149    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.945261    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b821c271-d46c-4a68-a6ea-438e616c4d47-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.945304    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b821c271-d46c-4a68-a6ea-438e616c4d47-scripts\") pod \"glance-default-external-api-0\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.945375    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b821c271-d46c-4a68-a6ea-438e616c4d47-config-data\") pod \"glance-default-external-api-0\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.945404    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4t8c\" (UniqueName: \"kubernetes.io/projected/b821c271-d46c-4a68-a6ea-438e616c4d47-kube-api-access-g4t8c\") pod \"glance-default-external-api-0\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.945441    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b821c271-d46c-4a68-a6ea-438e616c4d47-logs\") pod \"glance-default-external-api-0\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.945508    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b821c271-d46c-4a68-a6ea-438e616c4d47-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.945530    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b821c271-d46c-4a68-a6ea-438e616c4d47-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.946164    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b821c271-d46c-4a68-a6ea-438e616c4d47-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.946777    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b821c271-d46c-4a68-a6ea-438e616c4d47-logs\") pod \"glance-default-external-api-0\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.946893    4730 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.961302    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.973326    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b821c271-d46c-4a68-a6ea-438e616c4d47-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.973347    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b821c271-d46c-4a68-a6ea-438e616c4d47-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.973880    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4t8c\" (UniqueName: \"kubernetes.io/projected/b821c271-d46c-4a68-a6ea-438e616c4d47-kube-api-access-g4t8c\") pod \"glance-default-external-api-0\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.976549    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b821c271-d46c-4a68-a6ea-438e616c4d47-scripts\") pod \"glance-default-external-api-0\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:00 crc kubenswrapper[4730]: I0320 16:00:00.994630    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b821c271-d46c-4a68-a6ea-438e616c4d47-config-data\") pod \"glance-default-external-api-0\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:01 crc kubenswrapper[4730]: I0320 16:00:01.013179    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:01 crc kubenswrapper[4730]: I0320 16:00:01.155263    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-z9mtx"]
Mar 20 16:00:01 crc kubenswrapper[4730]: I0320 16:00:01.235827    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"9f653a7b-251e-4eb2-92cd-74e23ac4dba5","Type":"ContainerStarted","Data":"c21d7cc71db24a4080b26c4c7e24c638e9de88158fbe9899bd6305ba181030ef"}
Mar 20 16:00:01 crc kubenswrapper[4730]: I0320 16:00:01.236984    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rmtsq" event={"ID":"e3349c2c-6f29-425d-9d25-b4f23821cfcc","Type":"ContainerStarted","Data":"707ec669900453b1a2293614ef0436d415043c661d842387b86130234173b70a"}
Mar 20 16:00:01 crc kubenswrapper[4730]: I0320 16:00:01.238139    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3f6c808e-d523-48bd-8ec2-28b625834317","Type":"ContainerStarted","Data":"d03f482cee6be89afdaaa20261b8838840560db337ea7e22e3e773497fe55805"}
Mar 20 16:00:01 crc kubenswrapper[4730]: I0320 16:00:01.239577    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-z9mtx" event={"ID":"09f27249-61fb-4e13-9eb9-9b804f256d81","Type":"ContainerStarted","Data":"a168269d42bdf34c043574c3c59fe418fbfcc2023ca3015cc5326a7e3f76f715"}
Mar 20 16:00:01 crc kubenswrapper[4730]: I0320 16:00:01.245429    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-758cdcd5c9-9m42d" event={"ID":"c857c34a-0efc-4ebe-8f42-e562e88de7a4","Type":"ContainerStarted","Data":"4c72e558e23a7c4c418bb2168aa9ecd0cbbacb82442ab392bdc56b6a65616fb6"}
Mar 20 16:00:01 crc kubenswrapper[4730]: I0320 16:00:01.247427    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"468631ad-821b-469e-a166-1d32d370e5fa","Type":"ContainerStarted","Data":"61de70dac9ca369e87e9de4023e0b6d3ab234abdc4d63ca022c46c3f926b57cd"}
Mar 20 16:00:01 crc kubenswrapper[4730]: I0320 16:00:01.349524    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 16:00:01 crc kubenswrapper[4730]: I0320 16:00:01.382579    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-tz6x7"]
Mar 20 16:00:01 crc kubenswrapper[4730]: I0320 16:00:01.425152    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:00:01 crc kubenswrapper[4730]: I0320 16:00:01.442501    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-x2t9r"]
Mar 20 16:00:01 crc kubenswrapper[4730]: W0320 16:00:01.450425    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48fc8af0_e30f_4f3f_88d3_8b054c6359ef.slice/crio-af7beef135dc284222c89d8da5556d80f3f65072f0a1d94b5d3c1cbd5f0ae59a WatchSource:0}: Error finding container af7beef135dc284222c89d8da5556d80f3f65072f0a1d94b5d3c1cbd5f0ae59a: Status 404 returned error can't find the container with id af7beef135dc284222c89d8da5556d80f3f65072f0a1d94b5d3c1cbd5f0ae59a
Mar 20 16:00:01 crc kubenswrapper[4730]: I0320 16:00:01.470596    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-hbplf"]
Mar 20 16:00:01 crc kubenswrapper[4730]: I0320 16:00:01.596727    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77878fc4cf-8s7hs"]
Mar 20 16:00:01 crc kubenswrapper[4730]: I0320 16:00:01.604952    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567040-2zl4f"]
Mar 20 16:00:01 crc kubenswrapper[4730]: W0320 16:00:01.615117    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97b63a10_b572_4a37_a2a4_079852aa2d3d.slice/crio-5160551b06a1e1f1e7b6689c4dec94a675729bc0e9edd00f1bb67e7f9a23750a WatchSource:0}: Error finding container 5160551b06a1e1f1e7b6689c4dec94a675729bc0e9edd00f1bb67e7f9a23750a: Status 404 returned error can't find the container with id 5160551b06a1e1f1e7b6689c4dec94a675729bc0e9edd00f1bb67e7f9a23750a
Mar 20 16:00:01 crc kubenswrapper[4730]: I0320 16:00:01.755364    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567040-cz69b"]
Mar 20 16:00:01 crc kubenswrapper[4730]: I0320 16:00:01.923147    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 16:00:02 crc kubenswrapper[4730]: I0320 16:00:02.111499    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"]
Mar 20 16:00:02 crc kubenswrapper[4730]: I0320 16:00:02.189081    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 16:00:02 crc kubenswrapper[4730]: I0320 16:00:02.316682    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"223c97f9-0680-47b8-bc2e-1c914296d29e","Type":"ContainerStarted","Data":"7f071b5518e739d74a059048c81e33bc96125faedfd609490b9b1dda80135229"}
Mar 20 16:00:02 crc kubenswrapper[4730]: I0320 16:00:02.320839    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-x2t9r" event={"ID":"a05675d7-cd2f-4810-862b-cb0d2d13cbdd","Type":"ContainerStarted","Data":"31cc41440870730294a7b216b0c2b45c7a76296191729b22f502e1990a4511bd"}
Mar 20 16:00:02 crc kubenswrapper[4730]: I0320 16:00:02.324785    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:00:02 crc kubenswrapper[4730]: I0320 16:00:02.331526    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77878fc4cf-8s7hs" event={"ID":"82ffcdbb-cebb-443a-a8af-3c3543bea13d","Type":"ContainerStarted","Data":"7a2cb82ca156020b21392ad78c85cb4ddbdb0874dad3d5fcc0112cae0cde0511"}
Mar 20 16:00:02 crc kubenswrapper[4730]: I0320 16:00:02.345764    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-hbplf" event={"ID":"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd","Type":"ContainerStarted","Data":"ad1075e305b4a94d7393955deeb754baca56955fca19f6133e616c9e84808c7e"}
Mar 20 16:00:02 crc kubenswrapper[4730]: I0320 16:00:02.349500    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c4458311-d050-4887-b4c7-6df6c993d66e","Type":"ContainerStarted","Data":"fe390ee094d1d3c5ad2dd1bacd57dfb2a471207a757c9831b79190ce13256f10"}
Mar 20 16:00:02 crc kubenswrapper[4730]: I0320 16:00:02.353846    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-758cdcd5c9-9m42d" event={"ID":"c857c34a-0efc-4ebe-8f42-e562e88de7a4","Type":"ContainerStarted","Data":"7c04756ee5298735d97672f2fb6129f8d01b04d3f8a464980ca1051cf42e4065"}
Mar 20 16:00:02 crc kubenswrapper[4730]: I0320 16:00:02.380093    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"468631ad-821b-469e-a166-1d32d370e5fa","Type":"ContainerStarted","Data":"f7dc5000437c6d89457d868ca7a318170322b81dfd62eee1c3bc3df85ceba4e5"}
Mar 20 16:00:02 crc kubenswrapper[4730]: I0320 16:00:02.382539    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tz6x7" event={"ID":"48fc8af0-e30f-4f3f-88d3-8b054c6359ef","Type":"ContainerStarted","Data":"af7beef135dc284222c89d8da5556d80f3f65072f0a1d94b5d3c1cbd5f0ae59a"}
Mar 20 16:00:02 crc kubenswrapper[4730]: I0320 16:00:02.411552    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567040-2zl4f" event={"ID":"97b63a10-b572-4a37-a2a4-079852aa2d3d","Type":"ContainerStarted","Data":"5160551b06a1e1f1e7b6689c4dec94a675729bc0e9edd00f1bb67e7f9a23750a"}
Mar 20 16:00:02 crc kubenswrapper[4730]: I0320 16:00:02.428416    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-cz69b" event={"ID":"672cfda1-2ec8-41fe-b3dc-eabe4e60726d","Type":"ContainerStarted","Data":"4bf30bdbb0501e2c053e337b9252dd926d2eb7fdfb0b6f4828ca3c6d25528b11"}
Mar 20 16:00:02 crc kubenswrapper[4730]: I0320 16:00:02.430305    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rmtsq" event={"ID":"e3349c2c-6f29-425d-9d25-b4f23821cfcc","Type":"ContainerStarted","Data":"89ef3de4f8d5002494a53f05fdcc4fa61cfc7cf388b35f48076aa3b98fc5e176"}
Mar 20 16:00:02 crc kubenswrapper[4730]: I0320 16:00:02.433468    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-z9mtx" event={"ID":"09f27249-61fb-4e13-9eb9-9b804f256d81","Type":"ContainerStarted","Data":"3f4c141955a3579b06be021435ce1c3642e2a9b4483a932d05648a4559764229"}
Mar 20 16:00:02 crc kubenswrapper[4730]: I0320 16:00:02.445230    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 16:00:02 crc kubenswrapper[4730]: I0320 16:00:02.515576    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 16:00:03 crc kubenswrapper[4730]: I0320 16:00:03.452585    4730 generic.go:334] "Generic (PLEG): container finished" podID="672cfda1-2ec8-41fe-b3dc-eabe4e60726d" containerID="aa12014b37ee0e01204777f8c797059805894b107ea52ba01e8a5d24299b55a5" exitCode=0
Mar 20 16:00:03 crc kubenswrapper[4730]: I0320 16:00:03.454659    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-cz69b" event={"ID":"672cfda1-2ec8-41fe-b3dc-eabe4e60726d","Type":"ContainerDied","Data":"aa12014b37ee0e01204777f8c797059805894b107ea52ba01e8a5d24299b55a5"}
Mar 20 16:00:03 crc kubenswrapper[4730]: I0320 16:00:03.461256    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b821c271-d46c-4a68-a6ea-438e616c4d47","Type":"ContainerStarted","Data":"20e1876e1a0a25a88aae1c6629f93da770afea8c540f5629af87b68779bdafc7"}
Mar 20 16:00:03 crc kubenswrapper[4730]: I0320 16:00:03.461696    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b821c271-d46c-4a68-a6ea-438e616c4d47","Type":"ContainerStarted","Data":"bbee271175baced0ec1f00dd406517b1f7f205b8c8460fadc05517bb59103028"}
Mar 20 16:00:03 crc kubenswrapper[4730]: I0320 16:00:03.469474    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c4458311-d050-4887-b4c7-6df6c993d66e","Type":"ContainerStarted","Data":"074644c96fd740e9300e0c15a05692e6097bf6107a8f64dc4cfd8d399bd7edb2"}
Mar 20 16:00:03 crc kubenswrapper[4730]: I0320 16:00:03.473334    4730 generic.go:334] "Generic (PLEG): container finished" podID="82ffcdbb-cebb-443a-a8af-3c3543bea13d" containerID="8048a1935689b83c55f9f97ca86b535cb03200a9107add7dbb79c33ad2385a52" exitCode=0
Mar 20 16:00:03 crc kubenswrapper[4730]: I0320 16:00:03.473400    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77878fc4cf-8s7hs" event={"ID":"82ffcdbb-cebb-443a-a8af-3c3543bea13d","Type":"ContainerDied","Data":"8048a1935689b83c55f9f97ca86b535cb03200a9107add7dbb79c33ad2385a52"}
Mar 20 16:00:03 crc kubenswrapper[4730]: I0320 16:00:03.477883    4730 generic.go:334] "Generic (PLEG): container finished" podID="c857c34a-0efc-4ebe-8f42-e562e88de7a4" containerID="7c04756ee5298735d97672f2fb6129f8d01b04d3f8a464980ca1051cf42e4065" exitCode=0
Mar 20 16:00:03 crc kubenswrapper[4730]: I0320 16:00:03.479181    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-758cdcd5c9-9m42d" event={"ID":"c857c34a-0efc-4ebe-8f42-e562e88de7a4","Type":"ContainerDied","Data":"7c04756ee5298735d97672f2fb6129f8d01b04d3f8a464980ca1051cf42e4065"}
Mar 20 16:00:03 crc kubenswrapper[4730]: I0320 16:00:03.532082    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-rmtsq" podStartSLOduration=4.53206494 podStartE2EDuration="4.53206494s" podCreationTimestamp="2026-03-20 15:59:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:00:03.518516237 +0000 UTC m=+1262.731887606" watchObservedRunningTime="2026-03-20 16:00:03.53206494 +0000 UTC m=+1262.745436309"
Mar 20 16:00:03 crc kubenswrapper[4730]: I0320 16:00:03.565685    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-z9mtx" podStartSLOduration=4.565665052 podStartE2EDuration="4.565665052s" podCreationTimestamp="2026-03-20 15:59:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:00:03.561549759 +0000 UTC m=+1262.774921148" watchObservedRunningTime="2026-03-20 16:00:03.565665052 +0000 UTC m=+1262.779036421"
Mar 20 16:00:04 crc kubenswrapper[4730]: I0320 16:00:04.482098    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-758cdcd5c9-9m42d"
Mar 20 16:00:04 crc kubenswrapper[4730]: I0320 16:00:04.493522    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-758cdcd5c9-9m42d"
Mar 20 16:00:04 crc kubenswrapper[4730]: I0320 16:00:04.493612    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-758cdcd5c9-9m42d" event={"ID":"c857c34a-0efc-4ebe-8f42-e562e88de7a4","Type":"ContainerDied","Data":"4c72e558e23a7c4c418bb2168aa9ecd0cbbacb82442ab392bdc56b6a65616fb6"}
Mar 20 16:00:04 crc kubenswrapper[4730]: I0320 16:00:04.493947    4730 scope.go:117] "RemoveContainer" containerID="7c04756ee5298735d97672f2fb6129f8d01b04d3f8a464980ca1051cf42e4065"
Mar 20 16:00:04 crc kubenswrapper[4730]: I0320 16:00:04.540920    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c857c34a-0efc-4ebe-8f42-e562e88de7a4-ovsdbserver-nb\") pod \"c857c34a-0efc-4ebe-8f42-e562e88de7a4\" (UID: \"c857c34a-0efc-4ebe-8f42-e562e88de7a4\") "
Mar 20 16:00:04 crc kubenswrapper[4730]: I0320 16:00:04.540992    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c857c34a-0efc-4ebe-8f42-e562e88de7a4-ovsdbserver-sb\") pod \"c857c34a-0efc-4ebe-8f42-e562e88de7a4\" (UID: \"c857c34a-0efc-4ebe-8f42-e562e88de7a4\") "
Mar 20 16:00:04 crc kubenswrapper[4730]: I0320 16:00:04.541032    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c857c34a-0efc-4ebe-8f42-e562e88de7a4-config\") pod \"c857c34a-0efc-4ebe-8f42-e562e88de7a4\" (UID: \"c857c34a-0efc-4ebe-8f42-e562e88de7a4\") "
Mar 20 16:00:04 crc kubenswrapper[4730]: I0320 16:00:04.541076    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c857c34a-0efc-4ebe-8f42-e562e88de7a4-dns-swift-storage-0\") pod \"c857c34a-0efc-4ebe-8f42-e562e88de7a4\" (UID: \"c857c34a-0efc-4ebe-8f42-e562e88de7a4\") "
Mar 20 16:00:04 crc kubenswrapper[4730]: I0320 16:00:04.541205    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4d8h\" (UniqueName: \"kubernetes.io/projected/c857c34a-0efc-4ebe-8f42-e562e88de7a4-kube-api-access-q4d8h\") pod \"c857c34a-0efc-4ebe-8f42-e562e88de7a4\" (UID: \"c857c34a-0efc-4ebe-8f42-e562e88de7a4\") "
Mar 20 16:00:04 crc kubenswrapper[4730]: I0320 16:00:04.541353    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c857c34a-0efc-4ebe-8f42-e562e88de7a4-dns-svc\") pod \"c857c34a-0efc-4ebe-8f42-e562e88de7a4\" (UID: \"c857c34a-0efc-4ebe-8f42-e562e88de7a4\") "
Mar 20 16:00:04 crc kubenswrapper[4730]: I0320 16:00:04.550398    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c857c34a-0efc-4ebe-8f42-e562e88de7a4-kube-api-access-q4d8h" (OuterVolumeSpecName: "kube-api-access-q4d8h") pod "c857c34a-0efc-4ebe-8f42-e562e88de7a4" (UID: "c857c34a-0efc-4ebe-8f42-e562e88de7a4"). InnerVolumeSpecName "kube-api-access-q4d8h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:00:04 crc kubenswrapper[4730]: I0320 16:00:04.630892    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c857c34a-0efc-4ebe-8f42-e562e88de7a4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c857c34a-0efc-4ebe-8f42-e562e88de7a4" (UID: "c857c34a-0efc-4ebe-8f42-e562e88de7a4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:00:04 crc kubenswrapper[4730]: I0320 16:00:04.635872    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c857c34a-0efc-4ebe-8f42-e562e88de7a4-config" (OuterVolumeSpecName: "config") pod "c857c34a-0efc-4ebe-8f42-e562e88de7a4" (UID: "c857c34a-0efc-4ebe-8f42-e562e88de7a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:00:04 crc kubenswrapper[4730]: I0320 16:00:04.643884    4730 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c857c34a-0efc-4ebe-8f42-e562e88de7a4-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:04 crc kubenswrapper[4730]: I0320 16:00:04.643911    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c857c34a-0efc-4ebe-8f42-e562e88de7a4-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:04 crc kubenswrapper[4730]: I0320 16:00:04.643920    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4d8h\" (UniqueName: \"kubernetes.io/projected/c857c34a-0efc-4ebe-8f42-e562e88de7a4-kube-api-access-q4d8h\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:04 crc kubenswrapper[4730]: I0320 16:00:04.648011    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c857c34a-0efc-4ebe-8f42-e562e88de7a4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c857c34a-0efc-4ebe-8f42-e562e88de7a4" (UID: "c857c34a-0efc-4ebe-8f42-e562e88de7a4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:00:04 crc kubenswrapper[4730]: I0320 16:00:04.682065    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c857c34a-0efc-4ebe-8f42-e562e88de7a4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c857c34a-0efc-4ebe-8f42-e562e88de7a4" (UID: "c857c34a-0efc-4ebe-8f42-e562e88de7a4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:00:04 crc kubenswrapper[4730]: I0320 16:00:04.684190    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c857c34a-0efc-4ebe-8f42-e562e88de7a4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c857c34a-0efc-4ebe-8f42-e562e88de7a4" (UID: "c857c34a-0efc-4ebe-8f42-e562e88de7a4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:00:04 crc kubenswrapper[4730]: I0320 16:00:04.749664    4730 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c857c34a-0efc-4ebe-8f42-e562e88de7a4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:04 crc kubenswrapper[4730]: I0320 16:00:04.749701    4730 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c857c34a-0efc-4ebe-8f42-e562e88de7a4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:04 crc kubenswrapper[4730]: I0320 16:00:04.749731    4730 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c857c34a-0efc-4ebe-8f42-e562e88de7a4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:05 crc kubenswrapper[4730]: I0320 16:00:05.007700    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-cz69b"
Mar 20 16:00:05 crc kubenswrapper[4730]: I0320 16:00:05.069027    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/672cfda1-2ec8-41fe-b3dc-eabe4e60726d-secret-volume\") pod \"672cfda1-2ec8-41fe-b3dc-eabe4e60726d\" (UID: \"672cfda1-2ec8-41fe-b3dc-eabe4e60726d\") "
Mar 20 16:00:05 crc kubenswrapper[4730]: I0320 16:00:05.069257    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxpm7\" (UniqueName: \"kubernetes.io/projected/672cfda1-2ec8-41fe-b3dc-eabe4e60726d-kube-api-access-wxpm7\") pod \"672cfda1-2ec8-41fe-b3dc-eabe4e60726d\" (UID: \"672cfda1-2ec8-41fe-b3dc-eabe4e60726d\") "
Mar 20 16:00:05 crc kubenswrapper[4730]: I0320 16:00:05.069302    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/672cfda1-2ec8-41fe-b3dc-eabe4e60726d-config-volume\") pod \"672cfda1-2ec8-41fe-b3dc-eabe4e60726d\" (UID: \"672cfda1-2ec8-41fe-b3dc-eabe4e60726d\") "
Mar 20 16:00:05 crc kubenswrapper[4730]: I0320 16:00:05.071853    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/672cfda1-2ec8-41fe-b3dc-eabe4e60726d-config-volume" (OuterVolumeSpecName: "config-volume") pod "672cfda1-2ec8-41fe-b3dc-eabe4e60726d" (UID: "672cfda1-2ec8-41fe-b3dc-eabe4e60726d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:00:05 crc kubenswrapper[4730]: I0320 16:00:05.073633    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/672cfda1-2ec8-41fe-b3dc-eabe4e60726d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "672cfda1-2ec8-41fe-b3dc-eabe4e60726d" (UID: "672cfda1-2ec8-41fe-b3dc-eabe4e60726d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:05 crc kubenswrapper[4730]: I0320 16:00:05.100579    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/672cfda1-2ec8-41fe-b3dc-eabe4e60726d-kube-api-access-wxpm7" (OuterVolumeSpecName: "kube-api-access-wxpm7") pod "672cfda1-2ec8-41fe-b3dc-eabe4e60726d" (UID: "672cfda1-2ec8-41fe-b3dc-eabe4e60726d"). InnerVolumeSpecName "kube-api-access-wxpm7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:00:05 crc kubenswrapper[4730]: I0320 16:00:05.129115    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-758cdcd5c9-9m42d"]
Mar 20 16:00:05 crc kubenswrapper[4730]: I0320 16:00:05.150368    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-758cdcd5c9-9m42d"]
Mar 20 16:00:05 crc kubenswrapper[4730]: I0320 16:00:05.176391    4730 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/672cfda1-2ec8-41fe-b3dc-eabe4e60726d-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:05 crc kubenswrapper[4730]: I0320 16:00:05.176425    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxpm7\" (UniqueName: \"kubernetes.io/projected/672cfda1-2ec8-41fe-b3dc-eabe4e60726d-kube-api-access-wxpm7\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:05 crc kubenswrapper[4730]: I0320 16:00:05.176438    4730 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/672cfda1-2ec8-41fe-b3dc-eabe4e60726d-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:05 crc kubenswrapper[4730]: I0320 16:00:05.511896    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77878fc4cf-8s7hs" event={"ID":"82ffcdbb-cebb-443a-a8af-3c3543bea13d","Type":"ContainerStarted","Data":"a42cd9a54cab5432c7c6a61bb8a81e8c16b97e8e15d954234ce03c8ac58b65f1"}
Mar 20 16:00:05 crc kubenswrapper[4730]: I0320 16:00:05.513272    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77878fc4cf-8s7hs"
Mar 20 16:00:05 crc kubenswrapper[4730]: I0320 16:00:05.519173    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"468631ad-821b-469e-a166-1d32d370e5fa","Type":"ContainerStarted","Data":"3f2599acfa566a2a6933c070fb527ae037e48d58578163cf559260ee0ee91126"}
Mar 20 16:00:05 crc kubenswrapper[4730]: I0320 16:00:05.519406    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="468631ad-821b-469e-a166-1d32d370e5fa" containerName="watcher-api-log" containerID="cri-o://f7dc5000437c6d89457d868ca7a318170322b81dfd62eee1c3bc3df85ceba4e5" gracePeriod=30
Mar 20 16:00:05 crc kubenswrapper[4730]: I0320 16:00:05.519874    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="468631ad-821b-469e-a166-1d32d370e5fa" containerName="watcher-api" containerID="cri-o://3f2599acfa566a2a6933c070fb527ae037e48d58578163cf559260ee0ee91126" gracePeriod=30
Mar 20 16:00:05 crc kubenswrapper[4730]: I0320 16:00:05.520014    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Mar 20 16:00:05 crc kubenswrapper[4730]: I0320 16:00:05.521446    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="468631ad-821b-469e-a166-1d32d370e5fa" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.155:9322/\": dial tcp 10.217.0.155:9322: connect: connection refused"
Mar 20 16:00:05 crc kubenswrapper[4730]: I0320 16:00:05.527114    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-cz69b" event={"ID":"672cfda1-2ec8-41fe-b3dc-eabe4e60726d","Type":"ContainerDied","Data":"4bf30bdbb0501e2c053e337b9252dd926d2eb7fdfb0b6f4828ca3c6d25528b11"}
Mar 20 16:00:05 crc kubenswrapper[4730]: I0320 16:00:05.527160    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bf30bdbb0501e2c053e337b9252dd926d2eb7fdfb0b6f4828ca3c6d25528b11"
Mar 20 16:00:05 crc kubenswrapper[4730]: I0320 16:00:05.527212    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-cz69b"
Mar 20 16:00:05 crc kubenswrapper[4730]: I0320 16:00:05.546819    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77878fc4cf-8s7hs" podStartSLOduration=6.546796606 podStartE2EDuration="6.546796606s" podCreationTimestamp="2026-03-20 15:59:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:00:05.531982135 +0000 UTC m=+1264.745353504" watchObservedRunningTime="2026-03-20 16:00:05.546796606 +0000 UTC m=+1264.760167975"
Mar 20 16:00:05 crc kubenswrapper[4730]: I0320 16:00:05.560861    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=6.56084403 podStartE2EDuration="6.56084403s" podCreationTimestamp="2026-03-20 15:59:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:00:05.554984459 +0000 UTC m=+1264.768355828" watchObservedRunningTime="2026-03-20 16:00:05.56084403 +0000 UTC m=+1264.774215399"
Mar 20 16:00:05 crc kubenswrapper[4730]: I0320 16:00:05.571215    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c857c34a-0efc-4ebe-8f42-e562e88de7a4" path="/var/lib/kubelet/pods/c857c34a-0efc-4ebe-8f42-e562e88de7a4/volumes"
Mar 20 16:00:05 crc kubenswrapper[4730]: I0320 16:00:05.573503    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"9f653a7b-251e-4eb2-92cd-74e23ac4dba5","Type":"ContainerStarted","Data":"859fbb7a55a48ffe4a6d03732d3cc6088c3d226367ded637f27bf24936c41dba"}
Mar 20 16:00:05 crc kubenswrapper[4730]: I0320 16:00:05.576770    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3f6c808e-d523-48bd-8ec2-28b625834317","Type":"ContainerStarted","Data":"c057d3a2f6ef1e71f3dc2bc7abb9c07a0dfab9e5b78bf4eb4546276a24c2d109"}
Mar 20 16:00:05 crc kubenswrapper[4730]: I0320 16:00:05.597185    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=2.971820327 podStartE2EDuration="6.597162422s" podCreationTimestamp="2026-03-20 15:59:59 +0000 UTC" firstStartedPulling="2026-03-20 16:00:00.863683595 +0000 UTC m=+1260.077054964" lastFinishedPulling="2026-03-20 16:00:04.48902569 +0000 UTC m=+1263.702397059" observedRunningTime="2026-03-20 16:00:05.590883862 +0000 UTC m=+1264.804255241" watchObservedRunningTime="2026-03-20 16:00:05.597162422 +0000 UTC m=+1264.810533791"
Mar 20 16:00:06 crc kubenswrapper[4730]: I0320 16:00:06.048217    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=3.256383911 podStartE2EDuration="7.048200439s" podCreationTimestamp="2026-03-20 15:59:59 +0000 UTC" firstStartedPulling="2026-03-20 16:00:00.85675141 +0000 UTC m=+1260.070122779" lastFinishedPulling="2026-03-20 16:00:04.648567938 +0000 UTC m=+1263.861939307" observedRunningTime="2026-03-20 16:00:05.619608484 +0000 UTC m=+1264.832979853" watchObservedRunningTime="2026-03-20 16:00:06.048200439 +0000 UTC m=+1265.261571808"
Mar 20 16:00:06 crc kubenswrapper[4730]: I0320 16:00:06.587897    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b821c271-d46c-4a68-a6ea-438e616c4d47","Type":"ContainerStarted","Data":"5b6df27b1ba23b9df05a1fffaaba05b754c2fb2f90da0b4adb5478040787f576"}
Mar 20 16:00:06 crc kubenswrapper[4730]: I0320 16:00:06.588003    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b821c271-d46c-4a68-a6ea-438e616c4d47" containerName="glance-log" containerID="cri-o://20e1876e1a0a25a88aae1c6629f93da770afea8c540f5629af87b68779bdafc7" gracePeriod=30
Mar 20 16:00:06 crc kubenswrapper[4730]: I0320 16:00:06.588039    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b821c271-d46c-4a68-a6ea-438e616c4d47" containerName="glance-httpd" containerID="cri-o://5b6df27b1ba23b9df05a1fffaaba05b754c2fb2f90da0b4adb5478040787f576" gracePeriod=30
Mar 20 16:00:06 crc kubenswrapper[4730]: I0320 16:00:06.592194    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c4458311-d050-4887-b4c7-6df6c993d66e","Type":"ContainerStarted","Data":"9a79a1bc9a5cd84e40618d9d5567e18d723a975ff2aadedb78d41e84461ef193"}
Mar 20 16:00:06 crc kubenswrapper[4730]: I0320 16:00:06.592326    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c4458311-d050-4887-b4c7-6df6c993d66e" containerName="glance-httpd" containerID="cri-o://9a79a1bc9a5cd84e40618d9d5567e18d723a975ff2aadedb78d41e84461ef193" gracePeriod=30
Mar 20 16:00:06 crc kubenswrapper[4730]: I0320 16:00:06.592485    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c4458311-d050-4887-b4c7-6df6c993d66e" containerName="glance-log" containerID="cri-o://074644c96fd740e9300e0c15a05692e6097bf6107a8f64dc4cfd8d399bd7edb2" gracePeriod=30
Mar 20 16:00:06 crc kubenswrapper[4730]: I0320 16:00:06.595454    4730 generic.go:334] "Generic (PLEG): container finished" podID="468631ad-821b-469e-a166-1d32d370e5fa" containerID="f7dc5000437c6d89457d868ca7a318170322b81dfd62eee1c3bc3df85ceba4e5" exitCode=143
Mar 20 16:00:06 crc kubenswrapper[4730]: I0320 16:00:06.595557    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"468631ad-821b-469e-a166-1d32d370e5fa","Type":"ContainerDied","Data":"f7dc5000437c6d89457d868ca7a318170322b81dfd62eee1c3bc3df85ceba4e5"}
Mar 20 16:00:06 crc kubenswrapper[4730]: I0320 16:00:06.618750    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.618733948 podStartE2EDuration="7.618733948s" podCreationTimestamp="2026-03-20 15:59:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:00:06.617217825 +0000 UTC m=+1265.830589194" watchObservedRunningTime="2026-03-20 16:00:06.618733948 +0000 UTC m=+1265.832105317"
Mar 20 16:00:06 crc kubenswrapper[4730]: I0320 16:00:06.651677    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.651659475 podStartE2EDuration="7.651659475s" podCreationTimestamp="2026-03-20 15:59:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:00:06.644409083 +0000 UTC m=+1265.857780452" watchObservedRunningTime="2026-03-20 16:00:06.651659475 +0000 UTC m=+1265.865030834"
Mar 20 16:00:07 crc kubenswrapper[4730]: I0320 16:00:07.607781    4730 generic.go:334] "Generic (PLEG): container finished" podID="c4458311-d050-4887-b4c7-6df6c993d66e" containerID="9a79a1bc9a5cd84e40618d9d5567e18d723a975ff2aadedb78d41e84461ef193" exitCode=0
Mar 20 16:00:07 crc kubenswrapper[4730]: I0320 16:00:07.607828    4730 generic.go:334] "Generic (PLEG): container finished" podID="c4458311-d050-4887-b4c7-6df6c993d66e" containerID="074644c96fd740e9300e0c15a05692e6097bf6107a8f64dc4cfd8d399bd7edb2" exitCode=143
Mar 20 16:00:07 crc kubenswrapper[4730]: I0320 16:00:07.607890    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c4458311-d050-4887-b4c7-6df6c993d66e","Type":"ContainerDied","Data":"9a79a1bc9a5cd84e40618d9d5567e18d723a975ff2aadedb78d41e84461ef193"}
Mar 20 16:00:07 crc kubenswrapper[4730]: I0320 16:00:07.607919    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c4458311-d050-4887-b4c7-6df6c993d66e","Type":"ContainerDied","Data":"074644c96fd740e9300e0c15a05692e6097bf6107a8f64dc4cfd8d399bd7edb2"}
Mar 20 16:00:07 crc kubenswrapper[4730]: I0320 16:00:07.615057    4730 generic.go:334] "Generic (PLEG): container finished" podID="b821c271-d46c-4a68-a6ea-438e616c4d47" containerID="5b6df27b1ba23b9df05a1fffaaba05b754c2fb2f90da0b4adb5478040787f576" exitCode=0
Mar 20 16:00:07 crc kubenswrapper[4730]: I0320 16:00:07.615091    4730 generic.go:334] "Generic (PLEG): container finished" podID="b821c271-d46c-4a68-a6ea-438e616c4d47" containerID="20e1876e1a0a25a88aae1c6629f93da770afea8c540f5629af87b68779bdafc7" exitCode=143
Mar 20 16:00:07 crc kubenswrapper[4730]: I0320 16:00:07.615125    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b821c271-d46c-4a68-a6ea-438e616c4d47","Type":"ContainerDied","Data":"5b6df27b1ba23b9df05a1fffaaba05b754c2fb2f90da0b4adb5478040787f576"}
Mar 20 16:00:07 crc kubenswrapper[4730]: I0320 16:00:07.615162    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b821c271-d46c-4a68-a6ea-438e616c4d47","Type":"ContainerDied","Data":"20e1876e1a0a25a88aae1c6629f93da770afea8c540f5629af87b68779bdafc7"}
Mar 20 16:00:08 crc kubenswrapper[4730]: I0320 16:00:08.624794    4730 generic.go:334] "Generic (PLEG): container finished" podID="e3349c2c-6f29-425d-9d25-b4f23821cfcc" containerID="89ef3de4f8d5002494a53f05fdcc4fa61cfc7cf388b35f48076aa3b98fc5e176" exitCode=0
Mar 20 16:00:08 crc kubenswrapper[4730]: I0320 16:00:08.625138    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rmtsq" event={"ID":"e3349c2c-6f29-425d-9d25-b4f23821cfcc","Type":"ContainerDied","Data":"89ef3de4f8d5002494a53f05fdcc4fa61cfc7cf388b35f48076aa3b98fc5e176"}
Mar 20 16:00:09 crc kubenswrapper[4730]: I0320 16:00:09.636805    4730 generic.go:334] "Generic (PLEG): container finished" podID="3f6c808e-d523-48bd-8ec2-28b625834317" containerID="c057d3a2f6ef1e71f3dc2bc7abb9c07a0dfab9e5b78bf4eb4546276a24c2d109" exitCode=1
Mar 20 16:00:09 crc kubenswrapper[4730]: I0320 16:00:09.637073    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3f6c808e-d523-48bd-8ec2-28b625834317","Type":"ContainerDied","Data":"c057d3a2f6ef1e71f3dc2bc7abb9c07a0dfab9e5b78bf4eb4546276a24c2d109"}
Mar 20 16:00:09 crc kubenswrapper[4730]: I0320 16:00:09.637922    4730 scope.go:117] "RemoveContainer" containerID="c057d3a2f6ef1e71f3dc2bc7abb9c07a0dfab9e5b78bf4eb4546276a24c2d109"
Mar 20 16:00:09 crc kubenswrapper[4730]: I0320 16:00:09.964945    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0"
Mar 20 16:00:09 crc kubenswrapper[4730]: I0320 16:00:09.965222    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0"
Mar 20 16:00:09 crc kubenswrapper[4730]: I0320 16:00:09.974644    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Mar 20 16:00:09 crc kubenswrapper[4730]: I0320 16:00:09.974673    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Mar 20 16:00:10 crc kubenswrapper[4730]: I0320 16:00:10.000154    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0"
Mar 20 16:00:10 crc kubenswrapper[4730]: I0320 16:00:10.272446    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Mar 20 16:00:10 crc kubenswrapper[4730]: I0320 16:00:10.540661    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77878fc4cf-8s7hs"
Mar 20 16:00:10 crc kubenswrapper[4730]: I0320 16:00:10.630002    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59fc649cc7-tct2h"]
Mar 20 16:00:10 crc kubenswrapper[4730]: I0320 16:00:10.631662    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59fc649cc7-tct2h" podUID="37d31419-eada-4b93-bc20-bac232ced058" containerName="dnsmasq-dns" containerID="cri-o://bb5b7167e8f80b19b6b3c7bf7988748aafee8cd2702715c1020def5b6b2b9fb6" gracePeriod=10
Mar 20 16:00:10 crc kubenswrapper[4730]: I0320 16:00:10.715971    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0"
Mar 20 16:00:10 crc kubenswrapper[4730]: I0320 16:00:10.759637    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"]
Mar 20 16:00:11 crc kubenswrapper[4730]: I0320 16:00:11.423102    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-59fc649cc7-tct2h" podUID="37d31419-eada-4b93-bc20-bac232ced058" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.142:5353: connect: connection refused"
Mar 20 16:00:11 crc kubenswrapper[4730]: I0320 16:00:11.670098    4730 generic.go:334] "Generic (PLEG): container finished" podID="37d31419-eada-4b93-bc20-bac232ced058" containerID="bb5b7167e8f80b19b6b3c7bf7988748aafee8cd2702715c1020def5b6b2b9fb6" exitCode=0
Mar 20 16:00:11 crc kubenswrapper[4730]: I0320 16:00:11.670314    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59fc649cc7-tct2h" event={"ID":"37d31419-eada-4b93-bc20-bac232ced058","Type":"ContainerDied","Data":"bb5b7167e8f80b19b6b3c7bf7988748aafee8cd2702715c1020def5b6b2b9fb6"}
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.200846    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.213644    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.330754    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"c4458311-d050-4887-b4c7-6df6c993d66e\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") "
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.330805    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4458311-d050-4887-b4c7-6df6c993d66e-scripts\") pod \"c4458311-d050-4887-b4c7-6df6c993d66e\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") "
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.330835    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b821c271-d46c-4a68-a6ea-438e616c4d47-config-data\") pod \"b821c271-d46c-4a68-a6ea-438e616c4d47\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") "
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.330869    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"b821c271-d46c-4a68-a6ea-438e616c4d47\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") "
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.330899    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b821c271-d46c-4a68-a6ea-438e616c4d47-public-tls-certs\") pod \"b821c271-d46c-4a68-a6ea-438e616c4d47\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") "
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.331010    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgxjd\" (UniqueName: \"kubernetes.io/projected/c4458311-d050-4887-b4c7-6df6c993d66e-kube-api-access-jgxjd\") pod \"c4458311-d050-4887-b4c7-6df6c993d66e\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") "
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.331042    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4t8c\" (UniqueName: \"kubernetes.io/projected/b821c271-d46c-4a68-a6ea-438e616c4d47-kube-api-access-g4t8c\") pod \"b821c271-d46c-4a68-a6ea-438e616c4d47\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") "
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.331132    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c4458311-d050-4887-b4c7-6df6c993d66e-httpd-run\") pod \"c4458311-d050-4887-b4c7-6df6c993d66e\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") "
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.335239    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4458311-d050-4887-b4c7-6df6c993d66e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c4458311-d050-4887-b4c7-6df6c993d66e" (UID: "c4458311-d050-4887-b4c7-6df6c993d66e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.335427    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b821c271-d46c-4a68-a6ea-438e616c4d47-httpd-run\") pod \"b821c271-d46c-4a68-a6ea-438e616c4d47\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") "
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.335516    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b821c271-d46c-4a68-a6ea-438e616c4d47-scripts\") pod \"b821c271-d46c-4a68-a6ea-438e616c4d47\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") "
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.336799    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b821c271-d46c-4a68-a6ea-438e616c4d47-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b821c271-d46c-4a68-a6ea-438e616c4d47" (UID: "b821c271-d46c-4a68-a6ea-438e616c4d47"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.336873    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b821c271-d46c-4a68-a6ea-438e616c4d47-logs\") pod \"b821c271-d46c-4a68-a6ea-438e616c4d47\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") "
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.337319    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4458311-d050-4887-b4c7-6df6c993d66e-scripts" (OuterVolumeSpecName: "scripts") pod "c4458311-d050-4887-b4c7-6df6c993d66e" (UID: "c4458311-d050-4887-b4c7-6df6c993d66e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.338824    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "c4458311-d050-4887-b4c7-6df6c993d66e" (UID: "c4458311-d050-4887-b4c7-6df6c993d66e"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.339341    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4458311-d050-4887-b4c7-6df6c993d66e-logs" (OuterVolumeSpecName: "logs") pod "c4458311-d050-4887-b4c7-6df6c993d66e" (UID: "c4458311-d050-4887-b4c7-6df6c993d66e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.339769    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "b821c271-d46c-4a68-a6ea-438e616c4d47" (UID: "b821c271-d46c-4a68-a6ea-438e616c4d47"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.340230    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b821c271-d46c-4a68-a6ea-438e616c4d47-kube-api-access-g4t8c" (OuterVolumeSpecName: "kube-api-access-g4t8c") pod "b821c271-d46c-4a68-a6ea-438e616c4d47" (UID: "b821c271-d46c-4a68-a6ea-438e616c4d47"). InnerVolumeSpecName "kube-api-access-g4t8c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.342989    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4458311-d050-4887-b4c7-6df6c993d66e-kube-api-access-jgxjd" (OuterVolumeSpecName: "kube-api-access-jgxjd") pod "c4458311-d050-4887-b4c7-6df6c993d66e" (UID: "c4458311-d050-4887-b4c7-6df6c993d66e"). InnerVolumeSpecName "kube-api-access-jgxjd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.344278    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b821c271-d46c-4a68-a6ea-438e616c4d47-logs" (OuterVolumeSpecName: "logs") pod "b821c271-d46c-4a68-a6ea-438e616c4d47" (UID: "b821c271-d46c-4a68-a6ea-438e616c4d47"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.344357    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4458311-d050-4887-b4c7-6df6c993d66e-logs\") pod \"c4458311-d050-4887-b4c7-6df6c993d66e\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") "
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.344428    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4458311-d050-4887-b4c7-6df6c993d66e-combined-ca-bundle\") pod \"c4458311-d050-4887-b4c7-6df6c993d66e\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") "
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.344493    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b821c271-d46c-4a68-a6ea-438e616c4d47-combined-ca-bundle\") pod \"b821c271-d46c-4a68-a6ea-438e616c4d47\" (UID: \"b821c271-d46c-4a68-a6ea-438e616c4d47\") "
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.344527    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4458311-d050-4887-b4c7-6df6c993d66e-config-data\") pod \"c4458311-d050-4887-b4c7-6df6c993d66e\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") "
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.344561    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4458311-d050-4887-b4c7-6df6c993d66e-internal-tls-certs\") pod \"c4458311-d050-4887-b4c7-6df6c993d66e\" (UID: \"c4458311-d050-4887-b4c7-6df6c993d66e\") "
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.345975    4730 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" "
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.345999    4730 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4458311-d050-4887-b4c7-6df6c993d66e-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.346016    4730 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" "
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.346028    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgxjd\" (UniqueName: \"kubernetes.io/projected/c4458311-d050-4887-b4c7-6df6c993d66e-kube-api-access-jgxjd\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.346047    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4t8c\" (UniqueName: \"kubernetes.io/projected/b821c271-d46c-4a68-a6ea-438e616c4d47-kube-api-access-g4t8c\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.346058    4730 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c4458311-d050-4887-b4c7-6df6c993d66e-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.346068    4730 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b821c271-d46c-4a68-a6ea-438e616c4d47-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.346078    4730 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b821c271-d46c-4a68-a6ea-438e616c4d47-logs\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.346088    4730 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4458311-d050-4887-b4c7-6df6c993d66e-logs\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.372610    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b821c271-d46c-4a68-a6ea-438e616c4d47-scripts" (OuterVolumeSpecName: "scripts") pod "b821c271-d46c-4a68-a6ea-438e616c4d47" (UID: "b821c271-d46c-4a68-a6ea-438e616c4d47"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.377676    4730 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.380730    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4458311-d050-4887-b4c7-6df6c993d66e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c4458311-d050-4887-b4c7-6df6c993d66e" (UID: "c4458311-d050-4887-b4c7-6df6c993d66e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.386521    4730 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.414496    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b821c271-d46c-4a68-a6ea-438e616c4d47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b821c271-d46c-4a68-a6ea-438e616c4d47" (UID: "b821c271-d46c-4a68-a6ea-438e616c4d47"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.414612    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4458311-d050-4887-b4c7-6df6c993d66e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c4458311-d050-4887-b4c7-6df6c993d66e" (UID: "c4458311-d050-4887-b4c7-6df6c993d66e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.421631    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4458311-d050-4887-b4c7-6df6c993d66e-config-data" (OuterVolumeSpecName: "config-data") pod "c4458311-d050-4887-b4c7-6df6c993d66e" (UID: "c4458311-d050-4887-b4c7-6df6c993d66e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.428376    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b821c271-d46c-4a68-a6ea-438e616c4d47-config-data" (OuterVolumeSpecName: "config-data") pod "b821c271-d46c-4a68-a6ea-438e616c4d47" (UID: "b821c271-d46c-4a68-a6ea-438e616c4d47"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.436331    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b821c271-d46c-4a68-a6ea-438e616c4d47-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b821c271-d46c-4a68-a6ea-438e616c4d47" (UID: "b821c271-d46c-4a68-a6ea-438e616c4d47"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.447954    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b821c271-d46c-4a68-a6ea-438e616c4d47-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.447995    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4458311-d050-4887-b4c7-6df6c993d66e-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.448008    4730 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4458311-d050-4887-b4c7-6df6c993d66e-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.448021    4730 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.448033    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b821c271-d46c-4a68-a6ea-438e616c4d47-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.448043    4730 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.448055    4730 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b821c271-d46c-4a68-a6ea-438e616c4d47-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.448065    4730 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b821c271-d46c-4a68-a6ea-438e616c4d47-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.448075    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4458311-d050-4887-b4c7-6df6c993d66e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.682168    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c4458311-d050-4887-b4c7-6df6c993d66e","Type":"ContainerDied","Data":"fe390ee094d1d3c5ad2dd1bacd57dfb2a471207a757c9831b79190ce13256f10"}
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.682225    4730 scope.go:117] "RemoveContainer" containerID="9a79a1bc9a5cd84e40618d9d5567e18d723a975ff2aadedb78d41e84461ef193"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.682346    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.685303    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-applier-0" podUID="9f653a7b-251e-4eb2-92cd-74e23ac4dba5" containerName="watcher-applier" containerID="cri-o://859fbb7a55a48ffe4a6d03732d3cc6088c3d226367ded637f27bf24936c41dba" gracePeriod=30
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.685427    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.686201    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b821c271-d46c-4a68-a6ea-438e616c4d47","Type":"ContainerDied","Data":"bbee271175baced0ec1f00dd406517b1f7f205b8c8460fadc05517bb59103028"}
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.734924    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.759112    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.781765    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.796988    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.816156    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 16:00:12 crc kubenswrapper[4730]: E0320 16:00:12.817774    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4458311-d050-4887-b4c7-6df6c993d66e" containerName="glance-httpd"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.817794    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4458311-d050-4887-b4c7-6df6c993d66e" containerName="glance-httpd"
Mar 20 16:00:12 crc kubenswrapper[4730]: E0320 16:00:12.817812    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="672cfda1-2ec8-41fe-b3dc-eabe4e60726d" containerName="collect-profiles"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.817818    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="672cfda1-2ec8-41fe-b3dc-eabe4e60726d" containerName="collect-profiles"
Mar 20 16:00:12 crc kubenswrapper[4730]: E0320 16:00:12.817830    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b821c271-d46c-4a68-a6ea-438e616c4d47" containerName="glance-log"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.817837    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="b821c271-d46c-4a68-a6ea-438e616c4d47" containerName="glance-log"
Mar 20 16:00:12 crc kubenswrapper[4730]: E0320 16:00:12.817848    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b821c271-d46c-4a68-a6ea-438e616c4d47" containerName="glance-httpd"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.817854    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="b821c271-d46c-4a68-a6ea-438e616c4d47" containerName="glance-httpd"
Mar 20 16:00:12 crc kubenswrapper[4730]: E0320 16:00:12.817868    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c857c34a-0efc-4ebe-8f42-e562e88de7a4" containerName="init"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.817874    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="c857c34a-0efc-4ebe-8f42-e562e88de7a4" containerName="init"
Mar 20 16:00:12 crc kubenswrapper[4730]: E0320 16:00:12.817893    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4458311-d050-4887-b4c7-6df6c993d66e" containerName="glance-log"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.817900    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4458311-d050-4887-b4c7-6df6c993d66e" containerName="glance-log"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.818070    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="c857c34a-0efc-4ebe-8f42-e562e88de7a4" containerName="init"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.818081    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="672cfda1-2ec8-41fe-b3dc-eabe4e60726d" containerName="collect-profiles"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.818092    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4458311-d050-4887-b4c7-6df6c993d66e" containerName="glance-log"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.818101    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="b821c271-d46c-4a68-a6ea-438e616c4d47" containerName="glance-httpd"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.818113    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4458311-d050-4887-b4c7-6df6c993d66e" containerName="glance-httpd"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.818121    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="b821c271-d46c-4a68-a6ea-438e616c4d47" containerName="glance-log"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.818964    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.819048    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.826450    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-b8g88"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.827360    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.827590    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.829112    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.853003    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.857401    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.859642    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.860844    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.887581    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.964034    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82401c9f-f5f9-4bc6-a085-c89d3632493e-scripts\") pod \"glance-default-external-api-0\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.964118    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.964156    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/82401c9f-f5f9-4bc6-a085-c89d3632493e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.964391    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw75x\" (UniqueName: \"kubernetes.io/projected/82401c9f-f5f9-4bc6-a085-c89d3632493e-kube-api-access-pw75x\") pod \"glance-default-external-api-0\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.964474    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82401c9f-f5f9-4bc6-a085-c89d3632493e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.964529    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82401c9f-f5f9-4bc6-a085-c89d3632493e-config-data\") pod \"glance-default-external-api-0\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.964553    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82401c9f-f5f9-4bc6-a085-c89d3632493e-logs\") pod \"glance-default-external-api-0\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.964630    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/82401c9f-f5f9-4bc6-a085-c89d3632493e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.992160    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rmtsq"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.993910    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59fc649cc7-tct2h"
Mar 20 16:00:12 crc kubenswrapper[4730]: I0320 16:00:12.998297    4730 scope.go:117] "RemoveContainer" containerID="074644c96fd740e9300e0c15a05692e6097bf6107a8f64dc4cfd8d399bd7edb2"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.066964    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14cdd4b7-7a81-469f-ae2f-104b054cc583-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.067032    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82401c9f-f5f9-4bc6-a085-c89d3632493e-config-data\") pod \"glance-default-external-api-0\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.067054    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82401c9f-f5f9-4bc6-a085-c89d3632493e-logs\") pod \"glance-default-external-api-0\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.067078    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14cdd4b7-7a81-469f-ae2f-104b054cc583-scripts\") pod \"glance-default-internal-api-0\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.067109    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/82401c9f-f5f9-4bc6-a085-c89d3632493e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.067137    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.067168    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14cdd4b7-7a81-469f-ae2f-104b054cc583-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.067210    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/14cdd4b7-7a81-469f-ae2f-104b054cc583-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.067237    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82401c9f-f5f9-4bc6-a085-c89d3632493e-scripts\") pod \"glance-default-external-api-0\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.067287    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14cdd4b7-7a81-469f-ae2f-104b054cc583-logs\") pod \"glance-default-internal-api-0\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.067315    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.067352    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/82401c9f-f5f9-4bc6-a085-c89d3632493e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.067393    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-566vl\" (UniqueName: \"kubernetes.io/projected/14cdd4b7-7a81-469f-ae2f-104b054cc583-kube-api-access-566vl\") pod \"glance-default-internal-api-0\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.067415    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw75x\" (UniqueName: \"kubernetes.io/projected/82401c9f-f5f9-4bc6-a085-c89d3632493e-kube-api-access-pw75x\") pod \"glance-default-external-api-0\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.067467    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82401c9f-f5f9-4bc6-a085-c89d3632493e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.067498    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14cdd4b7-7a81-469f-ae2f-104b054cc583-config-data\") pod \"glance-default-internal-api-0\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.067920    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82401c9f-f5f9-4bc6-a085-c89d3632493e-logs\") pod \"glance-default-external-api-0\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.068214    4730 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.068323    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/82401c9f-f5f9-4bc6-a085-c89d3632493e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.075301    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82401c9f-f5f9-4bc6-a085-c89d3632493e-config-data\") pod \"glance-default-external-api-0\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.082503    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/82401c9f-f5f9-4bc6-a085-c89d3632493e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.083825    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82401c9f-f5f9-4bc6-a085-c89d3632493e-scripts\") pod \"glance-default-external-api-0\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.088449    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw75x\" (UniqueName: \"kubernetes.io/projected/82401c9f-f5f9-4bc6-a085-c89d3632493e-kube-api-access-pw75x\") pod \"glance-default-external-api-0\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.107016    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82401c9f-f5f9-4bc6-a085-c89d3632493e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.113118    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.123999    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.168838    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37d31419-eada-4b93-bc20-bac232ced058-dns-swift-storage-0\") pod \"37d31419-eada-4b93-bc20-bac232ced058\" (UID: \"37d31419-eada-4b93-bc20-bac232ced058\") "
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.168919    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37d31419-eada-4b93-bc20-bac232ced058-ovsdbserver-nb\") pod \"37d31419-eada-4b93-bc20-bac232ced058\" (UID: \"37d31419-eada-4b93-bc20-bac232ced058\") "
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.168960    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e3349c2c-6f29-425d-9d25-b4f23821cfcc-fernet-keys\") pod \"e3349c2c-6f29-425d-9d25-b4f23821cfcc\" (UID: \"e3349c2c-6f29-425d-9d25-b4f23821cfcc\") "
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.168991    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj56l\" (UniqueName: \"kubernetes.io/projected/e3349c2c-6f29-425d-9d25-b4f23821cfcc-kube-api-access-qj56l\") pod \"e3349c2c-6f29-425d-9d25-b4f23821cfcc\" (UID: \"e3349c2c-6f29-425d-9d25-b4f23821cfcc\") "
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.169048    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x9cc\" (UniqueName: \"kubernetes.io/projected/37d31419-eada-4b93-bc20-bac232ced058-kube-api-access-6x9cc\") pod \"37d31419-eada-4b93-bc20-bac232ced058\" (UID: \"37d31419-eada-4b93-bc20-bac232ced058\") "
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.169080    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3349c2c-6f29-425d-9d25-b4f23821cfcc-scripts\") pod \"e3349c2c-6f29-425d-9d25-b4f23821cfcc\" (UID: \"e3349c2c-6f29-425d-9d25-b4f23821cfcc\") "
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.169126    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37d31419-eada-4b93-bc20-bac232ced058-dns-svc\") pod \"37d31419-eada-4b93-bc20-bac232ced058\" (UID: \"37d31419-eada-4b93-bc20-bac232ced058\") "
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.169164    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3349c2c-6f29-425d-9d25-b4f23821cfcc-config-data\") pod \"e3349c2c-6f29-425d-9d25-b4f23821cfcc\" (UID: \"e3349c2c-6f29-425d-9d25-b4f23821cfcc\") "
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.169211    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3349c2c-6f29-425d-9d25-b4f23821cfcc-combined-ca-bundle\") pod \"e3349c2c-6f29-425d-9d25-b4f23821cfcc\" (UID: \"e3349c2c-6f29-425d-9d25-b4f23821cfcc\") "
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.169258    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e3349c2c-6f29-425d-9d25-b4f23821cfcc-credential-keys\") pod \"e3349c2c-6f29-425d-9d25-b4f23821cfcc\" (UID: \"e3349c2c-6f29-425d-9d25-b4f23821cfcc\") "
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.169374    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37d31419-eada-4b93-bc20-bac232ced058-ovsdbserver-sb\") pod \"37d31419-eada-4b93-bc20-bac232ced058\" (UID: \"37d31419-eada-4b93-bc20-bac232ced058\") "
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.169401    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37d31419-eada-4b93-bc20-bac232ced058-config\") pod \"37d31419-eada-4b93-bc20-bac232ced058\" (UID: \"37d31419-eada-4b93-bc20-bac232ced058\") "
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.169676    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14cdd4b7-7a81-469f-ae2f-104b054cc583-config-data\") pod \"glance-default-internal-api-0\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.169731    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14cdd4b7-7a81-469f-ae2f-104b054cc583-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.169768    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14cdd4b7-7a81-469f-ae2f-104b054cc583-scripts\") pod \"glance-default-internal-api-0\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.169810    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.169847    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14cdd4b7-7a81-469f-ae2f-104b054cc583-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.169893    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/14cdd4b7-7a81-469f-ae2f-104b054cc583-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.169943    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14cdd4b7-7a81-469f-ae2f-104b054cc583-logs\") pod \"glance-default-internal-api-0\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.170009    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-566vl\" (UniqueName: \"kubernetes.io/projected/14cdd4b7-7a81-469f-ae2f-104b054cc583-kube-api-access-566vl\") pod \"glance-default-internal-api-0\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.171404    4730 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.171931    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/14cdd4b7-7a81-469f-ae2f-104b054cc583-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.172241    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14cdd4b7-7a81-469f-ae2f-104b054cc583-logs\") pod \"glance-default-internal-api-0\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.174405    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3349c2c-6f29-425d-9d25-b4f23821cfcc-scripts" (OuterVolumeSpecName: "scripts") pod "e3349c2c-6f29-425d-9d25-b4f23821cfcc" (UID: "e3349c2c-6f29-425d-9d25-b4f23821cfcc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.174572    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.179617    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14cdd4b7-7a81-469f-ae2f-104b054cc583-config-data\") pod \"glance-default-internal-api-0\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.187455    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3349c2c-6f29-425d-9d25-b4f23821cfcc-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e3349c2c-6f29-425d-9d25-b4f23821cfcc" (UID: "e3349c2c-6f29-425d-9d25-b4f23821cfcc"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.188948    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14cdd4b7-7a81-469f-ae2f-104b054cc583-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.189030    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14cdd4b7-7a81-469f-ae2f-104b054cc583-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.189614    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3349c2c-6f29-425d-9d25-b4f23821cfcc-kube-api-access-qj56l" (OuterVolumeSpecName: "kube-api-access-qj56l") pod "e3349c2c-6f29-425d-9d25-b4f23821cfcc" (UID: "e3349c2c-6f29-425d-9d25-b4f23821cfcc"). InnerVolumeSpecName "kube-api-access-qj56l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.190264    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37d31419-eada-4b93-bc20-bac232ced058-kube-api-access-6x9cc" (OuterVolumeSpecName: "kube-api-access-6x9cc") pod "37d31419-eada-4b93-bc20-bac232ced058" (UID: "37d31419-eada-4b93-bc20-bac232ced058"). InnerVolumeSpecName "kube-api-access-6x9cc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.190532    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3349c2c-6f29-425d-9d25-b4f23821cfcc-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e3349c2c-6f29-425d-9d25-b4f23821cfcc" (UID: "e3349c2c-6f29-425d-9d25-b4f23821cfcc"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.191749    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14cdd4b7-7a81-469f-ae2f-104b054cc583-scripts\") pod \"glance-default-internal-api-0\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.205717    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-566vl\" (UniqueName: \"kubernetes.io/projected/14cdd4b7-7a81-469f-ae2f-104b054cc583-kube-api-access-566vl\") pod \"glance-default-internal-api-0\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.213375    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.224667    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3349c2c-6f29-425d-9d25-b4f23821cfcc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3349c2c-6f29-425d-9d25-b4f23821cfcc" (UID: "e3349c2c-6f29-425d-9d25-b4f23821cfcc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.231025    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37d31419-eada-4b93-bc20-bac232ced058-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "37d31419-eada-4b93-bc20-bac232ced058" (UID: "37d31419-eada-4b93-bc20-bac232ced058"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.233230    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37d31419-eada-4b93-bc20-bac232ced058-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "37d31419-eada-4b93-bc20-bac232ced058" (UID: "37d31419-eada-4b93-bc20-bac232ced058"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.242419    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37d31419-eada-4b93-bc20-bac232ced058-config" (OuterVolumeSpecName: "config") pod "37d31419-eada-4b93-bc20-bac232ced058" (UID: "37d31419-eada-4b93-bc20-bac232ced058"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.245540    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3349c2c-6f29-425d-9d25-b4f23821cfcc-config-data" (OuterVolumeSpecName: "config-data") pod "e3349c2c-6f29-425d-9d25-b4f23821cfcc" (UID: "e3349c2c-6f29-425d-9d25-b4f23821cfcc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.252105    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37d31419-eada-4b93-bc20-bac232ced058-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "37d31419-eada-4b93-bc20-bac232ced058" (UID: "37d31419-eada-4b93-bc20-bac232ced058"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.253167    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37d31419-eada-4b93-bc20-bac232ced058-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "37d31419-eada-4b93-bc20-bac232ced058" (UID: "37d31419-eada-4b93-bc20-bac232ced058"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.271670    4730 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37d31419-eada-4b93-bc20-bac232ced058-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.271707    4730 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e3349c2c-6f29-425d-9d25-b4f23821cfcc-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.271716    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qj56l\" (UniqueName: \"kubernetes.io/projected/e3349c2c-6f29-425d-9d25-b4f23821cfcc-kube-api-access-qj56l\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.271725    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x9cc\" (UniqueName: \"kubernetes.io/projected/37d31419-eada-4b93-bc20-bac232ced058-kube-api-access-6x9cc\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.271734    4730 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3349c2c-6f29-425d-9d25-b4f23821cfcc-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.271742    4730 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37d31419-eada-4b93-bc20-bac232ced058-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.271749    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3349c2c-6f29-425d-9d25-b4f23821cfcc-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.271758    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3349c2c-6f29-425d-9d25-b4f23821cfcc-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.271765    4730 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e3349c2c-6f29-425d-9d25-b4f23821cfcc-credential-keys\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.271773    4730 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37d31419-eada-4b93-bc20-bac232ced058-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.271780    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37d31419-eada-4b93-bc20-bac232ced058-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.271790    4730 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37d31419-eada-4b93-bc20-bac232ced058-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.493383    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.543034    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b821c271-d46c-4a68-a6ea-438e616c4d47" path="/var/lib/kubelet/pods/b821c271-d46c-4a68-a6ea-438e616c4d47/volumes"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.544282    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4458311-d050-4887-b4c7-6df6c993d66e" path="/var/lib/kubelet/pods/c4458311-d050-4887-b4c7-6df6c993d66e/volumes"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.696630    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rmtsq"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.697888    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rmtsq" event={"ID":"e3349c2c-6f29-425d-9d25-b4f23821cfcc","Type":"ContainerDied","Data":"707ec669900453b1a2293614ef0436d415043c661d842387b86130234173b70a"}
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.697934    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="707ec669900453b1a2293614ef0436d415043c661d842387b86130234173b70a"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.715816    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59fc649cc7-tct2h" event={"ID":"37d31419-eada-4b93-bc20-bac232ced058","Type":"ContainerDied","Data":"3bd944df856483bb32661f18779d6ead5d56c48a59d438569d6061308ace393b"}
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.715916    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59fc649cc7-tct2h"
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.743396    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59fc649cc7-tct2h"]
Mar 20 16:00:13 crc kubenswrapper[4730]: I0320 16:00:13.756753    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59fc649cc7-tct2h"]
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.083427    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-rmtsq"]
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.092571    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-rmtsq"]
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.183852    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-4kfmn"]
Mar 20 16:00:14 crc kubenswrapper[4730]: E0320 16:00:14.185108    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37d31419-eada-4b93-bc20-bac232ced058" containerName="dnsmasq-dns"
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.185130    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="37d31419-eada-4b93-bc20-bac232ced058" containerName="dnsmasq-dns"
Mar 20 16:00:14 crc kubenswrapper[4730]: E0320 16:00:14.185150    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37d31419-eada-4b93-bc20-bac232ced058" containerName="init"
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.185158    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="37d31419-eada-4b93-bc20-bac232ced058" containerName="init"
Mar 20 16:00:14 crc kubenswrapper[4730]: E0320 16:00:14.185195    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3349c2c-6f29-425d-9d25-b4f23821cfcc" containerName="keystone-bootstrap"
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.185513    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3349c2c-6f29-425d-9d25-b4f23821cfcc" containerName="keystone-bootstrap"
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.185732    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="37d31419-eada-4b93-bc20-bac232ced058" containerName="dnsmasq-dns"
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.185757    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3349c2c-6f29-425d-9d25-b4f23821cfcc" containerName="keystone-bootstrap"
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.188426    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4kfmn"
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.192015    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4kfmn"]
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.192028    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.192378    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pjvk4"
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.192498    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.192572    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.194566    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.291836    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fedef548-ce31-47a2-92fc-911f167635f9-credential-keys\") pod \"keystone-bootstrap-4kfmn\" (UID: \"fedef548-ce31-47a2-92fc-911f167635f9\") " pod="openstack/keystone-bootstrap-4kfmn"
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.292167    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fedef548-ce31-47a2-92fc-911f167635f9-scripts\") pod \"keystone-bootstrap-4kfmn\" (UID: \"fedef548-ce31-47a2-92fc-911f167635f9\") " pod="openstack/keystone-bootstrap-4kfmn"
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.292227    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fedef548-ce31-47a2-92fc-911f167635f9-combined-ca-bundle\") pod \"keystone-bootstrap-4kfmn\" (UID: \"fedef548-ce31-47a2-92fc-911f167635f9\") " pod="openstack/keystone-bootstrap-4kfmn"
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.292378    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fedef548-ce31-47a2-92fc-911f167635f9-config-data\") pod \"keystone-bootstrap-4kfmn\" (UID: \"fedef548-ce31-47a2-92fc-911f167635f9\") " pod="openstack/keystone-bootstrap-4kfmn"
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.292518    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fedef548-ce31-47a2-92fc-911f167635f9-fernet-keys\") pod \"keystone-bootstrap-4kfmn\" (UID: \"fedef548-ce31-47a2-92fc-911f167635f9\") " pod="openstack/keystone-bootstrap-4kfmn"
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.292593    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98kgr\" (UniqueName: \"kubernetes.io/projected/fedef548-ce31-47a2-92fc-911f167635f9-kube-api-access-98kgr\") pod \"keystone-bootstrap-4kfmn\" (UID: \"fedef548-ce31-47a2-92fc-911f167635f9\") " pod="openstack/keystone-bootstrap-4kfmn"
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.394704    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fedef548-ce31-47a2-92fc-911f167635f9-fernet-keys\") pod \"keystone-bootstrap-4kfmn\" (UID: \"fedef548-ce31-47a2-92fc-911f167635f9\") " pod="openstack/keystone-bootstrap-4kfmn"
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.394749    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98kgr\" (UniqueName: \"kubernetes.io/projected/fedef548-ce31-47a2-92fc-911f167635f9-kube-api-access-98kgr\") pod \"keystone-bootstrap-4kfmn\" (UID: \"fedef548-ce31-47a2-92fc-911f167635f9\") " pod="openstack/keystone-bootstrap-4kfmn"
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.394832    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fedef548-ce31-47a2-92fc-911f167635f9-credential-keys\") pod \"keystone-bootstrap-4kfmn\" (UID: \"fedef548-ce31-47a2-92fc-911f167635f9\") " pod="openstack/keystone-bootstrap-4kfmn"
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.394851    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fedef548-ce31-47a2-92fc-911f167635f9-scripts\") pod \"keystone-bootstrap-4kfmn\" (UID: \"fedef548-ce31-47a2-92fc-911f167635f9\") " pod="openstack/keystone-bootstrap-4kfmn"
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.394895    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fedef548-ce31-47a2-92fc-911f167635f9-combined-ca-bundle\") pod \"keystone-bootstrap-4kfmn\" (UID: \"fedef548-ce31-47a2-92fc-911f167635f9\") " pod="openstack/keystone-bootstrap-4kfmn"
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.394940    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fedef548-ce31-47a2-92fc-911f167635f9-config-data\") pod \"keystone-bootstrap-4kfmn\" (UID: \"fedef548-ce31-47a2-92fc-911f167635f9\") " pod="openstack/keystone-bootstrap-4kfmn"
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.402141    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fedef548-ce31-47a2-92fc-911f167635f9-credential-keys\") pod \"keystone-bootstrap-4kfmn\" (UID: \"fedef548-ce31-47a2-92fc-911f167635f9\") " pod="openstack/keystone-bootstrap-4kfmn"
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.403201    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fedef548-ce31-47a2-92fc-911f167635f9-config-data\") pod \"keystone-bootstrap-4kfmn\" (UID: \"fedef548-ce31-47a2-92fc-911f167635f9\") " pod="openstack/keystone-bootstrap-4kfmn"
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.405100    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fedef548-ce31-47a2-92fc-911f167635f9-fernet-keys\") pod \"keystone-bootstrap-4kfmn\" (UID: \"fedef548-ce31-47a2-92fc-911f167635f9\") " pod="openstack/keystone-bootstrap-4kfmn"
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.407912    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fedef548-ce31-47a2-92fc-911f167635f9-combined-ca-bundle\") pod \"keystone-bootstrap-4kfmn\" (UID: \"fedef548-ce31-47a2-92fc-911f167635f9\") " pod="openstack/keystone-bootstrap-4kfmn"
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.408477    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fedef548-ce31-47a2-92fc-911f167635f9-scripts\") pod \"keystone-bootstrap-4kfmn\" (UID: \"fedef548-ce31-47a2-92fc-911f167635f9\") " pod="openstack/keystone-bootstrap-4kfmn"
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.411353    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98kgr\" (UniqueName: \"kubernetes.io/projected/fedef548-ce31-47a2-92fc-911f167635f9-kube-api-access-98kgr\") pod \"keystone-bootstrap-4kfmn\" (UID: \"fedef548-ce31-47a2-92fc-911f167635f9\") " pod="openstack/keystone-bootstrap-4kfmn"
Mar 20 16:00:14 crc kubenswrapper[4730]: I0320 16:00:14.512054    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4kfmn"
Mar 20 16:00:14 crc kubenswrapper[4730]: E0320 16:00:14.966912    4730 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="859fbb7a55a48ffe4a6d03732d3cc6088c3d226367ded637f27bf24936c41dba" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Mar 20 16:00:14 crc kubenswrapper[4730]: E0320 16:00:14.968497    4730 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="859fbb7a55a48ffe4a6d03732d3cc6088c3d226367ded637f27bf24936c41dba" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Mar 20 16:00:14 crc kubenswrapper[4730]: E0320 16:00:14.969778    4730 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="859fbb7a55a48ffe4a6d03732d3cc6088c3d226367ded637f27bf24936c41dba" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Mar 20 16:00:14 crc kubenswrapper[4730]: E0320 16:00:14.969841    4730 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="9f653a7b-251e-4eb2-92cd-74e23ac4dba5" containerName="watcher-applier"
Mar 20 16:00:15 crc kubenswrapper[4730]: I0320 16:00:15.551903    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37d31419-eada-4b93-bc20-bac232ced058" path="/var/lib/kubelet/pods/37d31419-eada-4b93-bc20-bac232ced058/volumes"
Mar 20 16:00:15 crc kubenswrapper[4730]: I0320 16:00:15.553428    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3349c2c-6f29-425d-9d25-b4f23821cfcc" path="/var/lib/kubelet/pods/e3349c2c-6f29-425d-9d25-b4f23821cfcc/volumes"
Mar 20 16:00:17 crc kubenswrapper[4730]: I0320 16:00:17.755461    4730 generic.go:334] "Generic (PLEG): container finished" podID="9f653a7b-251e-4eb2-92cd-74e23ac4dba5" containerID="859fbb7a55a48ffe4a6d03732d3cc6088c3d226367ded637f27bf24936c41dba" exitCode=0
Mar 20 16:00:17 crc kubenswrapper[4730]: I0320 16:00:17.755775    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"9f653a7b-251e-4eb2-92cd-74e23ac4dba5","Type":"ContainerDied","Data":"859fbb7a55a48ffe4a6d03732d3cc6088c3d226367ded637f27bf24936c41dba"}
Mar 20 16:00:19 crc kubenswrapper[4730]: E0320 16:00:19.965364    4730 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 859fbb7a55a48ffe4a6d03732d3cc6088c3d226367ded637f27bf24936c41dba is running failed: container process not found" containerID="859fbb7a55a48ffe4a6d03732d3cc6088c3d226367ded637f27bf24936c41dba" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Mar 20 16:00:19 crc kubenswrapper[4730]: E0320 16:00:19.966111    4730 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 859fbb7a55a48ffe4a6d03732d3cc6088c3d226367ded637f27bf24936c41dba is running failed: container process not found" containerID="859fbb7a55a48ffe4a6d03732d3cc6088c3d226367ded637f27bf24936c41dba" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Mar 20 16:00:19 crc kubenswrapper[4730]: E0320 16:00:19.966802    4730 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 859fbb7a55a48ffe4a6d03732d3cc6088c3d226367ded637f27bf24936c41dba is running failed: container process not found" containerID="859fbb7a55a48ffe4a6d03732d3cc6088c3d226367ded637f27bf24936c41dba" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Mar 20 16:00:19 crc kubenswrapper[4730]: E0320 16:00:19.966843    4730 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 859fbb7a55a48ffe4a6d03732d3cc6088c3d226367ded637f27bf24936c41dba is running failed: container process not found" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="9f653a7b-251e-4eb2-92cd-74e23ac4dba5" containerName="watcher-applier"
Mar 20 16:00:20 crc kubenswrapper[4730]: I0320 16:00:20.699485    4730 scope.go:117] "RemoveContainer" containerID="5b6df27b1ba23b9df05a1fffaaba05b754c2fb2f90da0b4adb5478040787f576"
Mar 20 16:00:21 crc kubenswrapper[4730]: I0320 16:00:21.808668    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"9f653a7b-251e-4eb2-92cd-74e23ac4dba5","Type":"ContainerDied","Data":"c21d7cc71db24a4080b26c4c7e24c638e9de88158fbe9899bd6305ba181030ef"}
Mar 20 16:00:21 crc kubenswrapper[4730]: I0320 16:00:21.808866    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c21d7cc71db24a4080b26c4c7e24c638e9de88158fbe9899bd6305ba181030ef"
Mar 20 16:00:21 crc kubenswrapper[4730]: I0320 16:00:21.893133    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Mar 20 16:00:21 crc kubenswrapper[4730]: I0320 16:00:21.904103    4730 scope.go:117] "RemoveContainer" containerID="20e1876e1a0a25a88aae1c6629f93da770afea8c540f5629af87b68779bdafc7"
Mar 20 16:00:22 crc kubenswrapper[4730]: I0320 16:00:22.037439    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f653a7b-251e-4eb2-92cd-74e23ac4dba5-logs\") pod \"9f653a7b-251e-4eb2-92cd-74e23ac4dba5\" (UID: \"9f653a7b-251e-4eb2-92cd-74e23ac4dba5\") "
Mar 20 16:00:22 crc kubenswrapper[4730]: I0320 16:00:22.037492    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52bmh\" (UniqueName: \"kubernetes.io/projected/9f653a7b-251e-4eb2-92cd-74e23ac4dba5-kube-api-access-52bmh\") pod \"9f653a7b-251e-4eb2-92cd-74e23ac4dba5\" (UID: \"9f653a7b-251e-4eb2-92cd-74e23ac4dba5\") "
Mar 20 16:00:22 crc kubenswrapper[4730]: I0320 16:00:22.037541    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f653a7b-251e-4eb2-92cd-74e23ac4dba5-config-data\") pod \"9f653a7b-251e-4eb2-92cd-74e23ac4dba5\" (UID: \"9f653a7b-251e-4eb2-92cd-74e23ac4dba5\") "
Mar 20 16:00:22 crc kubenswrapper[4730]: I0320 16:00:22.037664    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f653a7b-251e-4eb2-92cd-74e23ac4dba5-combined-ca-bundle\") pod \"9f653a7b-251e-4eb2-92cd-74e23ac4dba5\" (UID: \"9f653a7b-251e-4eb2-92cd-74e23ac4dba5\") "
Mar 20 16:00:22 crc kubenswrapper[4730]: I0320 16:00:22.038976    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f653a7b-251e-4eb2-92cd-74e23ac4dba5-logs" (OuterVolumeSpecName: "logs") pod "9f653a7b-251e-4eb2-92cd-74e23ac4dba5" (UID: "9f653a7b-251e-4eb2-92cd-74e23ac4dba5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:00:22 crc kubenswrapper[4730]: I0320 16:00:22.064510    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f653a7b-251e-4eb2-92cd-74e23ac4dba5-kube-api-access-52bmh" (OuterVolumeSpecName: "kube-api-access-52bmh") pod "9f653a7b-251e-4eb2-92cd-74e23ac4dba5" (UID: "9f653a7b-251e-4eb2-92cd-74e23ac4dba5"). InnerVolumeSpecName "kube-api-access-52bmh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:00:22 crc kubenswrapper[4730]: I0320 16:00:22.096876    4730 scope.go:117] "RemoveContainer" containerID="bb5b7167e8f80b19b6b3c7bf7988748aafee8cd2702715c1020def5b6b2b9fb6"
Mar 20 16:00:22 crc kubenswrapper[4730]: I0320 16:00:22.106449    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f653a7b-251e-4eb2-92cd-74e23ac4dba5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f653a7b-251e-4eb2-92cd-74e23ac4dba5" (UID: "9f653a7b-251e-4eb2-92cd-74e23ac4dba5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:22 crc kubenswrapper[4730]: I0320 16:00:22.140321    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f653a7b-251e-4eb2-92cd-74e23ac4dba5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:22 crc kubenswrapper[4730]: I0320 16:00:22.140353    4730 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f653a7b-251e-4eb2-92cd-74e23ac4dba5-logs\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:22 crc kubenswrapper[4730]: I0320 16:00:22.140363    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52bmh\" (UniqueName: \"kubernetes.io/projected/9f653a7b-251e-4eb2-92cd-74e23ac4dba5-kube-api-access-52bmh\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:22 crc kubenswrapper[4730]: I0320 16:00:22.184105    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f653a7b-251e-4eb2-92cd-74e23ac4dba5-config-data" (OuterVolumeSpecName: "config-data") pod "9f653a7b-251e-4eb2-92cd-74e23ac4dba5" (UID: "9f653a7b-251e-4eb2-92cd-74e23ac4dba5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:22 crc kubenswrapper[4730]: I0320 16:00:22.184909    4730 scope.go:117] "RemoveContainer" containerID="44eee6d0807ce3fdd8bb3db86c31803cf1f646803d40fa88d8701a25be8c2aaa"
Mar 20 16:00:22 crc kubenswrapper[4730]: I0320 16:00:22.242077    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f653a7b-251e-4eb2-92cd-74e23ac4dba5-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:22 crc kubenswrapper[4730]: I0320 16:00:22.599602    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 16:00:22 crc kubenswrapper[4730]: I0320 16:00:22.665954    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4kfmn"]
Mar 20 16:00:22 crc kubenswrapper[4730]: I0320 16:00:22.853963    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-x2t9r" event={"ID":"a05675d7-cd2f-4810-862b-cb0d2d13cbdd","Type":"ContainerStarted","Data":"6f35041c9925accfe452d038ab9d3c1753f640407e6e4a51f0b4d6916cb04e6f"}
Mar 20 16:00:22 crc kubenswrapper[4730]: I0320 16:00:22.857442    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567040-2zl4f" event={"ID":"97b63a10-b572-4a37-a2a4-079852aa2d3d","Type":"ContainerStarted","Data":"347fe11ee7c05acba952c1a21fa83ca176c9f921071221e9dbdf6170682cd003"}
Mar 20 16:00:22 crc kubenswrapper[4730]: I0320 16:00:22.861043    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4kfmn" event={"ID":"fedef548-ce31-47a2-92fc-911f167635f9","Type":"ContainerStarted","Data":"23a1410560f5a07b5a54b48e6e05506048eb34cae5e9bbf86d9f30143ef5d1b0"}
Mar 20 16:00:22 crc kubenswrapper[4730]: I0320 16:00:22.886837    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-x2t9r" podStartSLOduration=12.375509646 podStartE2EDuration="23.886823039s" podCreationTimestamp="2026-03-20 15:59:59 +0000 UTC" firstStartedPulling="2026-03-20 16:00:01.462055347 +0000 UTC m=+1260.675426716" lastFinishedPulling="2026-03-20 16:00:12.97336874 +0000 UTC m=+1272.186740109" observedRunningTime="2026-03-20 16:00:22.870920743 +0000 UTC m=+1282.084292132" watchObservedRunningTime="2026-03-20 16:00:22.886823039 +0000 UTC m=+1282.100194408"
Mar 20 16:00:22 crc kubenswrapper[4730]: I0320 16:00:22.887421    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567040-2zl4f" podStartSLOduration=2.577543313 podStartE2EDuration="22.887415542s" podCreationTimestamp="2026-03-20 16:00:00 +0000 UTC" firstStartedPulling="2026-03-20 16:00:01.625324578 +0000 UTC m=+1260.838695947" lastFinishedPulling="2026-03-20 16:00:21.935196807 +0000 UTC m=+1281.148568176" observedRunningTime="2026-03-20 16:00:22.884698501 +0000 UTC m=+1282.098069870" watchObservedRunningTime="2026-03-20 16:00:22.887415542 +0000 UTC m=+1282.100786911"
Mar 20 16:00:22 crc kubenswrapper[4730]: I0320 16:00:22.895486    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tz6x7" event={"ID":"48fc8af0-e30f-4f3f-88d3-8b054c6359ef","Type":"ContainerStarted","Data":"59a0ed1595de1b0849599bb5a7c10e7cfbb46ad061c13c2ab2d12fc1bc355373"}
Mar 20 16:00:22 crc kubenswrapper[4730]: I0320 16:00:22.900495    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"14cdd4b7-7a81-469f-ae2f-104b054cc583","Type":"ContainerStarted","Data":"656641958e552f93316749e193bcc1be09eafd97508856daa5fede226eb204fa"}
Mar 20 16:00:22 crc kubenswrapper[4730]: I0320 16:00:22.906918    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"223c97f9-0680-47b8-bc2e-1c914296d29e","Type":"ContainerStarted","Data":"d6846337441238e0631dc47666643f2d85b6c9e548d29144f9972ff195d4dc1e"}
Mar 20 16:00:22 crc kubenswrapper[4730]: I0320 16:00:22.912484    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-tz6x7" podStartSLOduration=3.545593208 podStartE2EDuration="23.912446142s" podCreationTimestamp="2026-03-20 15:59:59 +0000 UTC" firstStartedPulling="2026-03-20 16:00:01.460581194 +0000 UTC m=+1260.673952563" lastFinishedPulling="2026-03-20 16:00:21.827434128 +0000 UTC m=+1281.040805497" observedRunningTime="2026-03-20 16:00:22.912286028 +0000 UTC m=+1282.125657417" watchObservedRunningTime="2026-03-20 16:00:22.912446142 +0000 UTC m=+1282.125817511"
Mar 20 16:00:22 crc kubenswrapper[4730]: I0320 16:00:22.928555    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Mar 20 16:00:22 crc kubenswrapper[4730]: I0320 16:00:22.928621    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3f6c808e-d523-48bd-8ec2-28b625834317","Type":"ContainerStarted","Data":"834cc7775c739ec615e04b1c22eba8f136c36cff5d344d3613f2797565551c85"}
Mar 20 16:00:22 crc kubenswrapper[4730]: I0320 16:00:22.985941    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"]
Mar 20 16:00:23 crc kubenswrapper[4730]: I0320 16:00:23.015035    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-applier-0"]
Mar 20 16:00:23 crc kubenswrapper[4730]: I0320 16:00:23.024618    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"]
Mar 20 16:00:23 crc kubenswrapper[4730]: E0320 16:00:23.025475    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f653a7b-251e-4eb2-92cd-74e23ac4dba5" containerName="watcher-applier"
Mar 20 16:00:23 crc kubenswrapper[4730]: I0320 16:00:23.025496    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f653a7b-251e-4eb2-92cd-74e23ac4dba5" containerName="watcher-applier"
Mar 20 16:00:23 crc kubenswrapper[4730]: I0320 16:00:23.025718    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f653a7b-251e-4eb2-92cd-74e23ac4dba5" containerName="watcher-applier"
Mar 20 16:00:23 crc kubenswrapper[4730]: I0320 16:00:23.026863    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Mar 20 16:00:23 crc kubenswrapper[4730]: I0320 16:00:23.030006    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data"
Mar 20 16:00:23 crc kubenswrapper[4730]: I0320 16:00:23.046384    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"]
Mar 20 16:00:23 crc kubenswrapper[4730]: I0320 16:00:23.175108    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0\") " pod="openstack/watcher-applier-0"
Mar 20 16:00:23 crc kubenswrapper[4730]: I0320 16:00:23.175176    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0-config-data\") pod \"watcher-applier-0\" (UID: \"5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0\") " pod="openstack/watcher-applier-0"
Mar 20 16:00:23 crc kubenswrapper[4730]: I0320 16:00:23.175208    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djjz2\" (UniqueName: \"kubernetes.io/projected/5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0-kube-api-access-djjz2\") pod \"watcher-applier-0\" (UID: \"5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0\") " pod="openstack/watcher-applier-0"
Mar 20 16:00:23 crc kubenswrapper[4730]: I0320 16:00:23.175227    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0-logs\") pod \"watcher-applier-0\" (UID: \"5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0\") " pod="openstack/watcher-applier-0"
Mar 20 16:00:23 crc kubenswrapper[4730]: I0320 16:00:23.276820    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0\") " pod="openstack/watcher-applier-0"
Mar 20 16:00:23 crc kubenswrapper[4730]: I0320 16:00:23.276909    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0-config-data\") pod \"watcher-applier-0\" (UID: \"5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0\") " pod="openstack/watcher-applier-0"
Mar 20 16:00:23 crc kubenswrapper[4730]: I0320 16:00:23.276950    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djjz2\" (UniqueName: \"kubernetes.io/projected/5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0-kube-api-access-djjz2\") pod \"watcher-applier-0\" (UID: \"5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0\") " pod="openstack/watcher-applier-0"
Mar 20 16:00:23 crc kubenswrapper[4730]: I0320 16:00:23.276974    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0-logs\") pod \"watcher-applier-0\" (UID: \"5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0\") " pod="openstack/watcher-applier-0"
Mar 20 16:00:23 crc kubenswrapper[4730]: I0320 16:00:23.277588    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0-logs\") pod \"watcher-applier-0\" (UID: \"5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0\") " pod="openstack/watcher-applier-0"
Mar 20 16:00:23 crc kubenswrapper[4730]: I0320 16:00:23.307520    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0\") " pod="openstack/watcher-applier-0"
Mar 20 16:00:23 crc kubenswrapper[4730]: I0320 16:00:23.307581    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0-config-data\") pod \"watcher-applier-0\" (UID: \"5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0\") " pod="openstack/watcher-applier-0"
Mar 20 16:00:23 crc kubenswrapper[4730]: I0320 16:00:23.315417    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djjz2\" (UniqueName: \"kubernetes.io/projected/5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0-kube-api-access-djjz2\") pod \"watcher-applier-0\" (UID: \"5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0\") " pod="openstack/watcher-applier-0"
Mar 20 16:00:23 crc kubenswrapper[4730]: I0320 16:00:23.403385    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Mar 20 16:00:23 crc kubenswrapper[4730]: I0320 16:00:23.523286    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 16:00:23 crc kubenswrapper[4730]: I0320 16:00:23.551557    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f653a7b-251e-4eb2-92cd-74e23ac4dba5" path="/var/lib/kubelet/pods/9f653a7b-251e-4eb2-92cd-74e23ac4dba5/volumes"
Mar 20 16:00:23 crc kubenswrapper[4730]: I0320 16:00:23.951344    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4kfmn" event={"ID":"fedef548-ce31-47a2-92fc-911f167635f9","Type":"ContainerStarted","Data":"2cbf92580c54611c192a57c093a66c2f77a3a73726fc9a21c3aef24b4e922f95"}
Mar 20 16:00:23 crc kubenswrapper[4730]: I0320 16:00:23.953895    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"82401c9f-f5f9-4bc6-a085-c89d3632493e","Type":"ContainerStarted","Data":"81ccba513b601e62e75f217051576c1c723201231f3534205e6403a320c44aa9"}
Mar 20 16:00:23 crc kubenswrapper[4730]: I0320 16:00:23.957405    4730 generic.go:334] "Generic (PLEG): container finished" podID="97b63a10-b572-4a37-a2a4-079852aa2d3d" containerID="347fe11ee7c05acba952c1a21fa83ca176c9f921071221e9dbdf6170682cd003" exitCode=0
Mar 20 16:00:23 crc kubenswrapper[4730]: I0320 16:00:23.957573    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567040-2zl4f" event={"ID":"97b63a10-b572-4a37-a2a4-079852aa2d3d","Type":"ContainerDied","Data":"347fe11ee7c05acba952c1a21fa83ca176c9f921071221e9dbdf6170682cd003"}
Mar 20 16:00:23 crc kubenswrapper[4730]: I0320 16:00:23.959571    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"14cdd4b7-7a81-469f-ae2f-104b054cc583","Type":"ContainerStarted","Data":"1eef3a4c57a302c49282046a25ce9b6b686742d1068d8539a0c4f898222c31dc"}
Mar 20 16:00:23 crc kubenswrapper[4730]: I0320 16:00:23.961348    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-hbplf" event={"ID":"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd","Type":"ContainerStarted","Data":"1763f714611816ce76b822616e2726ee2af2ec1d061896faecc0edc07186595f"}
Mar 20 16:00:23 crc kubenswrapper[4730]: I0320 16:00:23.972159    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-4kfmn" podStartSLOduration=9.972139 podStartE2EDuration="9.972139s" podCreationTimestamp="2026-03-20 16:00:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:00:23.969576513 +0000 UTC m=+1283.182947902" watchObservedRunningTime="2026-03-20 16:00:23.972139 +0000 UTC m=+1283.185510369"
Mar 20 16:00:23 crc kubenswrapper[4730]: I0320 16:00:23.992800    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-hbplf" podStartSLOduration=4.429086767 podStartE2EDuration="24.992782942s" podCreationTimestamp="2026-03-20 15:59:59 +0000 UTC" firstStartedPulling="2026-03-20 16:00:01.479049247 +0000 UTC m=+1260.692420616" lastFinishedPulling="2026-03-20 16:00:22.042745422 +0000 UTC m=+1281.256116791" observedRunningTime="2026-03-20 16:00:23.988700501 +0000 UTC m=+1283.202071870" watchObservedRunningTime="2026-03-20 16:00:23.992782942 +0000 UTC m=+1283.206154311"
Mar 20 16:00:24 crc kubenswrapper[4730]: I0320 16:00:24.100856    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"]
Mar 20 16:00:24 crc kubenswrapper[4730]: W0320 16:00:24.170483    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bd7fdc7_f9da_4f44_98d3_7b86f541b9f0.slice/crio-505e09709aeaef8a87d638031cf49a2f3e2fbdc9ec33e31ca6d7f0d3b9378532 WatchSource:0}: Error finding container 505e09709aeaef8a87d638031cf49a2f3e2fbdc9ec33e31ca6d7f0d3b9378532: Status 404 returned error can't find the container with id 505e09709aeaef8a87d638031cf49a2f3e2fbdc9ec33e31ca6d7f0d3b9378532
Mar 20 16:00:25 crc kubenswrapper[4730]: I0320 16:00:25.003626    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"82401c9f-f5f9-4bc6-a085-c89d3632493e","Type":"ContainerStarted","Data":"d342f4f373c3a46fb291e5160cd15525a5be1018e68f010945cfba9de11fd3fe"}
Mar 20 16:00:25 crc kubenswrapper[4730]: I0320 16:00:25.007040    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0","Type":"ContainerStarted","Data":"06e4300ad2aa0cb045945f92073482ca78016e812eb685af49fb195ce150c681"}
Mar 20 16:00:25 crc kubenswrapper[4730]: I0320 16:00:25.007077    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0","Type":"ContainerStarted","Data":"505e09709aeaef8a87d638031cf49a2f3e2fbdc9ec33e31ca6d7f0d3b9378532"}
Mar 20 16:00:25 crc kubenswrapper[4730]: I0320 16:00:25.018090    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"14cdd4b7-7a81-469f-ae2f-104b054cc583","Type":"ContainerStarted","Data":"8f1291a8157b5fac6e0adc6431d157cb584ae26777485369c5a717b5d22da62c"}
Mar 20 16:00:25 crc kubenswrapper[4730]: I0320 16:00:25.033996    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"223c97f9-0680-47b8-bc2e-1c914296d29e","Type":"ContainerStarted","Data":"f29767ca3e9d6cdef0508a609251241ac12e93823a37b6304da4f130070ee420"}
Mar 20 16:00:25 crc kubenswrapper[4730]: I0320 16:00:25.036457    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=3.036410681 podStartE2EDuration="3.036410681s" podCreationTimestamp="2026-03-20 16:00:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:00:25.030574911 +0000 UTC m=+1284.243946290" watchObservedRunningTime="2026-03-20 16:00:25.036410681 +0000 UTC m=+1284.249782060"
Mar 20 16:00:25 crc kubenswrapper[4730]: I0320 16:00:25.064752    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=13.064730975 podStartE2EDuration="13.064730975s" podCreationTimestamp="2026-03-20 16:00:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:00:25.060294535 +0000 UTC m=+1284.273665924" watchObservedRunningTime="2026-03-20 16:00:25.064730975 +0000 UTC m=+1284.278102344"
Mar 20 16:00:25 crc kubenswrapper[4730]: I0320 16:00:25.324777    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567040-2zl4f"
Mar 20 16:00:25 crc kubenswrapper[4730]: I0320 16:00:25.425457    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-459cr\" (UniqueName: \"kubernetes.io/projected/97b63a10-b572-4a37-a2a4-079852aa2d3d-kube-api-access-459cr\") pod \"97b63a10-b572-4a37-a2a4-079852aa2d3d\" (UID: \"97b63a10-b572-4a37-a2a4-079852aa2d3d\") "
Mar 20 16:00:25 crc kubenswrapper[4730]: I0320 16:00:25.435477    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97b63a10-b572-4a37-a2a4-079852aa2d3d-kube-api-access-459cr" (OuterVolumeSpecName: "kube-api-access-459cr") pod "97b63a10-b572-4a37-a2a4-079852aa2d3d" (UID: "97b63a10-b572-4a37-a2a4-079852aa2d3d"). InnerVolumeSpecName "kube-api-access-459cr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:00:25 crc kubenswrapper[4730]: I0320 16:00:25.528309    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-459cr\" (UniqueName: \"kubernetes.io/projected/97b63a10-b572-4a37-a2a4-079852aa2d3d-kube-api-access-459cr\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:25 crc kubenswrapper[4730]: I0320 16:00:25.957908    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567034-sdvfb"]
Mar 20 16:00:25 crc kubenswrapper[4730]: I0320 16:00:25.966151    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567034-sdvfb"]
Mar 20 16:00:26 crc kubenswrapper[4730]: I0320 16:00:26.049435    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567040-2zl4f"
Mar 20 16:00:26 crc kubenswrapper[4730]: I0320 16:00:26.049476    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567040-2zl4f" event={"ID":"97b63a10-b572-4a37-a2a4-079852aa2d3d","Type":"ContainerDied","Data":"5160551b06a1e1f1e7b6689c4dec94a675729bc0e9edd00f1bb67e7f9a23750a"}
Mar 20 16:00:26 crc kubenswrapper[4730]: I0320 16:00:26.049519    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5160551b06a1e1f1e7b6689c4dec94a675729bc0e9edd00f1bb67e7f9a23750a"
Mar 20 16:00:27 crc kubenswrapper[4730]: I0320 16:00:27.548663    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c84a0097-0ea0-4397-b72b-07e391268b84" path="/var/lib/kubelet/pods/c84a0097-0ea0-4397-b72b-07e391268b84/volumes"
Mar 20 16:00:28 crc kubenswrapper[4730]: I0320 16:00:28.403640    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0"
Mar 20 16:00:29 crc kubenswrapper[4730]: I0320 16:00:29.078302    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"82401c9f-f5f9-4bc6-a085-c89d3632493e","Type":"ContainerStarted","Data":"68b0b2752749a64e7ce292cfa6aabcc6400dcc4552e165b090d093ce63fe5a35"}
Mar 20 16:00:29 crc kubenswrapper[4730]: I0320 16:00:29.101640    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=17.101608592 podStartE2EDuration="17.101608592s" podCreationTimestamp="2026-03-20 16:00:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:00:29.100663001 +0000 UTC m=+1288.314034390" watchObservedRunningTime="2026-03-20 16:00:29.101608592 +0000 UTC m=+1288.314979961"
Mar 20 16:00:29 crc kubenswrapper[4730]: I0320 16:00:29.974531    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Mar 20 16:00:29 crc kubenswrapper[4730]: I0320 16:00:29.974835    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0"
Mar 20 16:00:29 crc kubenswrapper[4730]: E0320 16:00:29.976155    4730 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 834cc7775c739ec615e04b1c22eba8f136c36cff5d344d3613f2797565551c85 is running failed: container process not found" containerID="834cc7775c739ec615e04b1c22eba8f136c36cff5d344d3613f2797565551c85" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"]
Mar 20 16:00:29 crc kubenswrapper[4730]: E0320 16:00:29.976530    4730 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 834cc7775c739ec615e04b1c22eba8f136c36cff5d344d3613f2797565551c85 is running failed: container process not found" containerID="834cc7775c739ec615e04b1c22eba8f136c36cff5d344d3613f2797565551c85" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"]
Mar 20 16:00:29 crc kubenswrapper[4730]: E0320 16:00:29.976711    4730 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 834cc7775c739ec615e04b1c22eba8f136c36cff5d344d3613f2797565551c85 is running failed: container process not found" containerID="834cc7775c739ec615e04b1c22eba8f136c36cff5d344d3613f2797565551c85" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"]
Mar 20 16:00:29 crc kubenswrapper[4730]: E0320 16:00:29.976736    4730 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 834cc7775c739ec615e04b1c22eba8f136c36cff5d344d3613f2797565551c85 is running failed: container process not found" probeType="Startup" pod="openstack/watcher-decision-engine-0" podUID="3f6c808e-d523-48bd-8ec2-28b625834317" containerName="watcher-decision-engine"
Mar 20 16:00:30 crc kubenswrapper[4730]: I0320 16:00:30.087962    4730 generic.go:334] "Generic (PLEG): container finished" podID="fedef548-ce31-47a2-92fc-911f167635f9" containerID="2cbf92580c54611c192a57c093a66c2f77a3a73726fc9a21c3aef24b4e922f95" exitCode=0
Mar 20 16:00:30 crc kubenswrapper[4730]: I0320 16:00:30.088039    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4kfmn" event={"ID":"fedef548-ce31-47a2-92fc-911f167635f9","Type":"ContainerDied","Data":"2cbf92580c54611c192a57c093a66c2f77a3a73726fc9a21c3aef24b4e922f95"}
Mar 20 16:00:30 crc kubenswrapper[4730]: I0320 16:00:30.090895    4730 generic.go:334] "Generic (PLEG): container finished" podID="3f6c808e-d523-48bd-8ec2-28b625834317" containerID="834cc7775c739ec615e04b1c22eba8f136c36cff5d344d3613f2797565551c85" exitCode=1
Mar 20 16:00:30 crc kubenswrapper[4730]: I0320 16:00:30.090926    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3f6c808e-d523-48bd-8ec2-28b625834317","Type":"ContainerDied","Data":"834cc7775c739ec615e04b1c22eba8f136c36cff5d344d3613f2797565551c85"}
Mar 20 16:00:30 crc kubenswrapper[4730]: I0320 16:00:30.090997    4730 scope.go:117] "RemoveContainer" containerID="c057d3a2f6ef1e71f3dc2bc7abb9c07a0dfab9e5b78bf4eb4546276a24c2d109"
Mar 20 16:00:30 crc kubenswrapper[4730]: I0320 16:00:30.091687    4730 scope.go:117] "RemoveContainer" containerID="834cc7775c739ec615e04b1c22eba8f136c36cff5d344d3613f2797565551c85"
Mar 20 16:00:30 crc kubenswrapper[4730]: E0320 16:00:30.091928    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(3f6c808e-d523-48bd-8ec2-28b625834317)\"" pod="openstack/watcher-decision-engine-0" podUID="3f6c808e-d523-48bd-8ec2-28b625834317"
Mar 20 16:00:31 crc kubenswrapper[4730]: I0320 16:00:31.453276    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4kfmn"
Mar 20 16:00:31 crc kubenswrapper[4730]: I0320 16:00:31.540510    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98kgr\" (UniqueName: \"kubernetes.io/projected/fedef548-ce31-47a2-92fc-911f167635f9-kube-api-access-98kgr\") pod \"fedef548-ce31-47a2-92fc-911f167635f9\" (UID: \"fedef548-ce31-47a2-92fc-911f167635f9\") "
Mar 20 16:00:31 crc kubenswrapper[4730]: I0320 16:00:31.540628    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fedef548-ce31-47a2-92fc-911f167635f9-combined-ca-bundle\") pod \"fedef548-ce31-47a2-92fc-911f167635f9\" (UID: \"fedef548-ce31-47a2-92fc-911f167635f9\") "
Mar 20 16:00:31 crc kubenswrapper[4730]: I0320 16:00:31.540658    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fedef548-ce31-47a2-92fc-911f167635f9-fernet-keys\") pod \"fedef548-ce31-47a2-92fc-911f167635f9\" (UID: \"fedef548-ce31-47a2-92fc-911f167635f9\") "
Mar 20 16:00:31 crc kubenswrapper[4730]: I0320 16:00:31.540753    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fedef548-ce31-47a2-92fc-911f167635f9-config-data\") pod \"fedef548-ce31-47a2-92fc-911f167635f9\" (UID: \"fedef548-ce31-47a2-92fc-911f167635f9\") "
Mar 20 16:00:31 crc kubenswrapper[4730]: I0320 16:00:31.540935    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fedef548-ce31-47a2-92fc-911f167635f9-scripts\") pod \"fedef548-ce31-47a2-92fc-911f167635f9\" (UID: \"fedef548-ce31-47a2-92fc-911f167635f9\") "
Mar 20 16:00:31 crc kubenswrapper[4730]: I0320 16:00:31.540974    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fedef548-ce31-47a2-92fc-911f167635f9-credential-keys\") pod \"fedef548-ce31-47a2-92fc-911f167635f9\" (UID: \"fedef548-ce31-47a2-92fc-911f167635f9\") "
Mar 20 16:00:31 crc kubenswrapper[4730]: I0320 16:00:31.546077    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fedef548-ce31-47a2-92fc-911f167635f9-scripts" (OuterVolumeSpecName: "scripts") pod "fedef548-ce31-47a2-92fc-911f167635f9" (UID: "fedef548-ce31-47a2-92fc-911f167635f9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:31 crc kubenswrapper[4730]: I0320 16:00:31.546480    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fedef548-ce31-47a2-92fc-911f167635f9-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "fedef548-ce31-47a2-92fc-911f167635f9" (UID: "fedef548-ce31-47a2-92fc-911f167635f9"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:31 crc kubenswrapper[4730]: I0320 16:00:31.546732    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fedef548-ce31-47a2-92fc-911f167635f9-kube-api-access-98kgr" (OuterVolumeSpecName: "kube-api-access-98kgr") pod "fedef548-ce31-47a2-92fc-911f167635f9" (UID: "fedef548-ce31-47a2-92fc-911f167635f9"). InnerVolumeSpecName "kube-api-access-98kgr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:00:31 crc kubenswrapper[4730]: I0320 16:00:31.548321    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fedef548-ce31-47a2-92fc-911f167635f9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "fedef548-ce31-47a2-92fc-911f167635f9" (UID: "fedef548-ce31-47a2-92fc-911f167635f9"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:31 crc kubenswrapper[4730]: I0320 16:00:31.567707    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fedef548-ce31-47a2-92fc-911f167635f9-config-data" (OuterVolumeSpecName: "config-data") pod "fedef548-ce31-47a2-92fc-911f167635f9" (UID: "fedef548-ce31-47a2-92fc-911f167635f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:31 crc kubenswrapper[4730]: I0320 16:00:31.569893    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fedef548-ce31-47a2-92fc-911f167635f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fedef548-ce31-47a2-92fc-911f167635f9" (UID: "fedef548-ce31-47a2-92fc-911f167635f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:31 crc kubenswrapper[4730]: I0320 16:00:31.643986    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fedef548-ce31-47a2-92fc-911f167635f9-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:31 crc kubenswrapper[4730]: I0320 16:00:31.644012    4730 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fedef548-ce31-47a2-92fc-911f167635f9-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:31 crc kubenswrapper[4730]: I0320 16:00:31.644022    4730 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fedef548-ce31-47a2-92fc-911f167635f9-credential-keys\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:31 crc kubenswrapper[4730]: I0320 16:00:31.644034    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98kgr\" (UniqueName: \"kubernetes.io/projected/fedef548-ce31-47a2-92fc-911f167635f9-kube-api-access-98kgr\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:31 crc kubenswrapper[4730]: I0320 16:00:31.644043    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fedef548-ce31-47a2-92fc-911f167635f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:31 crc kubenswrapper[4730]: I0320 16:00:31.644051    4730 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fedef548-ce31-47a2-92fc-911f167635f9-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.114870    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4kfmn" event={"ID":"fedef548-ce31-47a2-92fc-911f167635f9","Type":"ContainerDied","Data":"23a1410560f5a07b5a54b48e6e05506048eb34cae5e9bbf86d9f30143ef5d1b0"}
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.114910    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23a1410560f5a07b5a54b48e6e05506048eb34cae5e9bbf86d9f30143ef5d1b0"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.114884    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4kfmn"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.117822    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"223c97f9-0680-47b8-bc2e-1c914296d29e","Type":"ContainerStarted","Data":"71df4b6d20c02608180532f04e26895e3f9ad0248e3128f02a3d2c457f5baa48"}
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.119603    4730 generic.go:334] "Generic (PLEG): container finished" podID="a05675d7-cd2f-4810-862b-cb0d2d13cbdd" containerID="6f35041c9925accfe452d038ab9d3c1753f640407e6e4a51f0b4d6916cb04e6f" exitCode=0
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.119660    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-x2t9r" event={"ID":"a05675d7-cd2f-4810-862b-cb0d2d13cbdd","Type":"ContainerDied","Data":"6f35041c9925accfe452d038ab9d3c1753f640407e6e4a51f0b4d6916cb04e6f"}
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.246759    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6fb7949f77-2l9t7"]
Mar 20 16:00:32 crc kubenswrapper[4730]: E0320 16:00:32.247258    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fedef548-ce31-47a2-92fc-911f167635f9" containerName="keystone-bootstrap"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.247278    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="fedef548-ce31-47a2-92fc-911f167635f9" containerName="keystone-bootstrap"
Mar 20 16:00:32 crc kubenswrapper[4730]: E0320 16:00:32.247305    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97b63a10-b572-4a37-a2a4-079852aa2d3d" containerName="oc"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.247313    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="97b63a10-b572-4a37-a2a4-079852aa2d3d" containerName="oc"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.247580    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="fedef548-ce31-47a2-92fc-911f167635f9" containerName="keystone-bootstrap"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.247605    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="97b63a10-b572-4a37-a2a4-079852aa2d3d" containerName="oc"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.248344    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6fb7949f77-2l9t7"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.255106    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.255341    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.255366    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.255552    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.255700    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pjvk4"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.255549    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.270153    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6fb7949f77-2l9t7"]
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.360268    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d-config-data\") pod \"keystone-6fb7949f77-2l9t7\" (UID: \"e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d\") " pod="openstack/keystone-6fb7949f77-2l9t7"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.360318    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d-fernet-keys\") pod \"keystone-6fb7949f77-2l9t7\" (UID: \"e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d\") " pod="openstack/keystone-6fb7949f77-2l9t7"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.360367    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d-public-tls-certs\") pod \"keystone-6fb7949f77-2l9t7\" (UID: \"e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d\") " pod="openstack/keystone-6fb7949f77-2l9t7"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.360409    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d-scripts\") pod \"keystone-6fb7949f77-2l9t7\" (UID: \"e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d\") " pod="openstack/keystone-6fb7949f77-2l9t7"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.360447    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d-internal-tls-certs\") pod \"keystone-6fb7949f77-2l9t7\" (UID: \"e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d\") " pod="openstack/keystone-6fb7949f77-2l9t7"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.360482    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d-credential-keys\") pod \"keystone-6fb7949f77-2l9t7\" (UID: \"e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d\") " pod="openstack/keystone-6fb7949f77-2l9t7"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.360515    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz5lz\" (UniqueName: \"kubernetes.io/projected/e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d-kube-api-access-mz5lz\") pod \"keystone-6fb7949f77-2l9t7\" (UID: \"e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d\") " pod="openstack/keystone-6fb7949f77-2l9t7"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.360540    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d-combined-ca-bundle\") pod \"keystone-6fb7949f77-2l9t7\" (UID: \"e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d\") " pod="openstack/keystone-6fb7949f77-2l9t7"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.461995    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d-combined-ca-bundle\") pod \"keystone-6fb7949f77-2l9t7\" (UID: \"e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d\") " pod="openstack/keystone-6fb7949f77-2l9t7"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.462463    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d-config-data\") pod \"keystone-6fb7949f77-2l9t7\" (UID: \"e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d\") " pod="openstack/keystone-6fb7949f77-2l9t7"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.462499    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d-fernet-keys\") pod \"keystone-6fb7949f77-2l9t7\" (UID: \"e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d\") " pod="openstack/keystone-6fb7949f77-2l9t7"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.462573    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d-public-tls-certs\") pod \"keystone-6fb7949f77-2l9t7\" (UID: \"e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d\") " pod="openstack/keystone-6fb7949f77-2l9t7"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.462631    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d-scripts\") pod \"keystone-6fb7949f77-2l9t7\" (UID: \"e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d\") " pod="openstack/keystone-6fb7949f77-2l9t7"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.462689    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d-internal-tls-certs\") pod \"keystone-6fb7949f77-2l9t7\" (UID: \"e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d\") " pod="openstack/keystone-6fb7949f77-2l9t7"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.462738    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d-credential-keys\") pod \"keystone-6fb7949f77-2l9t7\" (UID: \"e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d\") " pod="openstack/keystone-6fb7949f77-2l9t7"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.462807    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz5lz\" (UniqueName: \"kubernetes.io/projected/e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d-kube-api-access-mz5lz\") pod \"keystone-6fb7949f77-2l9t7\" (UID: \"e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d\") " pod="openstack/keystone-6fb7949f77-2l9t7"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.466332    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d-config-data\") pod \"keystone-6fb7949f77-2l9t7\" (UID: \"e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d\") " pod="openstack/keystone-6fb7949f77-2l9t7"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.466848    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d-public-tls-certs\") pod \"keystone-6fb7949f77-2l9t7\" (UID: \"e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d\") " pod="openstack/keystone-6fb7949f77-2l9t7"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.467003    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d-scripts\") pod \"keystone-6fb7949f77-2l9t7\" (UID: \"e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d\") " pod="openstack/keystone-6fb7949f77-2l9t7"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.467113    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d-fernet-keys\") pod \"keystone-6fb7949f77-2l9t7\" (UID: \"e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d\") " pod="openstack/keystone-6fb7949f77-2l9t7"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.467168    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d-credential-keys\") pod \"keystone-6fb7949f77-2l9t7\" (UID: \"e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d\") " pod="openstack/keystone-6fb7949f77-2l9t7"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.467527    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d-combined-ca-bundle\") pod \"keystone-6fb7949f77-2l9t7\" (UID: \"e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d\") " pod="openstack/keystone-6fb7949f77-2l9t7"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.478702    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d-internal-tls-certs\") pod \"keystone-6fb7949f77-2l9t7\" (UID: \"e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d\") " pod="openstack/keystone-6fb7949f77-2l9t7"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.485008    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz5lz\" (UniqueName: \"kubernetes.io/projected/e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d-kube-api-access-mz5lz\") pod \"keystone-6fb7949f77-2l9t7\" (UID: \"e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d\") " pod="openstack/keystone-6fb7949f77-2l9t7"
Mar 20 16:00:32 crc kubenswrapper[4730]: I0320 16:00:32.592241    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6fb7949f77-2l9t7"
Mar 20 16:00:33 crc kubenswrapper[4730]: I0320 16:00:33.056080    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6fb7949f77-2l9t7"]
Mar 20 16:00:33 crc kubenswrapper[4730]: W0320 16:00:33.059776    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2b9f0c5_80cc_4a4c_bbd8_c70cda9d5d3d.slice/crio-992a0a8a715f735a908f7e67a9aa86874d7aa020108d6ed3d3f2b9de33ca1de4 WatchSource:0}: Error finding container 992a0a8a715f735a908f7e67a9aa86874d7aa020108d6ed3d3f2b9de33ca1de4: Status 404 returned error can't find the container with id 992a0a8a715f735a908f7e67a9aa86874d7aa020108d6ed3d3f2b9de33ca1de4
Mar 20 16:00:33 crc kubenswrapper[4730]: I0320 16:00:33.128303    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6fb7949f77-2l9t7" event={"ID":"e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d","Type":"ContainerStarted","Data":"992a0a8a715f735a908f7e67a9aa86874d7aa020108d6ed3d3f2b9de33ca1de4"}
Mar 20 16:00:33 crc kubenswrapper[4730]: I0320 16:00:33.175570    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 20 16:00:33 crc kubenswrapper[4730]: I0320 16:00:33.175622    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 20 16:00:33 crc kubenswrapper[4730]: I0320 16:00:33.225686    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 20 16:00:33 crc kubenswrapper[4730]: I0320 16:00:33.271643    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 20 16:00:33 crc kubenswrapper[4730]: I0320 16:00:33.403738    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0"
Mar 20 16:00:33 crc kubenswrapper[4730]: I0320 16:00:33.429467    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0"
Mar 20 16:00:33 crc kubenswrapper[4730]: I0320 16:00:33.494593    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:33 crc kubenswrapper[4730]: I0320 16:00:33.494647    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:33 crc kubenswrapper[4730]: I0320 16:00:33.516698    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-x2t9r"
Mar 20 16:00:33 crc kubenswrapper[4730]: I0320 16:00:33.524393    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:33 crc kubenswrapper[4730]: I0320 16:00:33.547349    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:33 crc kubenswrapper[4730]: I0320 16:00:33.582038    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-config-data\") pod \"a05675d7-cd2f-4810-862b-cb0d2d13cbdd\" (UID: \"a05675d7-cd2f-4810-862b-cb0d2d13cbdd\") "
Mar 20 16:00:33 crc kubenswrapper[4730]: I0320 16:00:33.582102    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-logs\") pod \"a05675d7-cd2f-4810-862b-cb0d2d13cbdd\" (UID: \"a05675d7-cd2f-4810-862b-cb0d2d13cbdd\") "
Mar 20 16:00:33 crc kubenswrapper[4730]: I0320 16:00:33.582299    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxgg6\" (UniqueName: \"kubernetes.io/projected/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-kube-api-access-xxgg6\") pod \"a05675d7-cd2f-4810-862b-cb0d2d13cbdd\" (UID: \"a05675d7-cd2f-4810-862b-cb0d2d13cbdd\") "
Mar 20 16:00:33 crc kubenswrapper[4730]: I0320 16:00:33.582609    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-logs" (OuterVolumeSpecName: "logs") pod "a05675d7-cd2f-4810-862b-cb0d2d13cbdd" (UID: "a05675d7-cd2f-4810-862b-cb0d2d13cbdd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:00:33 crc kubenswrapper[4730]: I0320 16:00:33.582945    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-combined-ca-bundle\") pod \"a05675d7-cd2f-4810-862b-cb0d2d13cbdd\" (UID: \"a05675d7-cd2f-4810-862b-cb0d2d13cbdd\") "
Mar 20 16:00:33 crc kubenswrapper[4730]: I0320 16:00:33.583074    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-scripts\") pod \"a05675d7-cd2f-4810-862b-cb0d2d13cbdd\" (UID: \"a05675d7-cd2f-4810-862b-cb0d2d13cbdd\") "
Mar 20 16:00:33 crc kubenswrapper[4730]: I0320 16:00:33.583599    4730 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-logs\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:33 crc kubenswrapper[4730]: I0320 16:00:33.588379    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-scripts" (OuterVolumeSpecName: "scripts") pod "a05675d7-cd2f-4810-862b-cb0d2d13cbdd" (UID: "a05675d7-cd2f-4810-862b-cb0d2d13cbdd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:33 crc kubenswrapper[4730]: I0320 16:00:33.605241    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-kube-api-access-xxgg6" (OuterVolumeSpecName: "kube-api-access-xxgg6") pod "a05675d7-cd2f-4810-862b-cb0d2d13cbdd" (UID: "a05675d7-cd2f-4810-862b-cb0d2d13cbdd"). InnerVolumeSpecName "kube-api-access-xxgg6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:00:33 crc kubenswrapper[4730]: E0320 16:00:33.627139    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-combined-ca-bundle podName:a05675d7-cd2f-4810-862b-cb0d2d13cbdd nodeName:}" failed. No retries permitted until 2026-03-20 16:00:34.127110508 +0000 UTC m=+1293.340481877 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-combined-ca-bundle") pod "a05675d7-cd2f-4810-862b-cb0d2d13cbdd" (UID: "a05675d7-cd2f-4810-862b-cb0d2d13cbdd") : error deleting /var/lib/kubelet/pods/a05675d7-cd2f-4810-862b-cb0d2d13cbdd/volume-subpaths: remove /var/lib/kubelet/pods/a05675d7-cd2f-4810-862b-cb0d2d13cbdd/volume-subpaths: no such file or directory
Mar 20 16:00:33 crc kubenswrapper[4730]: I0320 16:00:33.629821    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-config-data" (OuterVolumeSpecName: "config-data") pod "a05675d7-cd2f-4810-862b-cb0d2d13cbdd" (UID: "a05675d7-cd2f-4810-862b-cb0d2d13cbdd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:33 crc kubenswrapper[4730]: I0320 16:00:33.685634    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxgg6\" (UniqueName: \"kubernetes.io/projected/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-kube-api-access-xxgg6\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:33 crc kubenswrapper[4730]: I0320 16:00:33.685682    4730 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:33 crc kubenswrapper[4730]: I0320 16:00:33.685696    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.147654    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-x2t9r" event={"ID":"a05675d7-cd2f-4810-862b-cb0d2d13cbdd","Type":"ContainerDied","Data":"31cc41440870730294a7b216b0c2b45c7a76296191729b22f502e1990a4511bd"}
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.147708    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31cc41440870730294a7b216b0c2b45c7a76296191729b22f502e1990a4511bd"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.147795    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-x2t9r"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.149396    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6fb7949f77-2l9t7" event={"ID":"e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d","Type":"ContainerStarted","Data":"b29c80237578b12b920e2e64f747c1b3f7795837afb2efdcf749183f1cac741c"}
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.149869    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6fb7949f77-2l9t7"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.151038    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.151059    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.152986    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.153030    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.185083    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6fb7949f77-2l9t7" podStartSLOduration=2.185064266 podStartE2EDuration="2.185064266s" podCreationTimestamp="2026-03-20 16:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:00:34.181469556 +0000 UTC m=+1293.394840925" watchObservedRunningTime="2026-03-20 16:00:34.185064266 +0000 UTC m=+1293.398435645"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.192429    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-combined-ca-bundle\") pod \"a05675d7-cd2f-4810-862b-cb0d2d13cbdd\" (UID: \"a05675d7-cd2f-4810-862b-cb0d2d13cbdd\") "
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.197047    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a05675d7-cd2f-4810-862b-cb0d2d13cbdd" (UID: "a05675d7-cd2f-4810-862b-cb0d2d13cbdd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.245037    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.289195    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-78b446cdb6-zs6nw"]
Mar 20 16:00:34 crc kubenswrapper[4730]: E0320 16:00:34.289627    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a05675d7-cd2f-4810-862b-cb0d2d13cbdd" containerName="placement-db-sync"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.289650    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="a05675d7-cd2f-4810-862b-cb0d2d13cbdd" containerName="placement-db-sync"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.289848    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="a05675d7-cd2f-4810-862b-cb0d2d13cbdd" containerName="placement-db-sync"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.290800    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-78b446cdb6-zs6nw"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.297603    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a05675d7-cd2f-4810-862b-cb0d2d13cbdd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.297955    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.298346    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.302122    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-78b446cdb6-zs6nw"]
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.399182    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2885c5d-681f-4e22-bdeb-b716957d83e1-logs\") pod \"placement-78b446cdb6-zs6nw\" (UID: \"d2885c5d-681f-4e22-bdeb-b716957d83e1\") " pod="openstack/placement-78b446cdb6-zs6nw"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.399312    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2885c5d-681f-4e22-bdeb-b716957d83e1-scripts\") pod \"placement-78b446cdb6-zs6nw\" (UID: \"d2885c5d-681f-4e22-bdeb-b716957d83e1\") " pod="openstack/placement-78b446cdb6-zs6nw"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.399361    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2885c5d-681f-4e22-bdeb-b716957d83e1-combined-ca-bundle\") pod \"placement-78b446cdb6-zs6nw\" (UID: \"d2885c5d-681f-4e22-bdeb-b716957d83e1\") " pod="openstack/placement-78b446cdb6-zs6nw"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.399458    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2885c5d-681f-4e22-bdeb-b716957d83e1-config-data\") pod \"placement-78b446cdb6-zs6nw\" (UID: \"d2885c5d-681f-4e22-bdeb-b716957d83e1\") " pod="openstack/placement-78b446cdb6-zs6nw"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.399496    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmk8n\" (UniqueName: \"kubernetes.io/projected/d2885c5d-681f-4e22-bdeb-b716957d83e1-kube-api-access-qmk8n\") pod \"placement-78b446cdb6-zs6nw\" (UID: \"d2885c5d-681f-4e22-bdeb-b716957d83e1\") " pod="openstack/placement-78b446cdb6-zs6nw"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.399518    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2885c5d-681f-4e22-bdeb-b716957d83e1-public-tls-certs\") pod \"placement-78b446cdb6-zs6nw\" (UID: \"d2885c5d-681f-4e22-bdeb-b716957d83e1\") " pod="openstack/placement-78b446cdb6-zs6nw"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.399576    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2885c5d-681f-4e22-bdeb-b716957d83e1-internal-tls-certs\") pod \"placement-78b446cdb6-zs6nw\" (UID: \"d2885c5d-681f-4e22-bdeb-b716957d83e1\") " pod="openstack/placement-78b446cdb6-zs6nw"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.501812    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2885c5d-681f-4e22-bdeb-b716957d83e1-combined-ca-bundle\") pod \"placement-78b446cdb6-zs6nw\" (UID: \"d2885c5d-681f-4e22-bdeb-b716957d83e1\") " pod="openstack/placement-78b446cdb6-zs6nw"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.501911    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2885c5d-681f-4e22-bdeb-b716957d83e1-config-data\") pod \"placement-78b446cdb6-zs6nw\" (UID: \"d2885c5d-681f-4e22-bdeb-b716957d83e1\") " pod="openstack/placement-78b446cdb6-zs6nw"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.501942    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmk8n\" (UniqueName: \"kubernetes.io/projected/d2885c5d-681f-4e22-bdeb-b716957d83e1-kube-api-access-qmk8n\") pod \"placement-78b446cdb6-zs6nw\" (UID: \"d2885c5d-681f-4e22-bdeb-b716957d83e1\") " pod="openstack/placement-78b446cdb6-zs6nw"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.501968    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2885c5d-681f-4e22-bdeb-b716957d83e1-public-tls-certs\") pod \"placement-78b446cdb6-zs6nw\" (UID: \"d2885c5d-681f-4e22-bdeb-b716957d83e1\") " pod="openstack/placement-78b446cdb6-zs6nw"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.502008    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2885c5d-681f-4e22-bdeb-b716957d83e1-internal-tls-certs\") pod \"placement-78b446cdb6-zs6nw\" (UID: \"d2885c5d-681f-4e22-bdeb-b716957d83e1\") " pod="openstack/placement-78b446cdb6-zs6nw"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.502062    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2885c5d-681f-4e22-bdeb-b716957d83e1-logs\") pod \"placement-78b446cdb6-zs6nw\" (UID: \"d2885c5d-681f-4e22-bdeb-b716957d83e1\") " pod="openstack/placement-78b446cdb6-zs6nw"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.502116    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2885c5d-681f-4e22-bdeb-b716957d83e1-scripts\") pod \"placement-78b446cdb6-zs6nw\" (UID: \"d2885c5d-681f-4e22-bdeb-b716957d83e1\") " pod="openstack/placement-78b446cdb6-zs6nw"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.518870    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2885c5d-681f-4e22-bdeb-b716957d83e1-logs\") pod \"placement-78b446cdb6-zs6nw\" (UID: \"d2885c5d-681f-4e22-bdeb-b716957d83e1\") " pod="openstack/placement-78b446cdb6-zs6nw"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.521789    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2885c5d-681f-4e22-bdeb-b716957d83e1-scripts\") pod \"placement-78b446cdb6-zs6nw\" (UID: \"d2885c5d-681f-4e22-bdeb-b716957d83e1\") " pod="openstack/placement-78b446cdb6-zs6nw"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.522835    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2885c5d-681f-4e22-bdeb-b716957d83e1-internal-tls-certs\") pod \"placement-78b446cdb6-zs6nw\" (UID: \"d2885c5d-681f-4e22-bdeb-b716957d83e1\") " pod="openstack/placement-78b446cdb6-zs6nw"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.524380    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2885c5d-681f-4e22-bdeb-b716957d83e1-combined-ca-bundle\") pod \"placement-78b446cdb6-zs6nw\" (UID: \"d2885c5d-681f-4e22-bdeb-b716957d83e1\") " pod="openstack/placement-78b446cdb6-zs6nw"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.524429    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2885c5d-681f-4e22-bdeb-b716957d83e1-config-data\") pod \"placement-78b446cdb6-zs6nw\" (UID: \"d2885c5d-681f-4e22-bdeb-b716957d83e1\") " pod="openstack/placement-78b446cdb6-zs6nw"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.525327    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2885c5d-681f-4e22-bdeb-b716957d83e1-public-tls-certs\") pod \"placement-78b446cdb6-zs6nw\" (UID: \"d2885c5d-681f-4e22-bdeb-b716957d83e1\") " pod="openstack/placement-78b446cdb6-zs6nw"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.537713    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmk8n\" (UniqueName: \"kubernetes.io/projected/d2885c5d-681f-4e22-bdeb-b716957d83e1-kube-api-access-qmk8n\") pod \"placement-78b446cdb6-zs6nw\" (UID: \"d2885c5d-681f-4e22-bdeb-b716957d83e1\") " pod="openstack/placement-78b446cdb6-zs6nw"
Mar 20 16:00:34 crc kubenswrapper[4730]: I0320 16:00:34.609762    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-78b446cdb6-zs6nw"
Mar 20 16:00:35 crc kubenswrapper[4730]: I0320 16:00:35.164878    4730 generic.go:334] "Generic (PLEG): container finished" podID="48fc8af0-e30f-4f3f-88d3-8b054c6359ef" containerID="59a0ed1595de1b0849599bb5a7c10e7cfbb46ad061c13c2ab2d12fc1bc355373" exitCode=0
Mar 20 16:00:35 crc kubenswrapper[4730]: I0320 16:00:35.164952    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tz6x7" event={"ID":"48fc8af0-e30f-4f3f-88d3-8b054c6359ef","Type":"ContainerDied","Data":"59a0ed1595de1b0849599bb5a7c10e7cfbb46ad061c13c2ab2d12fc1bc355373"}
Mar 20 16:00:35 crc kubenswrapper[4730]: I0320 16:00:35.206593    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-78b446cdb6-zs6nw"]
Mar 20 16:00:35 crc kubenswrapper[4730]: W0320 16:00:35.210204    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2885c5d_681f_4e22_bdeb_b716957d83e1.slice/crio-421f8aaa4b2b066ba0db639897a31bc1c26529226105978e2e6c3ddf0c3f4ce8 WatchSource:0}: Error finding container 421f8aaa4b2b066ba0db639897a31bc1c26529226105978e2e6c3ddf0c3f4ce8: Status 404 returned error can't find the container with id 421f8aaa4b2b066ba0db639897a31bc1c26529226105978e2e6c3ddf0c3f4ce8
Mar 20 16:00:35 crc kubenswrapper[4730]: I0320 16:00:35.979164    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.140051    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/468631ad-821b-469e-a166-1d32d370e5fa-config-data\") pod \"468631ad-821b-469e-a166-1d32d370e5fa\" (UID: \"468631ad-821b-469e-a166-1d32d370e5fa\") "
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.140192    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnvh7\" (UniqueName: \"kubernetes.io/projected/468631ad-821b-469e-a166-1d32d370e5fa-kube-api-access-jnvh7\") pod \"468631ad-821b-469e-a166-1d32d370e5fa\" (UID: \"468631ad-821b-469e-a166-1d32d370e5fa\") "
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.140294    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/468631ad-821b-469e-a166-1d32d370e5fa-combined-ca-bundle\") pod \"468631ad-821b-469e-a166-1d32d370e5fa\" (UID: \"468631ad-821b-469e-a166-1d32d370e5fa\") "
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.140356    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/468631ad-821b-469e-a166-1d32d370e5fa-logs\") pod \"468631ad-821b-469e-a166-1d32d370e5fa\" (UID: \"468631ad-821b-469e-a166-1d32d370e5fa\") "
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.140435    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/468631ad-821b-469e-a166-1d32d370e5fa-custom-prometheus-ca\") pod \"468631ad-821b-469e-a166-1d32d370e5fa\" (UID: \"468631ad-821b-469e-a166-1d32d370e5fa\") "
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.140867    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/468631ad-821b-469e-a166-1d32d370e5fa-logs" (OuterVolumeSpecName: "logs") pod "468631ad-821b-469e-a166-1d32d370e5fa" (UID: "468631ad-821b-469e-a166-1d32d370e5fa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.148569    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/468631ad-821b-469e-a166-1d32d370e5fa-kube-api-access-jnvh7" (OuterVolumeSpecName: "kube-api-access-jnvh7") pod "468631ad-821b-469e-a166-1d32d370e5fa" (UID: "468631ad-821b-469e-a166-1d32d370e5fa"). InnerVolumeSpecName "kube-api-access-jnvh7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.178602    4730 generic.go:334] "Generic (PLEG): container finished" podID="468631ad-821b-469e-a166-1d32d370e5fa" containerID="3f2599acfa566a2a6933c070fb527ae037e48d58578163cf559260ee0ee91126" exitCode=137
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.178729    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.178798    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"468631ad-821b-469e-a166-1d32d370e5fa","Type":"ContainerDied","Data":"3f2599acfa566a2a6933c070fb527ae037e48d58578163cf559260ee0ee91126"}
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.178854    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"468631ad-821b-469e-a166-1d32d370e5fa","Type":"ContainerDied","Data":"61de70dac9ca369e87e9de4023e0b6d3ab234abdc4d63ca022c46c3f926b57cd"}
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.178874    4730 scope.go:117] "RemoveContainer" containerID="3f2599acfa566a2a6933c070fb527ae037e48d58578163cf559260ee0ee91126"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.184640    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78b446cdb6-zs6nw" event={"ID":"d2885c5d-681f-4e22-bdeb-b716957d83e1","Type":"ContainerStarted","Data":"c7752714919e7683b19fa98a27dc456cceb1ff84e231d081c0bf6cdf50a72206"}
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.184681    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78b446cdb6-zs6nw" event={"ID":"d2885c5d-681f-4e22-bdeb-b716957d83e1","Type":"ContainerStarted","Data":"7eb26d82b530714f030687355ff1803daa124db098cf71b7400758d270fece6e"}
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.184697    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78b446cdb6-zs6nw" event={"ID":"d2885c5d-681f-4e22-bdeb-b716957d83e1","Type":"ContainerStarted","Data":"421f8aaa4b2b066ba0db639897a31bc1c26529226105978e2e6c3ddf0c3f4ce8"}
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.184714    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-78b446cdb6-zs6nw"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.184767    4730 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.184777    4730 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.185365    4730 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.185383    4730 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.185932    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-78b446cdb6-zs6nw"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.186479    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/468631ad-821b-469e-a166-1d32d370e5fa-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "468631ad-821b-469e-a166-1d32d370e5fa" (UID: "468631ad-821b-469e-a166-1d32d370e5fa"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.188167    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/468631ad-821b-469e-a166-1d32d370e5fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "468631ad-821b-469e-a166-1d32d370e5fa" (UID: "468631ad-821b-469e-a166-1d32d370e5fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.216690    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/468631ad-821b-469e-a166-1d32d370e5fa-config-data" (OuterVolumeSpecName: "config-data") pod "468631ad-821b-469e-a166-1d32d370e5fa" (UID: "468631ad-821b-469e-a166-1d32d370e5fa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.219477    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-78b446cdb6-zs6nw" podStartSLOduration=2.219462741 podStartE2EDuration="2.219462741s" podCreationTimestamp="2026-03-20 16:00:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:00:36.213806865 +0000 UTC m=+1295.427178234" watchObservedRunningTime="2026-03-20 16:00:36.219462741 +0000 UTC m=+1295.432834110"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.244376    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/468631ad-821b-469e-a166-1d32d370e5fa-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.244408    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnvh7\" (UniqueName: \"kubernetes.io/projected/468631ad-821b-469e-a166-1d32d370e5fa-kube-api-access-jnvh7\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.244419    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/468631ad-821b-469e-a166-1d32d370e5fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.244429    4730 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/468631ad-821b-469e-a166-1d32d370e5fa-logs\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.244438    4730 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/468631ad-821b-469e-a166-1d32d370e5fa-custom-prometheus-ca\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.342393    4730 scope.go:117] "RemoveContainer" containerID="f7dc5000437c6d89457d868ca7a318170322b81dfd62eee1c3bc3df85ceba4e5"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.366931    4730 scope.go:117] "RemoveContainer" containerID="3f2599acfa566a2a6933c070fb527ae037e48d58578163cf559260ee0ee91126"
Mar 20 16:00:36 crc kubenswrapper[4730]: E0320 16:00:36.372088    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f2599acfa566a2a6933c070fb527ae037e48d58578163cf559260ee0ee91126\": container with ID starting with 3f2599acfa566a2a6933c070fb527ae037e48d58578163cf559260ee0ee91126 not found: ID does not exist" containerID="3f2599acfa566a2a6933c070fb527ae037e48d58578163cf559260ee0ee91126"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.372137    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f2599acfa566a2a6933c070fb527ae037e48d58578163cf559260ee0ee91126"} err="failed to get container status \"3f2599acfa566a2a6933c070fb527ae037e48d58578163cf559260ee0ee91126\": rpc error: code = NotFound desc = could not find container \"3f2599acfa566a2a6933c070fb527ae037e48d58578163cf559260ee0ee91126\": container with ID starting with 3f2599acfa566a2a6933c070fb527ae037e48d58578163cf559260ee0ee91126 not found: ID does not exist"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.372174    4730 scope.go:117] "RemoveContainer" containerID="f7dc5000437c6d89457d868ca7a318170322b81dfd62eee1c3bc3df85ceba4e5"
Mar 20 16:00:36 crc kubenswrapper[4730]: E0320 16:00:36.372514    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7dc5000437c6d89457d868ca7a318170322b81dfd62eee1c3bc3df85ceba4e5\": container with ID starting with f7dc5000437c6d89457d868ca7a318170322b81dfd62eee1c3bc3df85ceba4e5 not found: ID does not exist" containerID="f7dc5000437c6d89457d868ca7a318170322b81dfd62eee1c3bc3df85ceba4e5"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.372561    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7dc5000437c6d89457d868ca7a318170322b81dfd62eee1c3bc3df85ceba4e5"} err="failed to get container status \"f7dc5000437c6d89457d868ca7a318170322b81dfd62eee1c3bc3df85ceba4e5\": rpc error: code = NotFound desc = could not find container \"f7dc5000437c6d89457d868ca7a318170322b81dfd62eee1c3bc3df85ceba4e5\": container with ID starting with f7dc5000437c6d89457d868ca7a318170322b81dfd62eee1c3bc3df85ceba4e5 not found: ID does not exist"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.512003    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-tz6x7"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.554403    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"]
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.572442    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"]
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.593690    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"]
Mar 20 16:00:36 crc kubenswrapper[4730]: E0320 16:00:36.594098    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="468631ad-821b-469e-a166-1d32d370e5fa" containerName="watcher-api-log"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.594117    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="468631ad-821b-469e-a166-1d32d370e5fa" containerName="watcher-api-log"
Mar 20 16:00:36 crc kubenswrapper[4730]: E0320 16:00:36.594149    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48fc8af0-e30f-4f3f-88d3-8b054c6359ef" containerName="barbican-db-sync"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.594156    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="48fc8af0-e30f-4f3f-88d3-8b054c6359ef" containerName="barbican-db-sync"
Mar 20 16:00:36 crc kubenswrapper[4730]: E0320 16:00:36.594170    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="468631ad-821b-469e-a166-1d32d370e5fa" containerName="watcher-api"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.594177    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="468631ad-821b-469e-a166-1d32d370e5fa" containerName="watcher-api"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.594409    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="48fc8af0-e30f-4f3f-88d3-8b054c6359ef" containerName="barbican-db-sync"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.594428    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="468631ad-821b-469e-a166-1d32d370e5fa" containerName="watcher-api"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.594440    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="468631ad-821b-469e-a166-1d32d370e5fa" containerName="watcher-api-log"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.595718    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.599742    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.602705    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.664933    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48fc8af0-e30f-4f3f-88d3-8b054c6359ef-combined-ca-bundle\") pod \"48fc8af0-e30f-4f3f-88d3-8b054c6359ef\" (UID: \"48fc8af0-e30f-4f3f-88d3-8b054c6359ef\") "
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.665017    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/48fc8af0-e30f-4f3f-88d3-8b054c6359ef-db-sync-config-data\") pod \"48fc8af0-e30f-4f3f-88d3-8b054c6359ef\" (UID: \"48fc8af0-e30f-4f3f-88d3-8b054c6359ef\") "
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.665100    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bggsr\" (UniqueName: \"kubernetes.io/projected/48fc8af0-e30f-4f3f-88d3-8b054c6359ef-kube-api-access-bggsr\") pod \"48fc8af0-e30f-4f3f-88d3-8b054c6359ef\" (UID: \"48fc8af0-e30f-4f3f-88d3-8b054c6359ef\") "
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.669765    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48fc8af0-e30f-4f3f-88d3-8b054c6359ef-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "48fc8af0-e30f-4f3f-88d3-8b054c6359ef" (UID: "48fc8af0-e30f-4f3f-88d3-8b054c6359ef"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.669957    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48fc8af0-e30f-4f3f-88d3-8b054c6359ef-kube-api-access-bggsr" (OuterVolumeSpecName: "kube-api-access-bggsr") pod "48fc8af0-e30f-4f3f-88d3-8b054c6359ef" (UID: "48fc8af0-e30f-4f3f-88d3-8b054c6359ef"). InnerVolumeSpecName "kube-api-access-bggsr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.689555    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48fc8af0-e30f-4f3f-88d3-8b054c6359ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48fc8af0-e30f-4f3f-88d3-8b054c6359ef" (UID: "48fc8af0-e30f-4f3f-88d3-8b054c6359ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.747137    4730 scope.go:117] "RemoveContainer" containerID="cd338cd8acc0dfd58bb17dd35d4fa074369101fa940bc1e78ceafdde3c9aa8ec"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.767422    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2lq6\" (UniqueName: \"kubernetes.io/projected/aabd3bd6-2cee-47b8-9174-ad9ea1415e82-kube-api-access-m2lq6\") pod \"watcher-api-0\" (UID: \"aabd3bd6-2cee-47b8-9174-ad9ea1415e82\") " pod="openstack/watcher-api-0"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.767524    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aabd3bd6-2cee-47b8-9174-ad9ea1415e82-config-data\") pod \"watcher-api-0\" (UID: \"aabd3bd6-2cee-47b8-9174-ad9ea1415e82\") " pod="openstack/watcher-api-0"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.767672    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aabd3bd6-2cee-47b8-9174-ad9ea1415e82-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"aabd3bd6-2cee-47b8-9174-ad9ea1415e82\") " pod="openstack/watcher-api-0"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.767705    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aabd3bd6-2cee-47b8-9174-ad9ea1415e82-logs\") pod \"watcher-api-0\" (UID: \"aabd3bd6-2cee-47b8-9174-ad9ea1415e82\") " pod="openstack/watcher-api-0"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.767725    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aabd3bd6-2cee-47b8-9174-ad9ea1415e82-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"aabd3bd6-2cee-47b8-9174-ad9ea1415e82\") " pod="openstack/watcher-api-0"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.767811    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48fc8af0-e30f-4f3f-88d3-8b054c6359ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.767822    4730 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/48fc8af0-e30f-4f3f-88d3-8b054c6359ef-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.767831    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bggsr\" (UniqueName: \"kubernetes.io/projected/48fc8af0-e30f-4f3f-88d3-8b054c6359ef-kube-api-access-bggsr\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.869635    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2lq6\" (UniqueName: \"kubernetes.io/projected/aabd3bd6-2cee-47b8-9174-ad9ea1415e82-kube-api-access-m2lq6\") pod \"watcher-api-0\" (UID: \"aabd3bd6-2cee-47b8-9174-ad9ea1415e82\") " pod="openstack/watcher-api-0"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.870045    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aabd3bd6-2cee-47b8-9174-ad9ea1415e82-config-data\") pod \"watcher-api-0\" (UID: \"aabd3bd6-2cee-47b8-9174-ad9ea1415e82\") " pod="openstack/watcher-api-0"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.870145    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aabd3bd6-2cee-47b8-9174-ad9ea1415e82-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"aabd3bd6-2cee-47b8-9174-ad9ea1415e82\") " pod="openstack/watcher-api-0"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.870162    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aabd3bd6-2cee-47b8-9174-ad9ea1415e82-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"aabd3bd6-2cee-47b8-9174-ad9ea1415e82\") " pod="openstack/watcher-api-0"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.870177    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aabd3bd6-2cee-47b8-9174-ad9ea1415e82-logs\") pod \"watcher-api-0\" (UID: \"aabd3bd6-2cee-47b8-9174-ad9ea1415e82\") " pod="openstack/watcher-api-0"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.870877    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aabd3bd6-2cee-47b8-9174-ad9ea1415e82-logs\") pod \"watcher-api-0\" (UID: \"aabd3bd6-2cee-47b8-9174-ad9ea1415e82\") " pod="openstack/watcher-api-0"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.892481    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aabd3bd6-2cee-47b8-9174-ad9ea1415e82-config-data\") pod \"watcher-api-0\" (UID: \"aabd3bd6-2cee-47b8-9174-ad9ea1415e82\") " pod="openstack/watcher-api-0"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.892553    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aabd3bd6-2cee-47b8-9174-ad9ea1415e82-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"aabd3bd6-2cee-47b8-9174-ad9ea1415e82\") " pod="openstack/watcher-api-0"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.893039    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aabd3bd6-2cee-47b8-9174-ad9ea1415e82-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"aabd3bd6-2cee-47b8-9174-ad9ea1415e82\") " pod="openstack/watcher-api-0"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.894806    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2lq6\" (UniqueName: \"kubernetes.io/projected/aabd3bd6-2cee-47b8-9174-ad9ea1415e82-kube-api-access-m2lq6\") pod \"watcher-api-0\" (UID: \"aabd3bd6-2cee-47b8-9174-ad9ea1415e82\") " pod="openstack/watcher-api-0"
Mar 20 16:00:36 crc kubenswrapper[4730]: I0320 16:00:36.915115    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.105307    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.107550    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.131187    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.131869    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.220673    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tz6x7" event={"ID":"48fc8af0-e30f-4f3f-88d3-8b054c6359ef","Type":"ContainerDied","Data":"af7beef135dc284222c89d8da5556d80f3f65072f0a1d94b5d3c1cbd5f0ae59a"}
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.220717    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af7beef135dc284222c89d8da5556d80f3f65072f0a1d94b5d3c1cbd5f0ae59a"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.220780    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-tz6x7"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.438229    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-86bc9f54b4-6szxq"]
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.439696    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-86bc9f54b4-6szxq"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.451561    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.451834    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-wcvgq"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.453446    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.482301    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-54b9958865-vn9kj"]
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.483851    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-54b9958865-vn9kj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.487223    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.493396    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-86bc9f54b4-6szxq"]
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.520294    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-54b9958865-vn9kj"]
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.597374    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="468631ad-821b-469e-a166-1d32d370e5fa" path="/var/lib/kubelet/pods/468631ad-821b-469e-a166-1d32d370e5fa/volumes"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.598102    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6dccc4d8b9-qqxcj"]
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.602201    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dccc4d8b9-qqxcj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.613505    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dccc4d8b9-qqxcj"]
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.627833    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f8c40f6-c8d3-4c8c-97eb-643d32774174-config-data\") pod \"barbican-worker-54b9958865-vn9kj\" (UID: \"8f8c40f6-c8d3-4c8c-97eb-643d32774174\") " pod="openstack/barbican-worker-54b9958865-vn9kj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.627918    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4dfee88-47ff-4e8b-9f46-60cc17fb0080-config-data-custom\") pod \"barbican-keystone-listener-86bc9f54b4-6szxq\" (UID: \"e4dfee88-47ff-4e8b-9f46-60cc17fb0080\") " pod="openstack/barbican-keystone-listener-86bc9f54b4-6szxq"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.627954    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4dfee88-47ff-4e8b-9f46-60cc17fb0080-logs\") pod \"barbican-keystone-listener-86bc9f54b4-6szxq\" (UID: \"e4dfee88-47ff-4e8b-9f46-60cc17fb0080\") " pod="openstack/barbican-keystone-listener-86bc9f54b4-6szxq"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.627972    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc765c5c-7def-4230-b500-d6410c2da475-dns-svc\") pod \"dnsmasq-dns-6dccc4d8b9-qqxcj\" (UID: \"fc765c5c-7def-4230-b500-d6410c2da475\") " pod="openstack/dnsmasq-dns-6dccc4d8b9-qqxcj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.627988    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc765c5c-7def-4230-b500-d6410c2da475-config\") pod \"dnsmasq-dns-6dccc4d8b9-qqxcj\" (UID: \"fc765c5c-7def-4230-b500-d6410c2da475\") " pod="openstack/dnsmasq-dns-6dccc4d8b9-qqxcj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.628012    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9k6g\" (UniqueName: \"kubernetes.io/projected/fc765c5c-7def-4230-b500-d6410c2da475-kube-api-access-q9k6g\") pod \"dnsmasq-dns-6dccc4d8b9-qqxcj\" (UID: \"fc765c5c-7def-4230-b500-d6410c2da475\") " pod="openstack/dnsmasq-dns-6dccc4d8b9-qqxcj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.628029    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f8c40f6-c8d3-4c8c-97eb-643d32774174-config-data-custom\") pod \"barbican-worker-54b9958865-vn9kj\" (UID: \"8f8c40f6-c8d3-4c8c-97eb-643d32774174\") " pod="openstack/barbican-worker-54b9958865-vn9kj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.628043    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvpbl\" (UniqueName: \"kubernetes.io/projected/8f8c40f6-c8d3-4c8c-97eb-643d32774174-kube-api-access-rvpbl\") pod \"barbican-worker-54b9958865-vn9kj\" (UID: \"8f8c40f6-c8d3-4c8c-97eb-643d32774174\") " pod="openstack/barbican-worker-54b9958865-vn9kj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.628058    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4dfee88-47ff-4e8b-9f46-60cc17fb0080-config-data\") pod \"barbican-keystone-listener-86bc9f54b4-6szxq\" (UID: \"e4dfee88-47ff-4e8b-9f46-60cc17fb0080\") " pod="openstack/barbican-keystone-listener-86bc9f54b4-6szxq"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.628076    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc765c5c-7def-4230-b500-d6410c2da475-ovsdbserver-sb\") pod \"dnsmasq-dns-6dccc4d8b9-qqxcj\" (UID: \"fc765c5c-7def-4230-b500-d6410c2da475\") " pod="openstack/dnsmasq-dns-6dccc4d8b9-qqxcj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.628101    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8c40f6-c8d3-4c8c-97eb-643d32774174-combined-ca-bundle\") pod \"barbican-worker-54b9958865-vn9kj\" (UID: \"8f8c40f6-c8d3-4c8c-97eb-643d32774174\") " pod="openstack/barbican-worker-54b9958865-vn9kj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.628131    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4dfee88-47ff-4e8b-9f46-60cc17fb0080-combined-ca-bundle\") pod \"barbican-keystone-listener-86bc9f54b4-6szxq\" (UID: \"e4dfee88-47ff-4e8b-9f46-60cc17fb0080\") " pod="openstack/barbican-keystone-listener-86bc9f54b4-6szxq"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.628160    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc765c5c-7def-4230-b500-d6410c2da475-ovsdbserver-nb\") pod \"dnsmasq-dns-6dccc4d8b9-qqxcj\" (UID: \"fc765c5c-7def-4230-b500-d6410c2da475\") " pod="openstack/dnsmasq-dns-6dccc4d8b9-qqxcj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.628174    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f8c40f6-c8d3-4c8c-97eb-643d32774174-logs\") pod \"barbican-worker-54b9958865-vn9kj\" (UID: \"8f8c40f6-c8d3-4c8c-97eb-643d32774174\") " pod="openstack/barbican-worker-54b9958865-vn9kj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.628188    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcckp\" (UniqueName: \"kubernetes.io/projected/e4dfee88-47ff-4e8b-9f46-60cc17fb0080-kube-api-access-jcckp\") pod \"barbican-keystone-listener-86bc9f54b4-6szxq\" (UID: \"e4dfee88-47ff-4e8b-9f46-60cc17fb0080\") " pod="openstack/barbican-keystone-listener-86bc9f54b4-6szxq"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.628205    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc765c5c-7def-4230-b500-d6410c2da475-dns-swift-storage-0\") pod \"dnsmasq-dns-6dccc4d8b9-qqxcj\" (UID: \"fc765c5c-7def-4230-b500-d6410c2da475\") " pod="openstack/dnsmasq-dns-6dccc4d8b9-qqxcj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.730614    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc765c5c-7def-4230-b500-d6410c2da475-ovsdbserver-nb\") pod \"dnsmasq-dns-6dccc4d8b9-qqxcj\" (UID: \"fc765c5c-7def-4230-b500-d6410c2da475\") " pod="openstack/dnsmasq-dns-6dccc4d8b9-qqxcj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.730662    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f8c40f6-c8d3-4c8c-97eb-643d32774174-logs\") pod \"barbican-worker-54b9958865-vn9kj\" (UID: \"8f8c40f6-c8d3-4c8c-97eb-643d32774174\") " pod="openstack/barbican-worker-54b9958865-vn9kj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.730687    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcckp\" (UniqueName: \"kubernetes.io/projected/e4dfee88-47ff-4e8b-9f46-60cc17fb0080-kube-api-access-jcckp\") pod \"barbican-keystone-listener-86bc9f54b4-6szxq\" (UID: \"e4dfee88-47ff-4e8b-9f46-60cc17fb0080\") " pod="openstack/barbican-keystone-listener-86bc9f54b4-6szxq"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.730716    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc765c5c-7def-4230-b500-d6410c2da475-dns-swift-storage-0\") pod \"dnsmasq-dns-6dccc4d8b9-qqxcj\" (UID: \"fc765c5c-7def-4230-b500-d6410c2da475\") " pod="openstack/dnsmasq-dns-6dccc4d8b9-qqxcj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.730770    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f8c40f6-c8d3-4c8c-97eb-643d32774174-config-data\") pod \"barbican-worker-54b9958865-vn9kj\" (UID: \"8f8c40f6-c8d3-4c8c-97eb-643d32774174\") " pod="openstack/barbican-worker-54b9958865-vn9kj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.730837    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4dfee88-47ff-4e8b-9f46-60cc17fb0080-config-data-custom\") pod \"barbican-keystone-listener-86bc9f54b4-6szxq\" (UID: \"e4dfee88-47ff-4e8b-9f46-60cc17fb0080\") " pod="openstack/barbican-keystone-listener-86bc9f54b4-6szxq"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.730871    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4dfee88-47ff-4e8b-9f46-60cc17fb0080-logs\") pod \"barbican-keystone-listener-86bc9f54b4-6szxq\" (UID: \"e4dfee88-47ff-4e8b-9f46-60cc17fb0080\") " pod="openstack/barbican-keystone-listener-86bc9f54b4-6szxq"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.730890    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc765c5c-7def-4230-b500-d6410c2da475-dns-svc\") pod \"dnsmasq-dns-6dccc4d8b9-qqxcj\" (UID: \"fc765c5c-7def-4230-b500-d6410c2da475\") " pod="openstack/dnsmasq-dns-6dccc4d8b9-qqxcj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.730908    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc765c5c-7def-4230-b500-d6410c2da475-config\") pod \"dnsmasq-dns-6dccc4d8b9-qqxcj\" (UID: \"fc765c5c-7def-4230-b500-d6410c2da475\") " pod="openstack/dnsmasq-dns-6dccc4d8b9-qqxcj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.730935    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9k6g\" (UniqueName: \"kubernetes.io/projected/fc765c5c-7def-4230-b500-d6410c2da475-kube-api-access-q9k6g\") pod \"dnsmasq-dns-6dccc4d8b9-qqxcj\" (UID: \"fc765c5c-7def-4230-b500-d6410c2da475\") " pod="openstack/dnsmasq-dns-6dccc4d8b9-qqxcj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.730949    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f8c40f6-c8d3-4c8c-97eb-643d32774174-config-data-custom\") pod \"barbican-worker-54b9958865-vn9kj\" (UID: \"8f8c40f6-c8d3-4c8c-97eb-643d32774174\") " pod="openstack/barbican-worker-54b9958865-vn9kj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.730965    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvpbl\" (UniqueName: \"kubernetes.io/projected/8f8c40f6-c8d3-4c8c-97eb-643d32774174-kube-api-access-rvpbl\") pod \"barbican-worker-54b9958865-vn9kj\" (UID: \"8f8c40f6-c8d3-4c8c-97eb-643d32774174\") " pod="openstack/barbican-worker-54b9958865-vn9kj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.730983    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4dfee88-47ff-4e8b-9f46-60cc17fb0080-config-data\") pod \"barbican-keystone-listener-86bc9f54b4-6szxq\" (UID: \"e4dfee88-47ff-4e8b-9f46-60cc17fb0080\") " pod="openstack/barbican-keystone-listener-86bc9f54b4-6szxq"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.730999    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc765c5c-7def-4230-b500-d6410c2da475-ovsdbserver-sb\") pod \"dnsmasq-dns-6dccc4d8b9-qqxcj\" (UID: \"fc765c5c-7def-4230-b500-d6410c2da475\") " pod="openstack/dnsmasq-dns-6dccc4d8b9-qqxcj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.731047    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8c40f6-c8d3-4c8c-97eb-643d32774174-combined-ca-bundle\") pod \"barbican-worker-54b9958865-vn9kj\" (UID: \"8f8c40f6-c8d3-4c8c-97eb-643d32774174\") " pod="openstack/barbican-worker-54b9958865-vn9kj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.731077    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4dfee88-47ff-4e8b-9f46-60cc17fb0080-combined-ca-bundle\") pod \"barbican-keystone-listener-86bc9f54b4-6szxq\" (UID: \"e4dfee88-47ff-4e8b-9f46-60cc17fb0080\") " pod="openstack/barbican-keystone-listener-86bc9f54b4-6szxq"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.732651    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4dfee88-47ff-4e8b-9f46-60cc17fb0080-logs\") pod \"barbican-keystone-listener-86bc9f54b4-6szxq\" (UID: \"e4dfee88-47ff-4e8b-9f46-60cc17fb0080\") " pod="openstack/barbican-keystone-listener-86bc9f54b4-6szxq"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.734129    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc765c5c-7def-4230-b500-d6410c2da475-ovsdbserver-nb\") pod \"dnsmasq-dns-6dccc4d8b9-qqxcj\" (UID: \"fc765c5c-7def-4230-b500-d6410c2da475\") " pod="openstack/dnsmasq-dns-6dccc4d8b9-qqxcj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.734381    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f8c40f6-c8d3-4c8c-97eb-643d32774174-logs\") pod \"barbican-worker-54b9958865-vn9kj\" (UID: \"8f8c40f6-c8d3-4c8c-97eb-643d32774174\") " pod="openstack/barbican-worker-54b9958865-vn9kj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.743892    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4dfee88-47ff-4e8b-9f46-60cc17fb0080-combined-ca-bundle\") pod \"barbican-keystone-listener-86bc9f54b4-6szxq\" (UID: \"e4dfee88-47ff-4e8b-9f46-60cc17fb0080\") " pod="openstack/barbican-keystone-listener-86bc9f54b4-6szxq"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.746296    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc765c5c-7def-4230-b500-d6410c2da475-dns-swift-storage-0\") pod \"dnsmasq-dns-6dccc4d8b9-qqxcj\" (UID: \"fc765c5c-7def-4230-b500-d6410c2da475\") " pod="openstack/dnsmasq-dns-6dccc4d8b9-qqxcj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.747709    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc765c5c-7def-4230-b500-d6410c2da475-config\") pod \"dnsmasq-dns-6dccc4d8b9-qqxcj\" (UID: \"fc765c5c-7def-4230-b500-d6410c2da475\") " pod="openstack/dnsmasq-dns-6dccc4d8b9-qqxcj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.749118    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8c40f6-c8d3-4c8c-97eb-643d32774174-combined-ca-bundle\") pod \"barbican-worker-54b9958865-vn9kj\" (UID: \"8f8c40f6-c8d3-4c8c-97eb-643d32774174\") " pod="openstack/barbican-worker-54b9958865-vn9kj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.749717    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc765c5c-7def-4230-b500-d6410c2da475-ovsdbserver-sb\") pod \"dnsmasq-dns-6dccc4d8b9-qqxcj\" (UID: \"fc765c5c-7def-4230-b500-d6410c2da475\") " pod="openstack/dnsmasq-dns-6dccc4d8b9-qqxcj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.750197    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc765c5c-7def-4230-b500-d6410c2da475-dns-svc\") pod \"dnsmasq-dns-6dccc4d8b9-qqxcj\" (UID: \"fc765c5c-7def-4230-b500-d6410c2da475\") " pod="openstack/dnsmasq-dns-6dccc4d8b9-qqxcj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.751702    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4dfee88-47ff-4e8b-9f46-60cc17fb0080-config-data\") pod \"barbican-keystone-listener-86bc9f54b4-6szxq\" (UID: \"e4dfee88-47ff-4e8b-9f46-60cc17fb0080\") " pod="openstack/barbican-keystone-listener-86bc9f54b4-6szxq"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.757820    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f8c40f6-c8d3-4c8c-97eb-643d32774174-config-data-custom\") pod \"barbican-worker-54b9958865-vn9kj\" (UID: \"8f8c40f6-c8d3-4c8c-97eb-643d32774174\") " pod="openstack/barbican-worker-54b9958865-vn9kj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.761287    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f8c40f6-c8d3-4c8c-97eb-643d32774174-config-data\") pod \"barbican-worker-54b9958865-vn9kj\" (UID: \"8f8c40f6-c8d3-4c8c-97eb-643d32774174\") " pod="openstack/barbican-worker-54b9958865-vn9kj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.765268    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcckp\" (UniqueName: \"kubernetes.io/projected/e4dfee88-47ff-4e8b-9f46-60cc17fb0080-kube-api-access-jcckp\") pod \"barbican-keystone-listener-86bc9f54b4-6szxq\" (UID: \"e4dfee88-47ff-4e8b-9f46-60cc17fb0080\") " pod="openstack/barbican-keystone-listener-86bc9f54b4-6szxq"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.766874    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4dfee88-47ff-4e8b-9f46-60cc17fb0080-config-data-custom\") pod \"barbican-keystone-listener-86bc9f54b4-6szxq\" (UID: \"e4dfee88-47ff-4e8b-9f46-60cc17fb0080\") " pod="openstack/barbican-keystone-listener-86bc9f54b4-6szxq"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.779752    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9k6g\" (UniqueName: \"kubernetes.io/projected/fc765c5c-7def-4230-b500-d6410c2da475-kube-api-access-q9k6g\") pod \"dnsmasq-dns-6dccc4d8b9-qqxcj\" (UID: \"fc765c5c-7def-4230-b500-d6410c2da475\") " pod="openstack/dnsmasq-dns-6dccc4d8b9-qqxcj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.785204    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-66f7c676c8-wdfnw"]
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.785593    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-86bc9f54b4-6szxq"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.786749    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-66f7c676c8-wdfnw"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.797534    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-66f7c676c8-wdfnw"]
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.800849    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvpbl\" (UniqueName: \"kubernetes.io/projected/8f8c40f6-c8d3-4c8c-97eb-643d32774174-kube-api-access-rvpbl\") pod \"barbican-worker-54b9958865-vn9kj\" (UID: \"8f8c40f6-c8d3-4c8c-97eb-643d32774174\") " pod="openstack/barbican-worker-54b9958865-vn9kj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.801611    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.812700    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-54b9958865-vn9kj"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.936755    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/625a25c3-e585-4848-bbe1-0bdd4be731a9-config-data\") pod \"barbican-api-66f7c676c8-wdfnw\" (UID: \"625a25c3-e585-4848-bbe1-0bdd4be731a9\") " pod="openstack/barbican-api-66f7c676c8-wdfnw"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.936795    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/625a25c3-e585-4848-bbe1-0bdd4be731a9-logs\") pod \"barbican-api-66f7c676c8-wdfnw\" (UID: \"625a25c3-e585-4848-bbe1-0bdd4be731a9\") " pod="openstack/barbican-api-66f7c676c8-wdfnw"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.937762    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/625a25c3-e585-4848-bbe1-0bdd4be731a9-combined-ca-bundle\") pod \"barbican-api-66f7c676c8-wdfnw\" (UID: \"625a25c3-e585-4848-bbe1-0bdd4be731a9\") " pod="openstack/barbican-api-66f7c676c8-wdfnw"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.937815    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/625a25c3-e585-4848-bbe1-0bdd4be731a9-config-data-custom\") pod \"barbican-api-66f7c676c8-wdfnw\" (UID: \"625a25c3-e585-4848-bbe1-0bdd4be731a9\") " pod="openstack/barbican-api-66f7c676c8-wdfnw"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.938037    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfcrj\" (UniqueName: \"kubernetes.io/projected/625a25c3-e585-4848-bbe1-0bdd4be731a9-kube-api-access-nfcrj\") pod \"barbican-api-66f7c676c8-wdfnw\" (UID: \"625a25c3-e585-4848-bbe1-0bdd4be731a9\") " pod="openstack/barbican-api-66f7c676c8-wdfnw"
Mar 20 16:00:37 crc kubenswrapper[4730]: I0320 16:00:37.959773    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dccc4d8b9-qqxcj"
Mar 20 16:00:38 crc kubenswrapper[4730]: I0320 16:00:38.044949    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/625a25c3-e585-4848-bbe1-0bdd4be731a9-config-data\") pod \"barbican-api-66f7c676c8-wdfnw\" (UID: \"625a25c3-e585-4848-bbe1-0bdd4be731a9\") " pod="openstack/barbican-api-66f7c676c8-wdfnw"
Mar 20 16:00:38 crc kubenswrapper[4730]: I0320 16:00:38.045774    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/625a25c3-e585-4848-bbe1-0bdd4be731a9-logs\") pod \"barbican-api-66f7c676c8-wdfnw\" (UID: \"625a25c3-e585-4848-bbe1-0bdd4be731a9\") " pod="openstack/barbican-api-66f7c676c8-wdfnw"
Mar 20 16:00:38 crc kubenswrapper[4730]: I0320 16:00:38.045878    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/625a25c3-e585-4848-bbe1-0bdd4be731a9-combined-ca-bundle\") pod \"barbican-api-66f7c676c8-wdfnw\" (UID: \"625a25c3-e585-4848-bbe1-0bdd4be731a9\") " pod="openstack/barbican-api-66f7c676c8-wdfnw"
Mar 20 16:00:38 crc kubenswrapper[4730]: I0320 16:00:38.045901    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/625a25c3-e585-4848-bbe1-0bdd4be731a9-config-data-custom\") pod \"barbican-api-66f7c676c8-wdfnw\" (UID: \"625a25c3-e585-4848-bbe1-0bdd4be731a9\") " pod="openstack/barbican-api-66f7c676c8-wdfnw"
Mar 20 16:00:38 crc kubenswrapper[4730]: I0320 16:00:38.045950    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfcrj\" (UniqueName: \"kubernetes.io/projected/625a25c3-e585-4848-bbe1-0bdd4be731a9-kube-api-access-nfcrj\") pod \"barbican-api-66f7c676c8-wdfnw\" (UID: \"625a25c3-e585-4848-bbe1-0bdd4be731a9\") " pod="openstack/barbican-api-66f7c676c8-wdfnw"
Mar 20 16:00:38 crc kubenswrapper[4730]: I0320 16:00:38.052110    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/625a25c3-e585-4848-bbe1-0bdd4be731a9-logs\") pod \"barbican-api-66f7c676c8-wdfnw\" (UID: \"625a25c3-e585-4848-bbe1-0bdd4be731a9\") " pod="openstack/barbican-api-66f7c676c8-wdfnw"
Mar 20 16:00:38 crc kubenswrapper[4730]: I0320 16:00:38.053765    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/625a25c3-e585-4848-bbe1-0bdd4be731a9-config-data\") pod \"barbican-api-66f7c676c8-wdfnw\" (UID: \"625a25c3-e585-4848-bbe1-0bdd4be731a9\") " pod="openstack/barbican-api-66f7c676c8-wdfnw"
Mar 20 16:00:38 crc kubenswrapper[4730]: I0320 16:00:38.059481    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/625a25c3-e585-4848-bbe1-0bdd4be731a9-config-data-custom\") pod \"barbican-api-66f7c676c8-wdfnw\" (UID: \"625a25c3-e585-4848-bbe1-0bdd4be731a9\") " pod="openstack/barbican-api-66f7c676c8-wdfnw"
Mar 20 16:00:38 crc kubenswrapper[4730]: I0320 16:00:38.059704    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/625a25c3-e585-4848-bbe1-0bdd4be731a9-combined-ca-bundle\") pod \"barbican-api-66f7c676c8-wdfnw\" (UID: \"625a25c3-e585-4848-bbe1-0bdd4be731a9\") " pod="openstack/barbican-api-66f7c676c8-wdfnw"
Mar 20 16:00:38 crc kubenswrapper[4730]: I0320 16:00:38.062549    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfcrj\" (UniqueName: \"kubernetes.io/projected/625a25c3-e585-4848-bbe1-0bdd4be731a9-kube-api-access-nfcrj\") pod \"barbican-api-66f7c676c8-wdfnw\" (UID: \"625a25c3-e585-4848-bbe1-0bdd4be731a9\") " pod="openstack/barbican-api-66f7c676c8-wdfnw"
Mar 20 16:00:38 crc kubenswrapper[4730]: I0320 16:00:38.153858    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-66f7c676c8-wdfnw"
Mar 20 16:00:39 crc kubenswrapper[4730]: I0320 16:00:39.251280    4730 generic.go:334] "Generic (PLEG): container finished" podID="09f27249-61fb-4e13-9eb9-9b804f256d81" containerID="3f4c141955a3579b06be021435ce1c3642e2a9b4483a932d05648a4559764229" exitCode=0
Mar 20 16:00:39 crc kubenswrapper[4730]: I0320 16:00:39.251587    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-z9mtx" event={"ID":"09f27249-61fb-4e13-9eb9-9b804f256d81","Type":"ContainerDied","Data":"3f4c141955a3579b06be021435ce1c3642e2a9b4483a932d05648a4559764229"}
Mar 20 16:00:39 crc kubenswrapper[4730]: I0320 16:00:39.254018    4730 generic.go:334] "Generic (PLEG): container finished" podID="6fb4d42d-6cd9-480c-8ee0-1e168504a4cd" containerID="1763f714611816ce76b822616e2726ee2af2ec1d061896faecc0edc07186595f" exitCode=0
Mar 20 16:00:39 crc kubenswrapper[4730]: I0320 16:00:39.254061    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-hbplf" event={"ID":"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd","Type":"ContainerDied","Data":"1763f714611816ce76b822616e2726ee2af2ec1d061896faecc0edc07186595f"}
Mar 20 16:00:39 crc kubenswrapper[4730]: I0320 16:00:39.973825    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Mar 20 16:00:39 crc kubenswrapper[4730]: I0320 16:00:39.974520    4730 scope.go:117] "RemoveContainer" containerID="834cc7775c739ec615e04b1c22eba8f136c36cff5d344d3613f2797565551c85"
Mar 20 16:00:40 crc kubenswrapper[4730]: I0320 16:00:40.570715    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-86947bcbc8-94hl8"]
Mar 20 16:00:40 crc kubenswrapper[4730]: I0320 16:00:40.572471    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-86947bcbc8-94hl8"
Mar 20 16:00:40 crc kubenswrapper[4730]: I0320 16:00:40.587287    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc"
Mar 20 16:00:40 crc kubenswrapper[4730]: I0320 16:00:40.587867    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc"
Mar 20 16:00:40 crc kubenswrapper[4730]: I0320 16:00:40.615242    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-86947bcbc8-94hl8"]
Mar 20 16:00:40 crc kubenswrapper[4730]: I0320 16:00:40.701228    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59460a49-c9fe-46c9-b898-d08234ca7cd3-logs\") pod \"barbican-api-86947bcbc8-94hl8\" (UID: \"59460a49-c9fe-46c9-b898-d08234ca7cd3\") " pod="openstack/barbican-api-86947bcbc8-94hl8"
Mar 20 16:00:40 crc kubenswrapper[4730]: I0320 16:00:40.701318    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59460a49-c9fe-46c9-b898-d08234ca7cd3-public-tls-certs\") pod \"barbican-api-86947bcbc8-94hl8\" (UID: \"59460a49-c9fe-46c9-b898-d08234ca7cd3\") " pod="openstack/barbican-api-86947bcbc8-94hl8"
Mar 20 16:00:40 crc kubenswrapper[4730]: I0320 16:00:40.701408    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59460a49-c9fe-46c9-b898-d08234ca7cd3-combined-ca-bundle\") pod \"barbican-api-86947bcbc8-94hl8\" (UID: \"59460a49-c9fe-46c9-b898-d08234ca7cd3\") " pod="openstack/barbican-api-86947bcbc8-94hl8"
Mar 20 16:00:40 crc kubenswrapper[4730]: I0320 16:00:40.701435    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59460a49-c9fe-46c9-b898-d08234ca7cd3-config-data\") pod \"barbican-api-86947bcbc8-94hl8\" (UID: \"59460a49-c9fe-46c9-b898-d08234ca7cd3\") " pod="openstack/barbican-api-86947bcbc8-94hl8"
Mar 20 16:00:40 crc kubenswrapper[4730]: I0320 16:00:40.701456    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nvg6\" (UniqueName: \"kubernetes.io/projected/59460a49-c9fe-46c9-b898-d08234ca7cd3-kube-api-access-8nvg6\") pod \"barbican-api-86947bcbc8-94hl8\" (UID: \"59460a49-c9fe-46c9-b898-d08234ca7cd3\") " pod="openstack/barbican-api-86947bcbc8-94hl8"
Mar 20 16:00:40 crc kubenswrapper[4730]: I0320 16:00:40.701473    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59460a49-c9fe-46c9-b898-d08234ca7cd3-config-data-custom\") pod \"barbican-api-86947bcbc8-94hl8\" (UID: \"59460a49-c9fe-46c9-b898-d08234ca7cd3\") " pod="openstack/barbican-api-86947bcbc8-94hl8"
Mar 20 16:00:40 crc kubenswrapper[4730]: I0320 16:00:40.701510    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59460a49-c9fe-46c9-b898-d08234ca7cd3-internal-tls-certs\") pod \"barbican-api-86947bcbc8-94hl8\" (UID: \"59460a49-c9fe-46c9-b898-d08234ca7cd3\") " pod="openstack/barbican-api-86947bcbc8-94hl8"
Mar 20 16:00:40 crc kubenswrapper[4730]: I0320 16:00:40.803400    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59460a49-c9fe-46c9-b898-d08234ca7cd3-config-data-custom\") pod \"barbican-api-86947bcbc8-94hl8\" (UID: \"59460a49-c9fe-46c9-b898-d08234ca7cd3\") " pod="openstack/barbican-api-86947bcbc8-94hl8"
Mar 20 16:00:40 crc kubenswrapper[4730]: I0320 16:00:40.803470    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59460a49-c9fe-46c9-b898-d08234ca7cd3-internal-tls-certs\") pod \"barbican-api-86947bcbc8-94hl8\" (UID: \"59460a49-c9fe-46c9-b898-d08234ca7cd3\") " pod="openstack/barbican-api-86947bcbc8-94hl8"
Mar 20 16:00:40 crc kubenswrapper[4730]: I0320 16:00:40.803502    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59460a49-c9fe-46c9-b898-d08234ca7cd3-logs\") pod \"barbican-api-86947bcbc8-94hl8\" (UID: \"59460a49-c9fe-46c9-b898-d08234ca7cd3\") " pod="openstack/barbican-api-86947bcbc8-94hl8"
Mar 20 16:00:40 crc kubenswrapper[4730]: I0320 16:00:40.803547    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59460a49-c9fe-46c9-b898-d08234ca7cd3-public-tls-certs\") pod \"barbican-api-86947bcbc8-94hl8\" (UID: \"59460a49-c9fe-46c9-b898-d08234ca7cd3\") " pod="openstack/barbican-api-86947bcbc8-94hl8"
Mar 20 16:00:40 crc kubenswrapper[4730]: I0320 16:00:40.803658    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59460a49-c9fe-46c9-b898-d08234ca7cd3-combined-ca-bundle\") pod \"barbican-api-86947bcbc8-94hl8\" (UID: \"59460a49-c9fe-46c9-b898-d08234ca7cd3\") " pod="openstack/barbican-api-86947bcbc8-94hl8"
Mar 20 16:00:40 crc kubenswrapper[4730]: I0320 16:00:40.803685    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59460a49-c9fe-46c9-b898-d08234ca7cd3-config-data\") pod \"barbican-api-86947bcbc8-94hl8\" (UID: \"59460a49-c9fe-46c9-b898-d08234ca7cd3\") " pod="openstack/barbican-api-86947bcbc8-94hl8"
Mar 20 16:00:40 crc kubenswrapper[4730]: I0320 16:00:40.803707    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nvg6\" (UniqueName: \"kubernetes.io/projected/59460a49-c9fe-46c9-b898-d08234ca7cd3-kube-api-access-8nvg6\") pod \"barbican-api-86947bcbc8-94hl8\" (UID: \"59460a49-c9fe-46c9-b898-d08234ca7cd3\") " pod="openstack/barbican-api-86947bcbc8-94hl8"
Mar 20 16:00:40 crc kubenswrapper[4730]: I0320 16:00:40.809195    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59460a49-c9fe-46c9-b898-d08234ca7cd3-logs\") pod \"barbican-api-86947bcbc8-94hl8\" (UID: \"59460a49-c9fe-46c9-b898-d08234ca7cd3\") " pod="openstack/barbican-api-86947bcbc8-94hl8"
Mar 20 16:00:40 crc kubenswrapper[4730]: I0320 16:00:40.811136    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59460a49-c9fe-46c9-b898-d08234ca7cd3-combined-ca-bundle\") pod \"barbican-api-86947bcbc8-94hl8\" (UID: \"59460a49-c9fe-46c9-b898-d08234ca7cd3\") " pod="openstack/barbican-api-86947bcbc8-94hl8"
Mar 20 16:00:40 crc kubenswrapper[4730]: I0320 16:00:40.812901    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59460a49-c9fe-46c9-b898-d08234ca7cd3-public-tls-certs\") pod \"barbican-api-86947bcbc8-94hl8\" (UID: \"59460a49-c9fe-46c9-b898-d08234ca7cd3\") " pod="openstack/barbican-api-86947bcbc8-94hl8"
Mar 20 16:00:40 crc kubenswrapper[4730]: I0320 16:00:40.813386    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59460a49-c9fe-46c9-b898-d08234ca7cd3-internal-tls-certs\") pod \"barbican-api-86947bcbc8-94hl8\" (UID: \"59460a49-c9fe-46c9-b898-d08234ca7cd3\") " pod="openstack/barbican-api-86947bcbc8-94hl8"
Mar 20 16:00:40 crc kubenswrapper[4730]: I0320 16:00:40.819817    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59460a49-c9fe-46c9-b898-d08234ca7cd3-config-data-custom\") pod \"barbican-api-86947bcbc8-94hl8\" (UID: \"59460a49-c9fe-46c9-b898-d08234ca7cd3\") " pod="openstack/barbican-api-86947bcbc8-94hl8"
Mar 20 16:00:40 crc kubenswrapper[4730]: I0320 16:00:40.830261    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59460a49-c9fe-46c9-b898-d08234ca7cd3-config-data\") pod \"barbican-api-86947bcbc8-94hl8\" (UID: \"59460a49-c9fe-46c9-b898-d08234ca7cd3\") " pod="openstack/barbican-api-86947bcbc8-94hl8"
Mar 20 16:00:40 crc kubenswrapper[4730]: I0320 16:00:40.839017    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nvg6\" (UniqueName: \"kubernetes.io/projected/59460a49-c9fe-46c9-b898-d08234ca7cd3-kube-api-access-8nvg6\") pod \"barbican-api-86947bcbc8-94hl8\" (UID: \"59460a49-c9fe-46c9-b898-d08234ca7cd3\") " pod="openstack/barbican-api-86947bcbc8-94hl8"
Mar 20 16:00:40 crc kubenswrapper[4730]: I0320 16:00:40.915084    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-86947bcbc8-94hl8"
Mar 20 16:00:42 crc kubenswrapper[4730]: I0320 16:00:42.880515    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:00:42 crc kubenswrapper[4730]: I0320 16:00:42.881172    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:00:43 crc kubenswrapper[4730]: I0320 16:00:43.821818    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-z9mtx"
Mar 20 16:00:43 crc kubenswrapper[4730]: I0320 16:00:43.831294    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-hbplf"
Mar 20 16:00:43 crc kubenswrapper[4730]: I0320 16:00:43.963797    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-scripts\") pod \"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd\" (UID: \"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd\") "
Mar 20 16:00:43 crc kubenswrapper[4730]: I0320 16:00:43.963913    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09f27249-61fb-4e13-9eb9-9b804f256d81-combined-ca-bundle\") pod \"09f27249-61fb-4e13-9eb9-9b804f256d81\" (UID: \"09f27249-61fb-4e13-9eb9-9b804f256d81\") "
Mar 20 16:00:43 crc kubenswrapper[4730]: I0320 16:00:43.963940    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24qfr\" (UniqueName: \"kubernetes.io/projected/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-kube-api-access-24qfr\") pod \"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd\" (UID: \"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd\") "
Mar 20 16:00:43 crc kubenswrapper[4730]: I0320 16:00:43.964036    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-config-data\") pod \"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd\" (UID: \"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd\") "
Mar 20 16:00:43 crc kubenswrapper[4730]: I0320 16:00:43.964059    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-db-sync-config-data\") pod \"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd\" (UID: \"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd\") "
Mar 20 16:00:43 crc kubenswrapper[4730]: I0320 16:00:43.964078    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7sfc\" (UniqueName: \"kubernetes.io/projected/09f27249-61fb-4e13-9eb9-9b804f256d81-kube-api-access-t7sfc\") pod \"09f27249-61fb-4e13-9eb9-9b804f256d81\" (UID: \"09f27249-61fb-4e13-9eb9-9b804f256d81\") "
Mar 20 16:00:43 crc kubenswrapper[4730]: I0320 16:00:43.964126    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-etc-machine-id\") pod \"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd\" (UID: \"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd\") "
Mar 20 16:00:43 crc kubenswrapper[4730]: I0320 16:00:43.964162    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/09f27249-61fb-4e13-9eb9-9b804f256d81-config\") pod \"09f27249-61fb-4e13-9eb9-9b804f256d81\" (UID: \"09f27249-61fb-4e13-9eb9-9b804f256d81\") "
Mar 20 16:00:43 crc kubenswrapper[4730]: I0320 16:00:43.964222    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-combined-ca-bundle\") pod \"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd\" (UID: \"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd\") "
Mar 20 16:00:43 crc kubenswrapper[4730]: I0320 16:00:43.982188    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09f27249-61fb-4e13-9eb9-9b804f256d81-kube-api-access-t7sfc" (OuterVolumeSpecName: "kube-api-access-t7sfc") pod "09f27249-61fb-4e13-9eb9-9b804f256d81" (UID: "09f27249-61fb-4e13-9eb9-9b804f256d81"). InnerVolumeSpecName "kube-api-access-t7sfc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:00:43 crc kubenswrapper[4730]: I0320 16:00:43.982340    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6fb4d42d-6cd9-480c-8ee0-1e168504a4cd" (UID: "6fb4d42d-6cd9-480c-8ee0-1e168504a4cd"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 16:00:43 crc kubenswrapper[4730]: I0320 16:00:43.986430    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-kube-api-access-24qfr" (OuterVolumeSpecName: "kube-api-access-24qfr") pod "6fb4d42d-6cd9-480c-8ee0-1e168504a4cd" (UID: "6fb4d42d-6cd9-480c-8ee0-1e168504a4cd"). InnerVolumeSpecName "kube-api-access-24qfr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:00:44 crc kubenswrapper[4730]: I0320 16:00:44.008629    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6fb4d42d-6cd9-480c-8ee0-1e168504a4cd" (UID: "6fb4d42d-6cd9-480c-8ee0-1e168504a4cd"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:44 crc kubenswrapper[4730]: I0320 16:00:44.010360    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-scripts" (OuterVolumeSpecName: "scripts") pod "6fb4d42d-6cd9-480c-8ee0-1e168504a4cd" (UID: "6fb4d42d-6cd9-480c-8ee0-1e168504a4cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:44 crc kubenswrapper[4730]: I0320 16:00:44.025516    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6fb4d42d-6cd9-480c-8ee0-1e168504a4cd" (UID: "6fb4d42d-6cd9-480c-8ee0-1e168504a4cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:44 crc kubenswrapper[4730]: I0320 16:00:44.062409    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09f27249-61fb-4e13-9eb9-9b804f256d81-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09f27249-61fb-4e13-9eb9-9b804f256d81" (UID: "09f27249-61fb-4e13-9eb9-9b804f256d81"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:44 crc kubenswrapper[4730]: I0320 16:00:44.067122    4730 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:44 crc kubenswrapper[4730]: I0320 16:00:44.067149    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:44 crc kubenswrapper[4730]: I0320 16:00:44.067158    4730 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:44 crc kubenswrapper[4730]: I0320 16:00:44.067166    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09f27249-61fb-4e13-9eb9-9b804f256d81-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:44 crc kubenswrapper[4730]: I0320 16:00:44.067176    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24qfr\" (UniqueName: \"kubernetes.io/projected/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-kube-api-access-24qfr\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:44 crc kubenswrapper[4730]: I0320 16:00:44.067185    4730 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:44 crc kubenswrapper[4730]: I0320 16:00:44.067193    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7sfc\" (UniqueName: \"kubernetes.io/projected/09f27249-61fb-4e13-9eb9-9b804f256d81-kube-api-access-t7sfc\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:44 crc kubenswrapper[4730]: I0320 16:00:44.077408    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09f27249-61fb-4e13-9eb9-9b804f256d81-config" (OuterVolumeSpecName: "config") pod "09f27249-61fb-4e13-9eb9-9b804f256d81" (UID: "09f27249-61fb-4e13-9eb9-9b804f256d81"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:44 crc kubenswrapper[4730]: I0320 16:00:44.102498    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-config-data" (OuterVolumeSpecName: "config-data") pod "6fb4d42d-6cd9-480c-8ee0-1e168504a4cd" (UID: "6fb4d42d-6cd9-480c-8ee0-1e168504a4cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:44 crc kubenswrapper[4730]: I0320 16:00:44.171482    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:44 crc kubenswrapper[4730]: I0320 16:00:44.171767    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/09f27249-61fb-4e13-9eb9-9b804f256d81-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:44 crc kubenswrapper[4730]: I0320 16:00:44.304974    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-hbplf" event={"ID":"6fb4d42d-6cd9-480c-8ee0-1e168504a4cd","Type":"ContainerDied","Data":"ad1075e305b4a94d7393955deeb754baca56955fca19f6133e616c9e84808c7e"}
Mar 20 16:00:44 crc kubenswrapper[4730]: I0320 16:00:44.305049    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad1075e305b4a94d7393955deeb754baca56955fca19f6133e616c9e84808c7e"
Mar 20 16:00:44 crc kubenswrapper[4730]: I0320 16:00:44.304999    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-hbplf"
Mar 20 16:00:44 crc kubenswrapper[4730]: I0320 16:00:44.317967    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-z9mtx" event={"ID":"09f27249-61fb-4e13-9eb9-9b804f256d81","Type":"ContainerDied","Data":"a168269d42bdf34c043574c3c59fe418fbfcc2023ca3015cc5326a7e3f76f715"}
Mar 20 16:00:44 crc kubenswrapper[4730]: I0320 16:00:44.318003    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a168269d42bdf34c043574c3c59fe418fbfcc2023ca3015cc5326a7e3f76f715"
Mar 20 16:00:44 crc kubenswrapper[4730]: I0320 16:00:44.318205    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-z9mtx"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.165154    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dccc4d8b9-qqxcj"]
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.207792    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 20 16:00:45 crc kubenswrapper[4730]: E0320 16:00:45.208468    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09f27249-61fb-4e13-9eb9-9b804f256d81" containerName="neutron-db-sync"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.208489    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="09f27249-61fb-4e13-9eb9-9b804f256d81" containerName="neutron-db-sync"
Mar 20 16:00:45 crc kubenswrapper[4730]: E0320 16:00:45.208506    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fb4d42d-6cd9-480c-8ee0-1e168504a4cd" containerName="cinder-db-sync"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.208513    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fb4d42d-6cd9-480c-8ee0-1e168504a4cd" containerName="cinder-db-sync"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.208745    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fb4d42d-6cd9-480c-8ee0-1e168504a4cd" containerName="cinder-db-sync"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.208778    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="09f27249-61fb-4e13-9eb9-9b804f256d81" containerName="neutron-db-sync"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.209895    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.221500    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.221847    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.222019    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-bg7s8"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.222330    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.225636    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.239952    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8448dbfc69-4229t"]
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.241985    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8448dbfc69-4229t"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.279606    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8448dbfc69-4229t"]
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.295291    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-ovsdbserver-nb\") pod \"dnsmasq-dns-8448dbfc69-4229t\" (UID: \"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d\") " pod="openstack/dnsmasq-dns-8448dbfc69-4229t"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.295375    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/211c7770-64fe-4943-becb-bc02113fd867-scripts\") pod \"cinder-scheduler-0\" (UID: \"211c7770-64fe-4943-becb-bc02113fd867\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.295420    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-dns-swift-storage-0\") pod \"dnsmasq-dns-8448dbfc69-4229t\" (UID: \"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d\") " pod="openstack/dnsmasq-dns-8448dbfc69-4229t"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.295451    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/211c7770-64fe-4943-becb-bc02113fd867-config-data\") pod \"cinder-scheduler-0\" (UID: \"211c7770-64fe-4943-becb-bc02113fd867\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.295479    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/211c7770-64fe-4943-becb-bc02113fd867-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"211c7770-64fe-4943-becb-bc02113fd867\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.295508    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srhzx\" (UniqueName: \"kubernetes.io/projected/211c7770-64fe-4943-becb-bc02113fd867-kube-api-access-srhzx\") pod \"cinder-scheduler-0\" (UID: \"211c7770-64fe-4943-becb-bc02113fd867\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.295534    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9vz6\" (UniqueName: \"kubernetes.io/projected/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-kube-api-access-d9vz6\") pod \"dnsmasq-dns-8448dbfc69-4229t\" (UID: \"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d\") " pod="openstack/dnsmasq-dns-8448dbfc69-4229t"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.295560    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-ovsdbserver-sb\") pod \"dnsmasq-dns-8448dbfc69-4229t\" (UID: \"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d\") " pod="openstack/dnsmasq-dns-8448dbfc69-4229t"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.295596    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/211c7770-64fe-4943-becb-bc02113fd867-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"211c7770-64fe-4943-becb-bc02113fd867\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.295621    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-dns-svc\") pod \"dnsmasq-dns-8448dbfc69-4229t\" (UID: \"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d\") " pod="openstack/dnsmasq-dns-8448dbfc69-4229t"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.295646    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-config\") pod \"dnsmasq-dns-8448dbfc69-4229t\" (UID: \"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d\") " pod="openstack/dnsmasq-dns-8448dbfc69-4229t"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.295705    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/211c7770-64fe-4943-becb-bc02113fd867-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"211c7770-64fe-4943-becb-bc02113fd867\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.425546    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-ovsdbserver-nb\") pod \"dnsmasq-dns-8448dbfc69-4229t\" (UID: \"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d\") " pod="openstack/dnsmasq-dns-8448dbfc69-4229t"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.425661    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/211c7770-64fe-4943-becb-bc02113fd867-scripts\") pod \"cinder-scheduler-0\" (UID: \"211c7770-64fe-4943-becb-bc02113fd867\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.425747    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-dns-swift-storage-0\") pod \"dnsmasq-dns-8448dbfc69-4229t\" (UID: \"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d\") " pod="openstack/dnsmasq-dns-8448dbfc69-4229t"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.425785    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/211c7770-64fe-4943-becb-bc02113fd867-config-data\") pod \"cinder-scheduler-0\" (UID: \"211c7770-64fe-4943-becb-bc02113fd867\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.425821    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/211c7770-64fe-4943-becb-bc02113fd867-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"211c7770-64fe-4943-becb-bc02113fd867\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.425857    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srhzx\" (UniqueName: \"kubernetes.io/projected/211c7770-64fe-4943-becb-bc02113fd867-kube-api-access-srhzx\") pod \"cinder-scheduler-0\" (UID: \"211c7770-64fe-4943-becb-bc02113fd867\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.425886    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9vz6\" (UniqueName: \"kubernetes.io/projected/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-kube-api-access-d9vz6\") pod \"dnsmasq-dns-8448dbfc69-4229t\" (UID: \"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d\") " pod="openstack/dnsmasq-dns-8448dbfc69-4229t"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.425914    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-ovsdbserver-sb\") pod \"dnsmasq-dns-8448dbfc69-4229t\" (UID: \"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d\") " pod="openstack/dnsmasq-dns-8448dbfc69-4229t"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.425989    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/211c7770-64fe-4943-becb-bc02113fd867-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"211c7770-64fe-4943-becb-bc02113fd867\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.426027    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-dns-svc\") pod \"dnsmasq-dns-8448dbfc69-4229t\" (UID: \"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d\") " pod="openstack/dnsmasq-dns-8448dbfc69-4229t"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.426055    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-config\") pod \"dnsmasq-dns-8448dbfc69-4229t\" (UID: \"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d\") " pod="openstack/dnsmasq-dns-8448dbfc69-4229t"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.426147    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/211c7770-64fe-4943-becb-bc02113fd867-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"211c7770-64fe-4943-becb-bc02113fd867\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.431645    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/211c7770-64fe-4943-becb-bc02113fd867-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"211c7770-64fe-4943-becb-bc02113fd867\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.432966    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-ovsdbserver-nb\") pod \"dnsmasq-dns-8448dbfc69-4229t\" (UID: \"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d\") " pod="openstack/dnsmasq-dns-8448dbfc69-4229t"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.435195    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6858c8d8f6-k4smz"]
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.435563    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-dns-swift-storage-0\") pod \"dnsmasq-dns-8448dbfc69-4229t\" (UID: \"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d\") " pod="openstack/dnsmasq-dns-8448dbfc69-4229t"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.440267    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-ovsdbserver-sb\") pod \"dnsmasq-dns-8448dbfc69-4229t\" (UID: \"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d\") " pod="openstack/dnsmasq-dns-8448dbfc69-4229t"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.442440    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-dns-svc\") pod \"dnsmasq-dns-8448dbfc69-4229t\" (UID: \"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d\") " pod="openstack/dnsmasq-dns-8448dbfc69-4229t"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.444863    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-config\") pod \"dnsmasq-dns-8448dbfc69-4229t\" (UID: \"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d\") " pod="openstack/dnsmasq-dns-8448dbfc69-4229t"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.447790    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6858c8d8f6-k4smz"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.452553    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.452746    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.452862    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.453037    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-c2wgv"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.498605    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/211c7770-64fe-4943-becb-bc02113fd867-config-data\") pod \"cinder-scheduler-0\" (UID: \"211c7770-64fe-4943-becb-bc02113fd867\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.499867    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/211c7770-64fe-4943-becb-bc02113fd867-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"211c7770-64fe-4943-becb-bc02113fd867\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.500540    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srhzx\" (UniqueName: \"kubernetes.io/projected/211c7770-64fe-4943-becb-bc02113fd867-kube-api-access-srhzx\") pod \"cinder-scheduler-0\" (UID: \"211c7770-64fe-4943-becb-bc02113fd867\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.501300    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8448dbfc69-4229t"]
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.501412    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/211c7770-64fe-4943-becb-bc02113fd867-scripts\") pod \"cinder-scheduler-0\" (UID: \"211c7770-64fe-4943-becb-bc02113fd867\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.502802    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/211c7770-64fe-4943-becb-bc02113fd867-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"211c7770-64fe-4943-becb-bc02113fd867\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.513609    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9vz6\" (UniqueName: \"kubernetes.io/projected/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-kube-api-access-d9vz6\") pod \"dnsmasq-dns-8448dbfc69-4229t\" (UID: \"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d\") " pod="openstack/dnsmasq-dns-8448dbfc69-4229t"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.513764    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6858c8d8f6-k4smz"]
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.524202    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76cb94d47c-txmh6"]
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.526110    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76cb94d47c-txmh6"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.566001    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8448dbfc69-4229t"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.574722    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76cb94d47c-txmh6"]
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.574766    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.580198    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.588639    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.588901    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.658910    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff335b2a-909a-4c39-a045-2267c73ac8b2-dns-svc\") pod \"dnsmasq-dns-76cb94d47c-txmh6\" (UID: \"ff335b2a-909a-4c39-a045-2267c73ac8b2\") " pod="openstack/dnsmasq-dns-76cb94d47c-txmh6"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.658978    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff335b2a-909a-4c39-a045-2267c73ac8b2-ovsdbserver-sb\") pod \"dnsmasq-dns-76cb94d47c-txmh6\" (UID: \"ff335b2a-909a-4c39-a045-2267c73ac8b2\") " pod="openstack/dnsmasq-dns-76cb94d47c-txmh6"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.659000    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7njgl\" (UniqueName: \"kubernetes.io/projected/ff335b2a-909a-4c39-a045-2267c73ac8b2-kube-api-access-7njgl\") pod \"dnsmasq-dns-76cb94d47c-txmh6\" (UID: \"ff335b2a-909a-4c39-a045-2267c73ac8b2\") " pod="openstack/dnsmasq-dns-76cb94d47c-txmh6"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.659141    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ed9fad7-284f-40b4-9c3b-7a213aff010a-combined-ca-bundle\") pod \"neutron-6858c8d8f6-k4smz\" (UID: \"4ed9fad7-284f-40b4-9c3b-7a213aff010a\") " pod="openstack/neutron-6858c8d8f6-k4smz"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.660942    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ed9fad7-284f-40b4-9c3b-7a213aff010a-ovndb-tls-certs\") pod \"neutron-6858c8d8f6-k4smz\" (UID: \"4ed9fad7-284f-40b4-9c3b-7a213aff010a\") " pod="openstack/neutron-6858c8d8f6-k4smz"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.660990    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4ed9fad7-284f-40b4-9c3b-7a213aff010a-httpd-config\") pod \"neutron-6858c8d8f6-k4smz\" (UID: \"4ed9fad7-284f-40b4-9c3b-7a213aff010a\") " pod="openstack/neutron-6858c8d8f6-k4smz"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.661078    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff335b2a-909a-4c39-a045-2267c73ac8b2-ovsdbserver-nb\") pod \"dnsmasq-dns-76cb94d47c-txmh6\" (UID: \"ff335b2a-909a-4c39-a045-2267c73ac8b2\") " pod="openstack/dnsmasq-dns-76cb94d47c-txmh6"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.661115    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff335b2a-909a-4c39-a045-2267c73ac8b2-config\") pod \"dnsmasq-dns-76cb94d47c-txmh6\" (UID: \"ff335b2a-909a-4c39-a045-2267c73ac8b2\") " pod="openstack/dnsmasq-dns-76cb94d47c-txmh6"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.661194    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff335b2a-909a-4c39-a045-2267c73ac8b2-dns-swift-storage-0\") pod \"dnsmasq-dns-76cb94d47c-txmh6\" (UID: \"ff335b2a-909a-4c39-a045-2267c73ac8b2\") " pod="openstack/dnsmasq-dns-76cb94d47c-txmh6"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.661264    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97ld6\" (UniqueName: \"kubernetes.io/projected/4ed9fad7-284f-40b4-9c3b-7a213aff010a-kube-api-access-97ld6\") pod \"neutron-6858c8d8f6-k4smz\" (UID: \"4ed9fad7-284f-40b4-9c3b-7a213aff010a\") " pod="openstack/neutron-6858c8d8f6-k4smz"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.661322    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4ed9fad7-284f-40b4-9c3b-7a213aff010a-config\") pod \"neutron-6858c8d8f6-k4smz\" (UID: \"4ed9fad7-284f-40b4-9c3b-7a213aff010a\") " pod="openstack/neutron-6858c8d8f6-k4smz"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.707938    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.763852    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/553d73e1-f14e-4379-b630-38e440eedb73-etc-machine-id\") pod \"cinder-api-0\" (UID: \"553d73e1-f14e-4379-b630-38e440eedb73\") " pod="openstack/cinder-api-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.763914    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff335b2a-909a-4c39-a045-2267c73ac8b2-ovsdbserver-nb\") pod \"dnsmasq-dns-76cb94d47c-txmh6\" (UID: \"ff335b2a-909a-4c39-a045-2267c73ac8b2\") " pod="openstack/dnsmasq-dns-76cb94d47c-txmh6"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.763940    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff335b2a-909a-4c39-a045-2267c73ac8b2-config\") pod \"dnsmasq-dns-76cb94d47c-txmh6\" (UID: \"ff335b2a-909a-4c39-a045-2267c73ac8b2\") " pod="openstack/dnsmasq-dns-76cb94d47c-txmh6"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.763957    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/553d73e1-f14e-4379-b630-38e440eedb73-config-data\") pod \"cinder-api-0\" (UID: \"553d73e1-f14e-4379-b630-38e440eedb73\") " pod="openstack/cinder-api-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.763982    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/553d73e1-f14e-4379-b630-38e440eedb73-logs\") pod \"cinder-api-0\" (UID: \"553d73e1-f14e-4379-b630-38e440eedb73\") " pod="openstack/cinder-api-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.764013    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff335b2a-909a-4c39-a045-2267c73ac8b2-dns-swift-storage-0\") pod \"dnsmasq-dns-76cb94d47c-txmh6\" (UID: \"ff335b2a-909a-4c39-a045-2267c73ac8b2\") " pod="openstack/dnsmasq-dns-76cb94d47c-txmh6"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.764028    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/553d73e1-f14e-4379-b630-38e440eedb73-config-data-custom\") pod \"cinder-api-0\" (UID: \"553d73e1-f14e-4379-b630-38e440eedb73\") " pod="openstack/cinder-api-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.764059    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97ld6\" (UniqueName: \"kubernetes.io/projected/4ed9fad7-284f-40b4-9c3b-7a213aff010a-kube-api-access-97ld6\") pod \"neutron-6858c8d8f6-k4smz\" (UID: \"4ed9fad7-284f-40b4-9c3b-7a213aff010a\") " pod="openstack/neutron-6858c8d8f6-k4smz"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.764085    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4ed9fad7-284f-40b4-9c3b-7a213aff010a-config\") pod \"neutron-6858c8d8f6-k4smz\" (UID: \"4ed9fad7-284f-40b4-9c3b-7a213aff010a\") " pod="openstack/neutron-6858c8d8f6-k4smz"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.764101    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/553d73e1-f14e-4379-b630-38e440eedb73-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"553d73e1-f14e-4379-b630-38e440eedb73\") " pod="openstack/cinder-api-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.764120    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/553d73e1-f14e-4379-b630-38e440eedb73-scripts\") pod \"cinder-api-0\" (UID: \"553d73e1-f14e-4379-b630-38e440eedb73\") " pod="openstack/cinder-api-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.764167    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff335b2a-909a-4c39-a045-2267c73ac8b2-dns-svc\") pod \"dnsmasq-dns-76cb94d47c-txmh6\" (UID: \"ff335b2a-909a-4c39-a045-2267c73ac8b2\") " pod="openstack/dnsmasq-dns-76cb94d47c-txmh6"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.764198    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff335b2a-909a-4c39-a045-2267c73ac8b2-ovsdbserver-sb\") pod \"dnsmasq-dns-76cb94d47c-txmh6\" (UID: \"ff335b2a-909a-4c39-a045-2267c73ac8b2\") " pod="openstack/dnsmasq-dns-76cb94d47c-txmh6"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.764215    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7njgl\" (UniqueName: \"kubernetes.io/projected/ff335b2a-909a-4c39-a045-2267c73ac8b2-kube-api-access-7njgl\") pod \"dnsmasq-dns-76cb94d47c-txmh6\" (UID: \"ff335b2a-909a-4c39-a045-2267c73ac8b2\") " pod="openstack/dnsmasq-dns-76cb94d47c-txmh6"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.764231    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ed9fad7-284f-40b4-9c3b-7a213aff010a-combined-ca-bundle\") pod \"neutron-6858c8d8f6-k4smz\" (UID: \"4ed9fad7-284f-40b4-9c3b-7a213aff010a\") " pod="openstack/neutron-6858c8d8f6-k4smz"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.764273    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptrl6\" (UniqueName: \"kubernetes.io/projected/553d73e1-f14e-4379-b630-38e440eedb73-kube-api-access-ptrl6\") pod \"cinder-api-0\" (UID: \"553d73e1-f14e-4379-b630-38e440eedb73\") " pod="openstack/cinder-api-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.764290    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ed9fad7-284f-40b4-9c3b-7a213aff010a-ovndb-tls-certs\") pod \"neutron-6858c8d8f6-k4smz\" (UID: \"4ed9fad7-284f-40b4-9c3b-7a213aff010a\") " pod="openstack/neutron-6858c8d8f6-k4smz"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.764307    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4ed9fad7-284f-40b4-9c3b-7a213aff010a-httpd-config\") pod \"neutron-6858c8d8f6-k4smz\" (UID: \"4ed9fad7-284f-40b4-9c3b-7a213aff010a\") " pod="openstack/neutron-6858c8d8f6-k4smz"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.765709    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff335b2a-909a-4c39-a045-2267c73ac8b2-dns-swift-storage-0\") pod \"dnsmasq-dns-76cb94d47c-txmh6\" (UID: \"ff335b2a-909a-4c39-a045-2267c73ac8b2\") " pod="openstack/dnsmasq-dns-76cb94d47c-txmh6"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.766074    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff335b2a-909a-4c39-a045-2267c73ac8b2-dns-svc\") pod \"dnsmasq-dns-76cb94d47c-txmh6\" (UID: \"ff335b2a-909a-4c39-a045-2267c73ac8b2\") " pod="openstack/dnsmasq-dns-76cb94d47c-txmh6"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.766973    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff335b2a-909a-4c39-a045-2267c73ac8b2-ovsdbserver-nb\") pod \"dnsmasq-dns-76cb94d47c-txmh6\" (UID: \"ff335b2a-909a-4c39-a045-2267c73ac8b2\") " pod="openstack/dnsmasq-dns-76cb94d47c-txmh6"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.767632    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff335b2a-909a-4c39-a045-2267c73ac8b2-ovsdbserver-sb\") pod \"dnsmasq-dns-76cb94d47c-txmh6\" (UID: \"ff335b2a-909a-4c39-a045-2267c73ac8b2\") " pod="openstack/dnsmasq-dns-76cb94d47c-txmh6"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.770315    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4ed9fad7-284f-40b4-9c3b-7a213aff010a-config\") pod \"neutron-6858c8d8f6-k4smz\" (UID: \"4ed9fad7-284f-40b4-9c3b-7a213aff010a\") " pod="openstack/neutron-6858c8d8f6-k4smz"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.771927    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ed9fad7-284f-40b4-9c3b-7a213aff010a-ovndb-tls-certs\") pod \"neutron-6858c8d8f6-k4smz\" (UID: \"4ed9fad7-284f-40b4-9c3b-7a213aff010a\") " pod="openstack/neutron-6858c8d8f6-k4smz"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.772438    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff335b2a-909a-4c39-a045-2267c73ac8b2-config\") pod \"dnsmasq-dns-76cb94d47c-txmh6\" (UID: \"ff335b2a-909a-4c39-a045-2267c73ac8b2\") " pod="openstack/dnsmasq-dns-76cb94d47c-txmh6"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.774788    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4ed9fad7-284f-40b4-9c3b-7a213aff010a-httpd-config\") pod \"neutron-6858c8d8f6-k4smz\" (UID: \"4ed9fad7-284f-40b4-9c3b-7a213aff010a\") " pod="openstack/neutron-6858c8d8f6-k4smz"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.779030    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ed9fad7-284f-40b4-9c3b-7a213aff010a-combined-ca-bundle\") pod \"neutron-6858c8d8f6-k4smz\" (UID: \"4ed9fad7-284f-40b4-9c3b-7a213aff010a\") " pod="openstack/neutron-6858c8d8f6-k4smz"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.788735    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7njgl\" (UniqueName: \"kubernetes.io/projected/ff335b2a-909a-4c39-a045-2267c73ac8b2-kube-api-access-7njgl\") pod \"dnsmasq-dns-76cb94d47c-txmh6\" (UID: \"ff335b2a-909a-4c39-a045-2267c73ac8b2\") " pod="openstack/dnsmasq-dns-76cb94d47c-txmh6"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.788931    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97ld6\" (UniqueName: \"kubernetes.io/projected/4ed9fad7-284f-40b4-9c3b-7a213aff010a-kube-api-access-97ld6\") pod \"neutron-6858c8d8f6-k4smz\" (UID: \"4ed9fad7-284f-40b4-9c3b-7a213aff010a\") " pod="openstack/neutron-6858c8d8f6-k4smz"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.867924    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/553d73e1-f14e-4379-b630-38e440eedb73-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"553d73e1-f14e-4379-b630-38e440eedb73\") " pod="openstack/cinder-api-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.869049    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/553d73e1-f14e-4379-b630-38e440eedb73-scripts\") pod \"cinder-api-0\" (UID: \"553d73e1-f14e-4379-b630-38e440eedb73\") " pod="openstack/cinder-api-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.869211    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptrl6\" (UniqueName: \"kubernetes.io/projected/553d73e1-f14e-4379-b630-38e440eedb73-kube-api-access-ptrl6\") pod \"cinder-api-0\" (UID: \"553d73e1-f14e-4379-b630-38e440eedb73\") " pod="openstack/cinder-api-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.869660    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/553d73e1-f14e-4379-b630-38e440eedb73-etc-machine-id\") pod \"cinder-api-0\" (UID: \"553d73e1-f14e-4379-b630-38e440eedb73\") " pod="openstack/cinder-api-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.869740    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/553d73e1-f14e-4379-b630-38e440eedb73-config-data\") pod \"cinder-api-0\" (UID: \"553d73e1-f14e-4379-b630-38e440eedb73\") " pod="openstack/cinder-api-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.869774    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/553d73e1-f14e-4379-b630-38e440eedb73-logs\") pod \"cinder-api-0\" (UID: \"553d73e1-f14e-4379-b630-38e440eedb73\") " pod="openstack/cinder-api-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.869821    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/553d73e1-f14e-4379-b630-38e440eedb73-config-data-custom\") pod \"cinder-api-0\" (UID: \"553d73e1-f14e-4379-b630-38e440eedb73\") " pod="openstack/cinder-api-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.870166    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/553d73e1-f14e-4379-b630-38e440eedb73-etc-machine-id\") pod \"cinder-api-0\" (UID: \"553d73e1-f14e-4379-b630-38e440eedb73\") " pod="openstack/cinder-api-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.875926    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/553d73e1-f14e-4379-b630-38e440eedb73-logs\") pod \"cinder-api-0\" (UID: \"553d73e1-f14e-4379-b630-38e440eedb73\") " pod="openstack/cinder-api-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.919311    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6858c8d8f6-k4smz"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.923721    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/553d73e1-f14e-4379-b630-38e440eedb73-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"553d73e1-f14e-4379-b630-38e440eedb73\") " pod="openstack/cinder-api-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.923850    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/553d73e1-f14e-4379-b630-38e440eedb73-config-data\") pod \"cinder-api-0\" (UID: \"553d73e1-f14e-4379-b630-38e440eedb73\") " pod="openstack/cinder-api-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.924037    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/553d73e1-f14e-4379-b630-38e440eedb73-scripts\") pod \"cinder-api-0\" (UID: \"553d73e1-f14e-4379-b630-38e440eedb73\") " pod="openstack/cinder-api-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.924437    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/553d73e1-f14e-4379-b630-38e440eedb73-config-data-custom\") pod \"cinder-api-0\" (UID: \"553d73e1-f14e-4379-b630-38e440eedb73\") " pod="openstack/cinder-api-0"
Mar 20 16:00:45 crc kubenswrapper[4730]: I0320 16:00:45.932779    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptrl6\" (UniqueName: \"kubernetes.io/projected/553d73e1-f14e-4379-b630-38e440eedb73-kube-api-access-ptrl6\") pod \"cinder-api-0\" (UID: \"553d73e1-f14e-4379-b630-38e440eedb73\") " pod="openstack/cinder-api-0"
Mar 20 16:00:46 crc kubenswrapper[4730]: I0320 16:00:46.091059    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76cb94d47c-txmh6"
Mar 20 16:00:46 crc kubenswrapper[4730]: I0320 16:00:46.092069    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 20 16:00:46 crc kubenswrapper[4730]: I0320 16:00:46.168104    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-66f7c676c8-wdfnw"]
Mar 20 16:00:46 crc kubenswrapper[4730]: I0320 16:00:46.204713    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-86bc9f54b4-6szxq"]
Mar 20 16:00:46 crc kubenswrapper[4730]: W0320 16:00:46.227683    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod625a25c3_e585_4848_bbe1_0bdd4be731a9.slice/crio-4051b918d71242b8ca2e41190a654e8dbaba0ff078f4749ad9adbb01f4288ed1 WatchSource:0}: Error finding container 4051b918d71242b8ca2e41190a654e8dbaba0ff078f4749ad9adbb01f4288ed1: Status 404 returned error can't find the container with id 4051b918d71242b8ca2e41190a654e8dbaba0ff078f4749ad9adbb01f4288ed1
Mar 20 16:00:46 crc kubenswrapper[4730]: W0320 16:00:46.276312    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4dfee88_47ff_4e8b_9f46_60cc17fb0080.slice/crio-8090a449167023dbdca53532fb75f2e14e0fdd389666939cc9b4e5a2d48a7c02 WatchSource:0}: Error finding container 8090a449167023dbdca53532fb75f2e14e0fdd389666939cc9b4e5a2d48a7c02: Status 404 returned error can't find the container with id 8090a449167023dbdca53532fb75f2e14e0fdd389666939cc9b4e5a2d48a7c02
Mar 20 16:00:46 crc kubenswrapper[4730]: I0320 16:00:46.410450    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-86bc9f54b4-6szxq" event={"ID":"e4dfee88-47ff-4e8b-9f46-60cc17fb0080","Type":"ContainerStarted","Data":"8090a449167023dbdca53532fb75f2e14e0fdd389666939cc9b4e5a2d48a7c02"}
Mar 20 16:00:46 crc kubenswrapper[4730]: I0320 16:00:46.442179    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-66f7c676c8-wdfnw" event={"ID":"625a25c3-e585-4848-bbe1-0bdd4be731a9","Type":"ContainerStarted","Data":"4051b918d71242b8ca2e41190a654e8dbaba0ff078f4749ad9adbb01f4288ed1"}
Mar 20 16:00:46 crc kubenswrapper[4730]: I0320 16:00:46.458741    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Mar 20 16:00:46 crc kubenswrapper[4730]: I0320 16:00:46.514358    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"223c97f9-0680-47b8-bc2e-1c914296d29e","Type":"ContainerStarted","Data":"af9edb57a02832528fefe2d244021643694b240c8792bf0b8e54f60c03a47e8d"}
Mar 20 16:00:46 crc kubenswrapper[4730]: I0320 16:00:46.514526    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="223c97f9-0680-47b8-bc2e-1c914296d29e" containerName="ceilometer-central-agent" containerID="cri-o://d6846337441238e0631dc47666643f2d85b6c9e548d29144f9972ff195d4dc1e" gracePeriod=30
Mar 20 16:00:46 crc kubenswrapper[4730]: I0320 16:00:46.515033    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 20 16:00:46 crc kubenswrapper[4730]: I0320 16:00:46.515319    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="223c97f9-0680-47b8-bc2e-1c914296d29e" containerName="proxy-httpd" containerID="cri-o://af9edb57a02832528fefe2d244021643694b240c8792bf0b8e54f60c03a47e8d" gracePeriod=30
Mar 20 16:00:46 crc kubenswrapper[4730]: I0320 16:00:46.515372    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="223c97f9-0680-47b8-bc2e-1c914296d29e" containerName="sg-core" containerID="cri-o://71df4b6d20c02608180532f04e26895e3f9ad0248e3128f02a3d2c457f5baa48" gracePeriod=30
Mar 20 16:00:46 crc kubenswrapper[4730]: I0320 16:00:46.515406    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="223c97f9-0680-47b8-bc2e-1c914296d29e" containerName="ceilometer-notification-agent" containerID="cri-o://f29767ca3e9d6cdef0508a609251241ac12e93823a37b6304da4f130070ee420" gracePeriod=30
Mar 20 16:00:46 crc kubenswrapper[4730]: I0320 16:00:46.549691    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3f6c808e-d523-48bd-8ec2-28b625834317","Type":"ContainerStarted","Data":"22c1fa447a9712d22a3477c2b5b4f81ffbfd58601afde3e8b272d15c3b1ac1ce"}
Mar 20 16:00:46 crc kubenswrapper[4730]: I0320 16:00:46.578481    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.086430644 podStartE2EDuration="47.57826427s" podCreationTimestamp="2026-03-20 15:59:59 +0000 UTC" firstStartedPulling="2026-03-20 16:00:01.452832941 +0000 UTC m=+1260.666204310" lastFinishedPulling="2026-03-20 16:00:44.944666567 +0000 UTC m=+1304.158037936" observedRunningTime="2026-03-20 16:00:46.545743253 +0000 UTC m=+1305.759114642" watchObservedRunningTime="2026-03-20 16:00:46.57826427 +0000 UTC m=+1305.791635639"
Mar 20 16:00:46 crc kubenswrapper[4730]: I0320 16:00:46.637902    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-86947bcbc8-94hl8"]
Mar 20 16:00:46 crc kubenswrapper[4730]: I0320 16:00:46.648220    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dccc4d8b9-qqxcj"]
Mar 20 16:00:46 crc kubenswrapper[4730]: I0320 16:00:46.658480    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-54b9958865-vn9kj"]
Mar 20 16:00:46 crc kubenswrapper[4730]: I0320 16:00:46.718952    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8448dbfc69-4229t"]
Mar 20 16:00:46 crc kubenswrapper[4730]: I0320 16:00:46.765126    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 20 16:00:47 crc kubenswrapper[4730]: I0320 16:00:47.049659    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6858c8d8f6-k4smz"]
Mar 20 16:00:47 crc kubenswrapper[4730]: I0320 16:00:47.197893    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 20 16:00:47 crc kubenswrapper[4730]: I0320 16:00:47.231007    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76cb94d47c-txmh6"]
Mar 20 16:00:47 crc kubenswrapper[4730]: W0320 16:00:47.377642    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod553d73e1_f14e_4379_b630_38e440eedb73.slice/crio-fa390ac4060d1a6d7727f5c45c81da714c4fb46717c6a3810b619cf41b677f6f WatchSource:0}: Error finding container fa390ac4060d1a6d7727f5c45c81da714c4fb46717c6a3810b619cf41b677f6f: Status 404 returned error can't find the container with id fa390ac4060d1a6d7727f5c45c81da714c4fb46717c6a3810b619cf41b677f6f
Mar 20 16:00:47 crc kubenswrapper[4730]: W0320 16:00:47.390452    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff335b2a_909a_4c39_a045_2267c73ac8b2.slice/crio-bb5085b7efdc43fccc1330db4f8eeb5dbaa633cfdd11172926159e432c750243 WatchSource:0}: Error finding container bb5085b7efdc43fccc1330db4f8eeb5dbaa633cfdd11172926159e432c750243: Status 404 returned error can't find the container with id bb5085b7efdc43fccc1330db4f8eeb5dbaa633cfdd11172926159e432c750243
Mar 20 16:00:47 crc kubenswrapper[4730]: I0320 16:00:47.594794    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-66f7c676c8-wdfnw" event={"ID":"625a25c3-e585-4848-bbe1-0bdd4be731a9","Type":"ContainerStarted","Data":"36d68edd252a5f941948a382970d79860f0b64a17a4a5f197bbb8ab68f618711"}
Mar 20 16:00:47 crc kubenswrapper[4730]: I0320 16:00:47.594878    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-66f7c676c8-wdfnw" event={"ID":"625a25c3-e585-4848-bbe1-0bdd4be731a9","Type":"ContainerStarted","Data":"4b1a913b558b33eb77f5f4bce1a49632877936d63ae7264946c7f559a4317add"}
Mar 20 16:00:47 crc kubenswrapper[4730]: I0320 16:00:47.594961    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-66f7c676c8-wdfnw"
Mar 20 16:00:47 crc kubenswrapper[4730]: I0320 16:00:47.597546    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"553d73e1-f14e-4379-b630-38e440eedb73","Type":"ContainerStarted","Data":"fa390ac4060d1a6d7727f5c45c81da714c4fb46717c6a3810b619cf41b677f6f"}
Mar 20 16:00:47 crc kubenswrapper[4730]: I0320 16:00:47.604463    4730 generic.go:334] "Generic (PLEG): container finished" podID="fc765c5c-7def-4230-b500-d6410c2da475" containerID="bb60bf8d45c378ac22831a9107db723f03064f274377daf0138584d49caa72c4" exitCode=0
Mar 20 16:00:47 crc kubenswrapper[4730]: I0320 16:00:47.604605    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dccc4d8b9-qqxcj" event={"ID":"fc765c5c-7def-4230-b500-d6410c2da475","Type":"ContainerDied","Data":"bb60bf8d45c378ac22831a9107db723f03064f274377daf0138584d49caa72c4"}
Mar 20 16:00:47 crc kubenswrapper[4730]: I0320 16:00:47.604645    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dccc4d8b9-qqxcj" event={"ID":"fc765c5c-7def-4230-b500-d6410c2da475","Type":"ContainerStarted","Data":"0e83f3546b83e55e3c56c1cd51e6231078002160e2def174a384d53f10d45966"}
Mar 20 16:00:47 crc kubenswrapper[4730]: I0320 16:00:47.608214    4730 generic.go:334] "Generic (PLEG): container finished" podID="223c97f9-0680-47b8-bc2e-1c914296d29e" containerID="af9edb57a02832528fefe2d244021643694b240c8792bf0b8e54f60c03a47e8d" exitCode=0
Mar 20 16:00:47 crc kubenswrapper[4730]: I0320 16:00:47.608234    4730 generic.go:334] "Generic (PLEG): container finished" podID="223c97f9-0680-47b8-bc2e-1c914296d29e" containerID="71df4b6d20c02608180532f04e26895e3f9ad0248e3128f02a3d2c457f5baa48" exitCode=2
Mar 20 16:00:47 crc kubenswrapper[4730]: I0320 16:00:47.608263    4730 generic.go:334] "Generic (PLEG): container finished" podID="223c97f9-0680-47b8-bc2e-1c914296d29e" containerID="d6846337441238e0631dc47666643f2d85b6c9e548d29144f9972ff195d4dc1e" exitCode=0
Mar 20 16:00:47 crc kubenswrapper[4730]: I0320 16:00:47.608300    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"223c97f9-0680-47b8-bc2e-1c914296d29e","Type":"ContainerDied","Data":"af9edb57a02832528fefe2d244021643694b240c8792bf0b8e54f60c03a47e8d"}
Mar 20 16:00:47 crc kubenswrapper[4730]: I0320 16:00:47.608320    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"223c97f9-0680-47b8-bc2e-1c914296d29e","Type":"ContainerDied","Data":"71df4b6d20c02608180532f04e26895e3f9ad0248e3128f02a3d2c457f5baa48"}
Mar 20 16:00:47 crc kubenswrapper[4730]: I0320 16:00:47.608354    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"223c97f9-0680-47b8-bc2e-1c914296d29e","Type":"ContainerDied","Data":"d6846337441238e0631dc47666643f2d85b6c9e548d29144f9972ff195d4dc1e"}
Mar 20 16:00:47 crc kubenswrapper[4730]: I0320 16:00:47.610146    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54b9958865-vn9kj" event={"ID":"8f8c40f6-c8d3-4c8c-97eb-643d32774174","Type":"ContainerStarted","Data":"68ccd644408abe10bbe53729b9fa0385e109edeb36678c4555b35581a8c4faae"}
Mar 20 16:00:47 crc kubenswrapper[4730]: I0320 16:00:47.612532    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86947bcbc8-94hl8" event={"ID":"59460a49-c9fe-46c9-b898-d08234ca7cd3","Type":"ContainerStarted","Data":"f6b19889ce7f10111d0868f7ac900efa99cdeb7647b6d1a5d5561cbe20437ebf"}
Mar 20 16:00:47 crc kubenswrapper[4730]: I0320 16:00:47.612557    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86947bcbc8-94hl8" event={"ID":"59460a49-c9fe-46c9-b898-d08234ca7cd3","Type":"ContainerStarted","Data":"dda643c0e52a6ae559ac8082cc3cd0f70fc2700317b9248464038bda0b48cc5d"}
Mar 20 16:00:47 crc kubenswrapper[4730]: I0320 16:00:47.614290    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"aabd3bd6-2cee-47b8-9174-ad9ea1415e82","Type":"ContainerStarted","Data":"b9554ee31e1f63a199256bec2bb1d2359bd31822a3145d6062ef5270512d5078"}
Mar 20 16:00:47 crc kubenswrapper[4730]: I0320 16:00:47.614317    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"aabd3bd6-2cee-47b8-9174-ad9ea1415e82","Type":"ContainerStarted","Data":"6d15d2ce82d18fec66394661182780c55d2eb065e0325660570454876023436b"}
Mar 20 16:00:47 crc kubenswrapper[4730]: I0320 16:00:47.624462    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8448dbfc69-4229t" event={"ID":"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d","Type":"ContainerStarted","Data":"52f7b95fb474271e8f80dffb24f266c810b758a2d4678f7825703e2faf3416e4"}
Mar 20 16:00:47 crc kubenswrapper[4730]: I0320 16:00:47.629197    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-66f7c676c8-wdfnw" podStartSLOduration=10.629177152 podStartE2EDuration="10.629177152s" podCreationTimestamp="2026-03-20 16:00:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:00:47.619625278 +0000 UTC m=+1306.832996647" watchObservedRunningTime="2026-03-20 16:00:47.629177152 +0000 UTC m=+1306.842548521"
Mar 20 16:00:47 crc kubenswrapper[4730]: I0320 16:00:47.633818    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6858c8d8f6-k4smz" event={"ID":"4ed9fad7-284f-40b4-9c3b-7a213aff010a","Type":"ContainerStarted","Data":"52473fa5ebf9585a5de42b7ba1a1ed907405640eabed30fdba7a035924a392d0"}
Mar 20 16:00:47 crc kubenswrapper[4730]: I0320 16:00:47.638395    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"211c7770-64fe-4943-becb-bc02113fd867","Type":"ContainerStarted","Data":"4e67bee5c04834a723f337ad60eefc215ef11d3a4fa1f8a1f937528124d8a7cd"}
Mar 20 16:00:47 crc kubenswrapper[4730]: I0320 16:00:47.640378    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76cb94d47c-txmh6" event={"ID":"ff335b2a-909a-4c39-a045-2267c73ac8b2","Type":"ContainerStarted","Data":"bb5085b7efdc43fccc1330db4f8eeb5dbaa633cfdd11172926159e432c750243"}
Mar 20 16:00:47 crc kubenswrapper[4730]: I0320 16:00:47.962509    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 20 16:00:48 crc kubenswrapper[4730]: I0320 16:00:48.154367    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-66f7c676c8-wdfnw"
Mar 20 16:00:48 crc kubenswrapper[4730]: I0320 16:00:48.682070    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86947bcbc8-94hl8" event={"ID":"59460a49-c9fe-46c9-b898-d08234ca7cd3","Type":"ContainerStarted","Data":"f7ff0c62dabe1626e435ccd5a9eff6848b2da9515e76484d89f62a803fa4c66a"}
Mar 20 16:00:48 crc kubenswrapper[4730]: I0320 16:00:48.683866    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-86947bcbc8-94hl8"
Mar 20 16:00:48 crc kubenswrapper[4730]: I0320 16:00:48.683907    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-86947bcbc8-94hl8"
Mar 20 16:00:48 crc kubenswrapper[4730]: I0320 16:00:48.698303    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"aabd3bd6-2cee-47b8-9174-ad9ea1415e82","Type":"ContainerStarted","Data":"1b54d6706fe3b9e0df5e79e3bf8df496baa00381f4d47713ff0c38d0b131d0ef"}
Mar 20 16:00:48 crc kubenswrapper[4730]: I0320 16:00:48.704237    4730 generic.go:334] "Generic (PLEG): container finished" podID="94a8baf5-acb2-4f6d-811c-335fd3bd7b0d" containerID="350035d4ab56eddb1da4280110af09cc41d7fc179ba6730fdfd0cf3c66bc48f4" exitCode=0
Mar 20 16:00:48 crc kubenswrapper[4730]: I0320 16:00:48.704440    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8448dbfc69-4229t" event={"ID":"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d","Type":"ContainerDied","Data":"350035d4ab56eddb1da4280110af09cc41d7fc179ba6730fdfd0cf3c66bc48f4"}
Mar 20 16:00:48 crc kubenswrapper[4730]: I0320 16:00:48.711520    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Mar 20 16:00:48 crc kubenswrapper[4730]: I0320 16:00:48.720122    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-86947bcbc8-94hl8" podStartSLOduration=8.720088459 podStartE2EDuration="8.720088459s" podCreationTimestamp="2026-03-20 16:00:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:00:48.708422148 +0000 UTC m=+1307.921793517" watchObservedRunningTime="2026-03-20 16:00:48.720088459 +0000 UTC m=+1307.933459828"
Mar 20 16:00:48 crc kubenswrapper[4730]: I0320 16:00:48.722849    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6858c8d8f6-k4smz" event={"ID":"4ed9fad7-284f-40b4-9c3b-7a213aff010a","Type":"ContainerStarted","Data":"2eff1617c29a34da6021d776b5bc5c6695025819add4af253986837526af0f15"}
Mar 20 16:00:48 crc kubenswrapper[4730]: I0320 16:00:48.761532    4730 generic.go:334] "Generic (PLEG): container finished" podID="ff335b2a-909a-4c39-a045-2267c73ac8b2" containerID="e17867e23320294cb0b9160d421d696c692c189cb8d85e17cafc60d15c897466" exitCode=0
Mar 20 16:00:48 crc kubenswrapper[4730]: I0320 16:00:48.763284    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76cb94d47c-txmh6" event={"ID":"ff335b2a-909a-4c39-a045-2267c73ac8b2","Type":"ContainerDied","Data":"e17867e23320294cb0b9160d421d696c692c189cb8d85e17cafc60d15c897466"}
Mar 20 16:00:48 crc kubenswrapper[4730]: I0320 16:00:48.771890    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=12.771868897000001 podStartE2EDuration="12.771868897s" podCreationTimestamp="2026-03-20 16:00:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:00:48.764991773 +0000 UTC m=+1307.978363142" watchObservedRunningTime="2026-03-20 16:00:48.771868897 +0000 UTC m=+1307.985240266"
Mar 20 16:00:49 crc kubenswrapper[4730]: I0320 16:00:49.974264    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.056076    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0"
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.104211    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dccc4d8b9-qqxcj"
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.109565    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8448dbfc69-4229t"
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.291432    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc765c5c-7def-4230-b500-d6410c2da475-ovsdbserver-nb\") pod \"fc765c5c-7def-4230-b500-d6410c2da475\" (UID: \"fc765c5c-7def-4230-b500-d6410c2da475\") "
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.291529    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc765c5c-7def-4230-b500-d6410c2da475-dns-swift-storage-0\") pod \"fc765c5c-7def-4230-b500-d6410c2da475\" (UID: \"fc765c5c-7def-4230-b500-d6410c2da475\") "
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.291793    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-ovsdbserver-sb\") pod \"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d\" (UID: \"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d\") "
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.291862    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc765c5c-7def-4230-b500-d6410c2da475-ovsdbserver-sb\") pod \"fc765c5c-7def-4230-b500-d6410c2da475\" (UID: \"fc765c5c-7def-4230-b500-d6410c2da475\") "
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.291891    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc765c5c-7def-4230-b500-d6410c2da475-dns-svc\") pod \"fc765c5c-7def-4230-b500-d6410c2da475\" (UID: \"fc765c5c-7def-4230-b500-d6410c2da475\") "
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.291940    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc765c5c-7def-4230-b500-d6410c2da475-config\") pod \"fc765c5c-7def-4230-b500-d6410c2da475\" (UID: \"fc765c5c-7def-4230-b500-d6410c2da475\") "
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.291982    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-ovsdbserver-nb\") pod \"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d\" (UID: \"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d\") "
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.292058    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9vz6\" (UniqueName: \"kubernetes.io/projected/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-kube-api-access-d9vz6\") pod \"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d\" (UID: \"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d\") "
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.292173    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9k6g\" (UniqueName: \"kubernetes.io/projected/fc765c5c-7def-4230-b500-d6410c2da475-kube-api-access-q9k6g\") pod \"fc765c5c-7def-4230-b500-d6410c2da475\" (UID: \"fc765c5c-7def-4230-b500-d6410c2da475\") "
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.292216    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-config\") pod \"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d\" (UID: \"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d\") "
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.292284    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-dns-svc\") pod \"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d\" (UID: \"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d\") "
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.292385    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-dns-swift-storage-0\") pod \"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d\" (UID: \"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d\") "
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.303831    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc765c5c-7def-4230-b500-d6410c2da475-kube-api-access-q9k6g" (OuterVolumeSpecName: "kube-api-access-q9k6g") pod "fc765c5c-7def-4230-b500-d6410c2da475" (UID: "fc765c5c-7def-4230-b500-d6410c2da475"). InnerVolumeSpecName "kube-api-access-q9k6g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.311882    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9k6g\" (UniqueName: \"kubernetes.io/projected/fc765c5c-7def-4230-b500-d6410c2da475-kube-api-access-q9k6g\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.401819    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc765c5c-7def-4230-b500-d6410c2da475-config" (OuterVolumeSpecName: "config") pod "fc765c5c-7def-4230-b500-d6410c2da475" (UID: "fc765c5c-7def-4230-b500-d6410c2da475"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.415676    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc765c5c-7def-4230-b500-d6410c2da475-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.446314    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-kube-api-access-d9vz6" (OuterVolumeSpecName: "kube-api-access-d9vz6") pod "94a8baf5-acb2-4f6d-811c-335fd3bd7b0d" (UID: "94a8baf5-acb2-4f6d-811c-335fd3bd7b0d"). InnerVolumeSpecName "kube-api-access-d9vz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.456621    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc765c5c-7def-4230-b500-d6410c2da475-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fc765c5c-7def-4230-b500-d6410c2da475" (UID: "fc765c5c-7def-4230-b500-d6410c2da475"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.476217    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "94a8baf5-acb2-4f6d-811c-335fd3bd7b0d" (UID: "94a8baf5-acb2-4f6d-811c-335fd3bd7b0d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.485031    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-config" (OuterVolumeSpecName: "config") pod "94a8baf5-acb2-4f6d-811c-335fd3bd7b0d" (UID: "94a8baf5-acb2-4f6d-811c-335fd3bd7b0d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.524815    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "94a8baf5-acb2-4f6d-811c-335fd3bd7b0d" (UID: "94a8baf5-acb2-4f6d-811c-335fd3bd7b0d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.525446    4730 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc765c5c-7def-4230-b500-d6410c2da475-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.525629    4730 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.525706    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9vz6\" (UniqueName: \"kubernetes.io/projected/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-kube-api-access-d9vz6\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.525800    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.529923    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc765c5c-7def-4230-b500-d6410c2da475-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fc765c5c-7def-4230-b500-d6410c2da475" (UID: "fc765c5c-7def-4230-b500-d6410c2da475"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.534778    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc765c5c-7def-4230-b500-d6410c2da475-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fc765c5c-7def-4230-b500-d6410c2da475" (UID: "fc765c5c-7def-4230-b500-d6410c2da475"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.571213    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "94a8baf5-acb2-4f6d-811c-335fd3bd7b0d" (UID: "94a8baf5-acb2-4f6d-811c-335fd3bd7b0d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.576874    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc765c5c-7def-4230-b500-d6410c2da475-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fc765c5c-7def-4230-b500-d6410c2da475" (UID: "fc765c5c-7def-4230-b500-d6410c2da475"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.577610    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "94a8baf5-acb2-4f6d-811c-335fd3bd7b0d" (UID: "94a8baf5-acb2-4f6d-811c-335fd3bd7b0d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.628049    4730 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.628081    4730 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc765c5c-7def-4230-b500-d6410c2da475-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.628091    4730 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.628100    4730 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc765c5c-7def-4230-b500-d6410c2da475-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.628110    4730 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc765c5c-7def-4230-b500-d6410c2da475-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.628120    4730 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.785391    4730 generic.go:334] "Generic (PLEG): container finished" podID="3f6c808e-d523-48bd-8ec2-28b625834317" containerID="22c1fa447a9712d22a3477c2b5b4f81ffbfd58601afde3e8b272d15c3b1ac1ce" exitCode=1
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.785705    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3f6c808e-d523-48bd-8ec2-28b625834317","Type":"ContainerDied","Data":"22c1fa447a9712d22a3477c2b5b4f81ffbfd58601afde3e8b272d15c3b1ac1ce"}
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.785820    4730 scope.go:117] "RemoveContainer" containerID="834cc7775c739ec615e04b1c22eba8f136c36cff5d344d3613f2797565551c85"
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.786186    4730 scope.go:117] "RemoveContainer" containerID="22c1fa447a9712d22a3477c2b5b4f81ffbfd58601afde3e8b272d15c3b1ac1ce"
Mar 20 16:00:50 crc kubenswrapper[4730]: E0320 16:00:50.786538    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(3f6c808e-d523-48bd-8ec2-28b625834317)\"" pod="openstack/watcher-decision-engine-0" podUID="3f6c808e-d523-48bd-8ec2-28b625834317"
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.789050    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"553d73e1-f14e-4379-b630-38e440eedb73","Type":"ContainerStarted","Data":"65229dd1742ef08926ef8e9608941aa087b79288d7c286c646e5a92bbdde1bc5"}
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.791759    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dccc4d8b9-qqxcj" event={"ID":"fc765c5c-7def-4230-b500-d6410c2da475","Type":"ContainerDied","Data":"0e83f3546b83e55e3c56c1cd51e6231078002160e2def174a384d53f10d45966"}
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.791826    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dccc4d8b9-qqxcj"
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.794010    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8448dbfc69-4229t" event={"ID":"94a8baf5-acb2-4f6d-811c-335fd3bd7b0d","Type":"ContainerDied","Data":"52f7b95fb474271e8f80dffb24f266c810b758a2d4678f7825703e2faf3416e4"}
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.794080    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8448dbfc69-4229t"
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.809692    4730 generic.go:334] "Generic (PLEG): container finished" podID="223c97f9-0680-47b8-bc2e-1c914296d29e" containerID="f29767ca3e9d6cdef0508a609251241ac12e93823a37b6304da4f130070ee420" exitCode=0
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.810622    4730 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.809995    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"223c97f9-0680-47b8-bc2e-1c914296d29e","Type":"ContainerDied","Data":"f29767ca3e9d6cdef0508a609251241ac12e93823a37b6304da4f130070ee420"}
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.889425    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dccc4d8b9-qqxcj"]
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.915338    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6dccc4d8b9-qqxcj"]
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.940654    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8448dbfc69-4229t"]
Mar 20 16:00:50 crc kubenswrapper[4730]: I0320 16:00:50.948901    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8448dbfc69-4229t"]
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.157185    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.241467    4730 scope.go:117] "RemoveContainer" containerID="bb60bf8d45c378ac22831a9107db723f03064f274377daf0138584d49caa72c4"
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.254037    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/223c97f9-0680-47b8-bc2e-1c914296d29e-sg-core-conf-yaml\") pod \"223c97f9-0680-47b8-bc2e-1c914296d29e\" (UID: \"223c97f9-0680-47b8-bc2e-1c914296d29e\") "
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.254091    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/223c97f9-0680-47b8-bc2e-1c914296d29e-log-httpd\") pod \"223c97f9-0680-47b8-bc2e-1c914296d29e\" (UID: \"223c97f9-0680-47b8-bc2e-1c914296d29e\") "
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.254109    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/223c97f9-0680-47b8-bc2e-1c914296d29e-scripts\") pod \"223c97f9-0680-47b8-bc2e-1c914296d29e\" (UID: \"223c97f9-0680-47b8-bc2e-1c914296d29e\") "
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.254127    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np4pc\" (UniqueName: \"kubernetes.io/projected/223c97f9-0680-47b8-bc2e-1c914296d29e-kube-api-access-np4pc\") pod \"223c97f9-0680-47b8-bc2e-1c914296d29e\" (UID: \"223c97f9-0680-47b8-bc2e-1c914296d29e\") "
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.254147    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/223c97f9-0680-47b8-bc2e-1c914296d29e-combined-ca-bundle\") pod \"223c97f9-0680-47b8-bc2e-1c914296d29e\" (UID: \"223c97f9-0680-47b8-bc2e-1c914296d29e\") "
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.254174    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/223c97f9-0680-47b8-bc2e-1c914296d29e-run-httpd\") pod \"223c97f9-0680-47b8-bc2e-1c914296d29e\" (UID: \"223c97f9-0680-47b8-bc2e-1c914296d29e\") "
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.254267    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/223c97f9-0680-47b8-bc2e-1c914296d29e-config-data\") pod \"223c97f9-0680-47b8-bc2e-1c914296d29e\" (UID: \"223c97f9-0680-47b8-bc2e-1c914296d29e\") "
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.257435    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/223c97f9-0680-47b8-bc2e-1c914296d29e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "223c97f9-0680-47b8-bc2e-1c914296d29e" (UID: "223c97f9-0680-47b8-bc2e-1c914296d29e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.260612    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/223c97f9-0680-47b8-bc2e-1c914296d29e-scripts" (OuterVolumeSpecName: "scripts") pod "223c97f9-0680-47b8-bc2e-1c914296d29e" (UID: "223c97f9-0680-47b8-bc2e-1c914296d29e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.260850    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/223c97f9-0680-47b8-bc2e-1c914296d29e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "223c97f9-0680-47b8-bc2e-1c914296d29e" (UID: "223c97f9-0680-47b8-bc2e-1c914296d29e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.266414    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/223c97f9-0680-47b8-bc2e-1c914296d29e-kube-api-access-np4pc" (OuterVolumeSpecName: "kube-api-access-np4pc") pod "223c97f9-0680-47b8-bc2e-1c914296d29e" (UID: "223c97f9-0680-47b8-bc2e-1c914296d29e"). InnerVolumeSpecName "kube-api-access-np4pc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.355964    4730 scope.go:117] "RemoveContainer" containerID="350035d4ab56eddb1da4280110af09cc41d7fc179ba6730fdfd0cf3c66bc48f4"
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.356897    4730 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/223c97f9-0680-47b8-bc2e-1c914296d29e-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.356931    4730 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/223c97f9-0680-47b8-bc2e-1c914296d29e-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.356956    4730 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/223c97f9-0680-47b8-bc2e-1c914296d29e-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.356975    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np4pc\" (UniqueName: \"kubernetes.io/projected/223c97f9-0680-47b8-bc2e-1c914296d29e-kube-api-access-np4pc\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.375401    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/223c97f9-0680-47b8-bc2e-1c914296d29e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "223c97f9-0680-47b8-bc2e-1c914296d29e" (UID: "223c97f9-0680-47b8-bc2e-1c914296d29e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.457916    4730 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/223c97f9-0680-47b8-bc2e-1c914296d29e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.527391    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/223c97f9-0680-47b8-bc2e-1c914296d29e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "223c97f9-0680-47b8-bc2e-1c914296d29e" (UID: "223c97f9-0680-47b8-bc2e-1c914296d29e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.540344    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/223c97f9-0680-47b8-bc2e-1c914296d29e-config-data" (OuterVolumeSpecName: "config-data") pod "223c97f9-0680-47b8-bc2e-1c914296d29e" (UID: "223c97f9-0680-47b8-bc2e-1c914296d29e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.546723    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94a8baf5-acb2-4f6d-811c-335fd3bd7b0d" path="/var/lib/kubelet/pods/94a8baf5-acb2-4f6d-811c-335fd3bd7b0d/volumes"
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.547218    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc765c5c-7def-4230-b500-d6410c2da475" path="/var/lib/kubelet/pods/fc765c5c-7def-4230-b500-d6410c2da475/volumes"
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.561968    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/223c97f9-0680-47b8-bc2e-1c914296d29e-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.562006    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/223c97f9-0680-47b8-bc2e-1c914296d29e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.823049    4730 scope.go:117] "RemoveContainer" containerID="22c1fa447a9712d22a3477c2b5b4f81ffbfd58601afde3e8b272d15c3b1ac1ce"
Mar 20 16:00:51 crc kubenswrapper[4730]: E0320 16:00:51.823573    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(3f6c808e-d523-48bd-8ec2-28b625834317)\"" pod="openstack/watcher-decision-engine-0" podUID="3f6c808e-d523-48bd-8ec2-28b625834317"
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.829638    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-86bc9f54b4-6szxq" event={"ID":"e4dfee88-47ff-4e8b-9f46-60cc17fb0080","Type":"ContainerStarted","Data":"7adc271839acf868f868394423060886ede5743de715667560a83baa8f102638"}
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.829694    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-86bc9f54b4-6szxq" event={"ID":"e4dfee88-47ff-4e8b-9f46-60cc17fb0080","Type":"ContainerStarted","Data":"2e378d4becd72b560e4b2faaf1b0e5fd690320acae0a69c92380d07f339ed239"}
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.846011    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6858c8d8f6-k4smz" event={"ID":"4ed9fad7-284f-40b4-9c3b-7a213aff010a","Type":"ContainerStarted","Data":"3635b09696560454a28e6d666babdb61696ccff059aecec39acea6122546c8aa"}
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.846947    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6858c8d8f6-k4smz"
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.858728    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54b9958865-vn9kj" event={"ID":"8f8c40f6-c8d3-4c8c-97eb-643d32774174","Type":"ContainerStarted","Data":"df46bfa0a3f633249b00195c32cbff13d4d5759f46bd1fc7e9e141bb53782460"}
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.858764    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54b9958865-vn9kj" event={"ID":"8f8c40f6-c8d3-4c8c-97eb-643d32774174","Type":"ContainerStarted","Data":"0d6fcfd02e7f08c9519c6dcde8d39e6abb5db9b7085e1f106645291d4ca7c1b0"}
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.860231    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-86bc9f54b4-6szxq" podStartSLOduration=10.174291041 podStartE2EDuration="14.860217643s" podCreationTimestamp="2026-03-20 16:00:37 +0000 UTC" firstStartedPulling="2026-03-20 16:00:46.347361947 +0000 UTC m=+1305.560733316" lastFinishedPulling="2026-03-20 16:00:51.033288549 +0000 UTC m=+1310.246659918" observedRunningTime="2026-03-20 16:00:51.858011833 +0000 UTC m=+1311.071383192" watchObservedRunningTime="2026-03-20 16:00:51.860217643 +0000 UTC m=+1311.073589012"
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.862244    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"211c7770-64fe-4943-becb-bc02113fd867","Type":"ContainerStarted","Data":"6d6a4235c5c9e8bc75e4b872bc2c20836032212d513ddc23145b97813f0b4e12"}
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.873395    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76cb94d47c-txmh6" event={"ID":"ff335b2a-909a-4c39-a045-2267c73ac8b2","Type":"ContainerStarted","Data":"1a5f9090983f451fb355cec1fdd913ee82666d2f07f611bcec31819f098f8126"}
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.873723    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76cb94d47c-txmh6"
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.880723    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.883231    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"223c97f9-0680-47b8-bc2e-1c914296d29e","Type":"ContainerDied","Data":"7f071b5518e739d74a059048c81e33bc96125faedfd609490b9b1dda80135229"}
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.883303    4730 scope.go:117] "RemoveContainer" containerID="af9edb57a02832528fefe2d244021643694b240c8792bf0b8e54f60c03a47e8d"
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.892175    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6858c8d8f6-k4smz" podStartSLOduration=6.8921570469999995 podStartE2EDuration="6.892157047s" podCreationTimestamp="2026-03-20 16:00:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:00:51.877715564 +0000 UTC m=+1311.091086933" watchObservedRunningTime="2026-03-20 16:00:51.892157047 +0000 UTC m=+1311.105528406"
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.912527    4730 scope.go:117] "RemoveContainer" containerID="71df4b6d20c02608180532f04e26895e3f9ad0248e3128f02a3d2c457f5baa48"
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.915767    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-54b9958865-vn9kj" podStartSLOduration=10.529947873 podStartE2EDuration="14.915749324s" podCreationTimestamp="2026-03-20 16:00:37 +0000 UTC" firstStartedPulling="2026-03-20 16:00:46.647429957 +0000 UTC m=+1305.860801326" lastFinishedPulling="2026-03-20 16:00:51.033231408 +0000 UTC m=+1310.246602777" observedRunningTime="2026-03-20 16:00:51.907707345 +0000 UTC m=+1311.121078714" watchObservedRunningTime="2026-03-20 16:00:51.915749324 +0000 UTC m=+1311.129120693"
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.916166    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.916327    4730 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.947526    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76cb94d47c-txmh6" podStartSLOduration=6.947510605 podStartE2EDuration="6.947510605s" podCreationTimestamp="2026-03-20 16:00:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:00:51.929071982 +0000 UTC m=+1311.142443371" watchObservedRunningTime="2026-03-20 16:00:51.947510605 +0000 UTC m=+1311.160881974"
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.989521    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.992203    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Mar 20 16:00:51 crc kubenswrapper[4730]: I0320 16:00:51.998305    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.017728    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:00:52 crc kubenswrapper[4730]: E0320 16:00:52.018627    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="223c97f9-0680-47b8-bc2e-1c914296d29e" containerName="sg-core"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.018644    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="223c97f9-0680-47b8-bc2e-1c914296d29e" containerName="sg-core"
Mar 20 16:00:52 crc kubenswrapper[4730]: E0320 16:00:52.018690    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="223c97f9-0680-47b8-bc2e-1c914296d29e" containerName="proxy-httpd"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.018699    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="223c97f9-0680-47b8-bc2e-1c914296d29e" containerName="proxy-httpd"
Mar 20 16:00:52 crc kubenswrapper[4730]: E0320 16:00:52.018723    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94a8baf5-acb2-4f6d-811c-335fd3bd7b0d" containerName="init"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.018731    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a8baf5-acb2-4f6d-811c-335fd3bd7b0d" containerName="init"
Mar 20 16:00:52 crc kubenswrapper[4730]: E0320 16:00:52.018776    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc765c5c-7def-4230-b500-d6410c2da475" containerName="init"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.018785    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc765c5c-7def-4230-b500-d6410c2da475" containerName="init"
Mar 20 16:00:52 crc kubenswrapper[4730]: E0320 16:00:52.018798    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="223c97f9-0680-47b8-bc2e-1c914296d29e" containerName="ceilometer-notification-agent"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.018805    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="223c97f9-0680-47b8-bc2e-1c914296d29e" containerName="ceilometer-notification-agent"
Mar 20 16:00:52 crc kubenswrapper[4730]: E0320 16:00:52.018816    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="223c97f9-0680-47b8-bc2e-1c914296d29e" containerName="ceilometer-central-agent"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.018823    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="223c97f9-0680-47b8-bc2e-1c914296d29e" containerName="ceilometer-central-agent"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.019186    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="223c97f9-0680-47b8-bc2e-1c914296d29e" containerName="proxy-httpd"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.019202    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="94a8baf5-acb2-4f6d-811c-335fd3bd7b0d" containerName="init"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.019217    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="223c97f9-0680-47b8-bc2e-1c914296d29e" containerName="ceilometer-notification-agent"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.019234    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="223c97f9-0680-47b8-bc2e-1c914296d29e" containerName="ceilometer-central-agent"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.019280    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc765c5c-7def-4230-b500-d6410c2da475" containerName="init"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.019298    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="223c97f9-0680-47b8-bc2e-1c914296d29e" containerName="sg-core"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.022382    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.025197    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.025457    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.028791    4730 scope.go:117] "RemoveContainer" containerID="f29767ca3e9d6cdef0508a609251241ac12e93823a37b6304da4f130070ee420"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.039382    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.064859    4730 scope.go:117] "RemoveContainer" containerID="d6846337441238e0631dc47666643f2d85b6c9e548d29144f9972ff195d4dc1e"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.125565    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5dc7dd859f-wtxnj"]
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.127738    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5dc7dd859f-wtxnj"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.131582    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.131679    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.136889    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5dc7dd859f-wtxnj"]
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.187088    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe109bf0-70d2-41d2-855c-6eb862e568b6-log-httpd\") pod \"ceilometer-0\" (UID: \"fe109bf0-70d2-41d2-855c-6eb862e568b6\") " pod="openstack/ceilometer-0"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.187140    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe109bf0-70d2-41d2-855c-6eb862e568b6-run-httpd\") pod \"ceilometer-0\" (UID: \"fe109bf0-70d2-41d2-855c-6eb862e568b6\") " pod="openstack/ceilometer-0"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.187159    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe109bf0-70d2-41d2-855c-6eb862e568b6-config-data\") pod \"ceilometer-0\" (UID: \"fe109bf0-70d2-41d2-855c-6eb862e568b6\") " pod="openstack/ceilometer-0"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.187182    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62339bcb-2edc-4881-a15e-a9387442db89-internal-tls-certs\") pod \"neutron-5dc7dd859f-wtxnj\" (UID: \"62339bcb-2edc-4881-a15e-a9387442db89\") " pod="openstack/neutron-5dc7dd859f-wtxnj"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.187203    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/62339bcb-2edc-4881-a15e-a9387442db89-config\") pod \"neutron-5dc7dd859f-wtxnj\" (UID: \"62339bcb-2edc-4881-a15e-a9387442db89\") " pod="openstack/neutron-5dc7dd859f-wtxnj"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.187320    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe109bf0-70d2-41d2-855c-6eb862e568b6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe109bf0-70d2-41d2-855c-6eb862e568b6\") " pod="openstack/ceilometer-0"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.187360    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62339bcb-2edc-4881-a15e-a9387442db89-combined-ca-bundle\") pod \"neutron-5dc7dd859f-wtxnj\" (UID: \"62339bcb-2edc-4881-a15e-a9387442db89\") " pod="openstack/neutron-5dc7dd859f-wtxnj"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.187391    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe109bf0-70d2-41d2-855c-6eb862e568b6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe109bf0-70d2-41d2-855c-6eb862e568b6\") " pod="openstack/ceilometer-0"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.187411    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62339bcb-2edc-4881-a15e-a9387442db89-public-tls-certs\") pod \"neutron-5dc7dd859f-wtxnj\" (UID: \"62339bcb-2edc-4881-a15e-a9387442db89\") " pod="openstack/neutron-5dc7dd859f-wtxnj"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.187435    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/62339bcb-2edc-4881-a15e-a9387442db89-httpd-config\") pod \"neutron-5dc7dd859f-wtxnj\" (UID: \"62339bcb-2edc-4881-a15e-a9387442db89\") " pod="openstack/neutron-5dc7dd859f-wtxnj"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.187458    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/62339bcb-2edc-4881-a15e-a9387442db89-ovndb-tls-certs\") pod \"neutron-5dc7dd859f-wtxnj\" (UID: \"62339bcb-2edc-4881-a15e-a9387442db89\") " pod="openstack/neutron-5dc7dd859f-wtxnj"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.187477    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z6c7\" (UniqueName: \"kubernetes.io/projected/62339bcb-2edc-4881-a15e-a9387442db89-kube-api-access-6z6c7\") pod \"neutron-5dc7dd859f-wtxnj\" (UID: \"62339bcb-2edc-4881-a15e-a9387442db89\") " pod="openstack/neutron-5dc7dd859f-wtxnj"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.187496    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdtvc\" (UniqueName: \"kubernetes.io/projected/fe109bf0-70d2-41d2-855c-6eb862e568b6-kube-api-access-vdtvc\") pod \"ceilometer-0\" (UID: \"fe109bf0-70d2-41d2-855c-6eb862e568b6\") " pod="openstack/ceilometer-0"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.187515    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe109bf0-70d2-41d2-855c-6eb862e568b6-scripts\") pod \"ceilometer-0\" (UID: \"fe109bf0-70d2-41d2-855c-6eb862e568b6\") " pod="openstack/ceilometer-0"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.288672    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe109bf0-70d2-41d2-855c-6eb862e568b6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe109bf0-70d2-41d2-855c-6eb862e568b6\") " pod="openstack/ceilometer-0"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.288718    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62339bcb-2edc-4881-a15e-a9387442db89-public-tls-certs\") pod \"neutron-5dc7dd859f-wtxnj\" (UID: \"62339bcb-2edc-4881-a15e-a9387442db89\") " pod="openstack/neutron-5dc7dd859f-wtxnj"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.288749    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/62339bcb-2edc-4881-a15e-a9387442db89-httpd-config\") pod \"neutron-5dc7dd859f-wtxnj\" (UID: \"62339bcb-2edc-4881-a15e-a9387442db89\") " pod="openstack/neutron-5dc7dd859f-wtxnj"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.288778    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/62339bcb-2edc-4881-a15e-a9387442db89-ovndb-tls-certs\") pod \"neutron-5dc7dd859f-wtxnj\" (UID: \"62339bcb-2edc-4881-a15e-a9387442db89\") " pod="openstack/neutron-5dc7dd859f-wtxnj"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.288804    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z6c7\" (UniqueName: \"kubernetes.io/projected/62339bcb-2edc-4881-a15e-a9387442db89-kube-api-access-6z6c7\") pod \"neutron-5dc7dd859f-wtxnj\" (UID: \"62339bcb-2edc-4881-a15e-a9387442db89\") " pod="openstack/neutron-5dc7dd859f-wtxnj"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.288821    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdtvc\" (UniqueName: \"kubernetes.io/projected/fe109bf0-70d2-41d2-855c-6eb862e568b6-kube-api-access-vdtvc\") pod \"ceilometer-0\" (UID: \"fe109bf0-70d2-41d2-855c-6eb862e568b6\") " pod="openstack/ceilometer-0"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.288840    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe109bf0-70d2-41d2-855c-6eb862e568b6-scripts\") pod \"ceilometer-0\" (UID: \"fe109bf0-70d2-41d2-855c-6eb862e568b6\") " pod="openstack/ceilometer-0"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.288870    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe109bf0-70d2-41d2-855c-6eb862e568b6-log-httpd\") pod \"ceilometer-0\" (UID: \"fe109bf0-70d2-41d2-855c-6eb862e568b6\") " pod="openstack/ceilometer-0"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.288892    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe109bf0-70d2-41d2-855c-6eb862e568b6-run-httpd\") pod \"ceilometer-0\" (UID: \"fe109bf0-70d2-41d2-855c-6eb862e568b6\") " pod="openstack/ceilometer-0"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.288907    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe109bf0-70d2-41d2-855c-6eb862e568b6-config-data\") pod \"ceilometer-0\" (UID: \"fe109bf0-70d2-41d2-855c-6eb862e568b6\") " pod="openstack/ceilometer-0"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.288932    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62339bcb-2edc-4881-a15e-a9387442db89-internal-tls-certs\") pod \"neutron-5dc7dd859f-wtxnj\" (UID: \"62339bcb-2edc-4881-a15e-a9387442db89\") " pod="openstack/neutron-5dc7dd859f-wtxnj"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.288950    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/62339bcb-2edc-4881-a15e-a9387442db89-config\") pod \"neutron-5dc7dd859f-wtxnj\" (UID: \"62339bcb-2edc-4881-a15e-a9387442db89\") " pod="openstack/neutron-5dc7dd859f-wtxnj"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.288997    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe109bf0-70d2-41d2-855c-6eb862e568b6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe109bf0-70d2-41d2-855c-6eb862e568b6\") " pod="openstack/ceilometer-0"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.289037    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62339bcb-2edc-4881-a15e-a9387442db89-combined-ca-bundle\") pod \"neutron-5dc7dd859f-wtxnj\" (UID: \"62339bcb-2edc-4881-a15e-a9387442db89\") " pod="openstack/neutron-5dc7dd859f-wtxnj"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.292575    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe109bf0-70d2-41d2-855c-6eb862e568b6-log-httpd\") pod \"ceilometer-0\" (UID: \"fe109bf0-70d2-41d2-855c-6eb862e568b6\") " pod="openstack/ceilometer-0"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.292984    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe109bf0-70d2-41d2-855c-6eb862e568b6-run-httpd\") pod \"ceilometer-0\" (UID: \"fe109bf0-70d2-41d2-855c-6eb862e568b6\") " pod="openstack/ceilometer-0"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.298073    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/62339bcb-2edc-4881-a15e-a9387442db89-ovndb-tls-certs\") pod \"neutron-5dc7dd859f-wtxnj\" (UID: \"62339bcb-2edc-4881-a15e-a9387442db89\") " pod="openstack/neutron-5dc7dd859f-wtxnj"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.301260    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62339bcb-2edc-4881-a15e-a9387442db89-public-tls-certs\") pod \"neutron-5dc7dd859f-wtxnj\" (UID: \"62339bcb-2edc-4881-a15e-a9387442db89\") " pod="openstack/neutron-5dc7dd859f-wtxnj"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.302105    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe109bf0-70d2-41d2-855c-6eb862e568b6-scripts\") pod \"ceilometer-0\" (UID: \"fe109bf0-70d2-41d2-855c-6eb862e568b6\") " pod="openstack/ceilometer-0"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.302894    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62339bcb-2edc-4881-a15e-a9387442db89-combined-ca-bundle\") pod \"neutron-5dc7dd859f-wtxnj\" (UID: \"62339bcb-2edc-4881-a15e-a9387442db89\") " pod="openstack/neutron-5dc7dd859f-wtxnj"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.303046    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/62339bcb-2edc-4881-a15e-a9387442db89-config\") pod \"neutron-5dc7dd859f-wtxnj\" (UID: \"62339bcb-2edc-4881-a15e-a9387442db89\") " pod="openstack/neutron-5dc7dd859f-wtxnj"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.303276    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe109bf0-70d2-41d2-855c-6eb862e568b6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe109bf0-70d2-41d2-855c-6eb862e568b6\") " pod="openstack/ceilometer-0"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.303934    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe109bf0-70d2-41d2-855c-6eb862e568b6-config-data\") pod \"ceilometer-0\" (UID: \"fe109bf0-70d2-41d2-855c-6eb862e568b6\") " pod="openstack/ceilometer-0"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.305459    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe109bf0-70d2-41d2-855c-6eb862e568b6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe109bf0-70d2-41d2-855c-6eb862e568b6\") " pod="openstack/ceilometer-0"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.314536    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdtvc\" (UniqueName: \"kubernetes.io/projected/fe109bf0-70d2-41d2-855c-6eb862e568b6-kube-api-access-vdtvc\") pod \"ceilometer-0\" (UID: \"fe109bf0-70d2-41d2-855c-6eb862e568b6\") " pod="openstack/ceilometer-0"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.316387    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62339bcb-2edc-4881-a15e-a9387442db89-internal-tls-certs\") pod \"neutron-5dc7dd859f-wtxnj\" (UID: \"62339bcb-2edc-4881-a15e-a9387442db89\") " pod="openstack/neutron-5dc7dd859f-wtxnj"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.316759    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/62339bcb-2edc-4881-a15e-a9387442db89-httpd-config\") pod \"neutron-5dc7dd859f-wtxnj\" (UID: \"62339bcb-2edc-4881-a15e-a9387442db89\") " pod="openstack/neutron-5dc7dd859f-wtxnj"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.341046    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z6c7\" (UniqueName: \"kubernetes.io/projected/62339bcb-2edc-4881-a15e-a9387442db89-kube-api-access-6z6c7\") pod \"neutron-5dc7dd859f-wtxnj\" (UID: \"62339bcb-2edc-4881-a15e-a9387442db89\") " pod="openstack/neutron-5dc7dd859f-wtxnj"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.345228    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.454750    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5dc7dd859f-wtxnj"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.872379    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:00:52 crc kubenswrapper[4730]: W0320 16:00:52.893373    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe109bf0_70d2_41d2_855c_6eb862e568b6.slice/crio-4d621236dd54101584c726bca47e76d324d3fb96b4a4404920b5e18a6a4fbb39 WatchSource:0}: Error finding container 4d621236dd54101584c726bca47e76d324d3fb96b4a4404920b5e18a6a4fbb39: Status 404 returned error can't find the container with id 4d621236dd54101584c726bca47e76d324d3fb96b4a4404920b5e18a6a4fbb39
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.930988    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"553d73e1-f14e-4379-b630-38e440eedb73","Type":"ContainerStarted","Data":"3644a90df0b76ee2298de84878e1467667a7cf0df46bec7faf290aad4ad0f2a7"}
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.931180    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="553d73e1-f14e-4379-b630-38e440eedb73" containerName="cinder-api-log" containerID="cri-o://65229dd1742ef08926ef8e9608941aa087b79288d7c286c646e5a92bbdde1bc5" gracePeriod=30
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.931370    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.931489    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="553d73e1-f14e-4379-b630-38e440eedb73" containerName="cinder-api" containerID="cri-o://3644a90df0b76ee2298de84878e1467667a7cf0df46bec7faf290aad4ad0f2a7" gracePeriod=30
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.959444    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"211c7770-64fe-4943-becb-bc02113fd867","Type":"ContainerStarted","Data":"4f88e22c1abf31db26b2de1e0622d6921372e0cee954fd4aea4d817c4e543d05"}
Mar 20 16:00:52 crc kubenswrapper[4730]: I0320 16:00:52.969120    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=7.969099561 podStartE2EDuration="7.969099561s" podCreationTimestamp="2026-03-20 16:00:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:00:52.957000741 +0000 UTC m=+1312.170372110" watchObservedRunningTime="2026-03-20 16:00:52.969099561 +0000 UTC m=+1312.182470930"
Mar 20 16:00:53 crc kubenswrapper[4730]: I0320 16:00:53.169473    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=7.583057069 podStartE2EDuration="8.169447022s" podCreationTimestamp="2026-03-20 16:00:45 +0000 UTC" firstStartedPulling="2026-03-20 16:00:46.907715118 +0000 UTC m=+1306.121086487" lastFinishedPulling="2026-03-20 16:00:47.494105071 +0000 UTC m=+1306.707476440" observedRunningTime="2026-03-20 16:00:52.989807704 +0000 UTC m=+1312.203179073" watchObservedRunningTime="2026-03-20 16:00:53.169447022 +0000 UTC m=+1312.382818401"
Mar 20 16:00:53 crc kubenswrapper[4730]: I0320 16:00:53.176635    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5dc7dd859f-wtxnj"]
Mar 20 16:00:53 crc kubenswrapper[4730]: I0320 16:00:53.597465    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="223c97f9-0680-47b8-bc2e-1c914296d29e" path="/var/lib/kubelet/pods/223c97f9-0680-47b8-bc2e-1c914296d29e/volumes"
Mar 20 16:00:53 crc kubenswrapper[4730]: I0320 16:00:53.833125    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 20 16:00:53 crc kubenswrapper[4730]: I0320 16:00:53.961823    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/553d73e1-f14e-4379-b630-38e440eedb73-combined-ca-bundle\") pod \"553d73e1-f14e-4379-b630-38e440eedb73\" (UID: \"553d73e1-f14e-4379-b630-38e440eedb73\") "
Mar 20 16:00:53 crc kubenswrapper[4730]: I0320 16:00:53.962089    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/553d73e1-f14e-4379-b630-38e440eedb73-config-data\") pod \"553d73e1-f14e-4379-b630-38e440eedb73\" (UID: \"553d73e1-f14e-4379-b630-38e440eedb73\") "
Mar 20 16:00:53 crc kubenswrapper[4730]: I0320 16:00:53.962133    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptrl6\" (UniqueName: \"kubernetes.io/projected/553d73e1-f14e-4379-b630-38e440eedb73-kube-api-access-ptrl6\") pod \"553d73e1-f14e-4379-b630-38e440eedb73\" (UID: \"553d73e1-f14e-4379-b630-38e440eedb73\") "
Mar 20 16:00:53 crc kubenswrapper[4730]: I0320 16:00:53.962464    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/553d73e1-f14e-4379-b630-38e440eedb73-config-data-custom\") pod \"553d73e1-f14e-4379-b630-38e440eedb73\" (UID: \"553d73e1-f14e-4379-b630-38e440eedb73\") "
Mar 20 16:00:53 crc kubenswrapper[4730]: I0320 16:00:53.962486    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/553d73e1-f14e-4379-b630-38e440eedb73-logs\") pod \"553d73e1-f14e-4379-b630-38e440eedb73\" (UID: \"553d73e1-f14e-4379-b630-38e440eedb73\") "
Mar 20 16:00:53 crc kubenswrapper[4730]: I0320 16:00:53.962509    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/553d73e1-f14e-4379-b630-38e440eedb73-etc-machine-id\") pod \"553d73e1-f14e-4379-b630-38e440eedb73\" (UID: \"553d73e1-f14e-4379-b630-38e440eedb73\") "
Mar 20 16:00:53 crc kubenswrapper[4730]: I0320 16:00:53.962535    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/553d73e1-f14e-4379-b630-38e440eedb73-scripts\") pod \"553d73e1-f14e-4379-b630-38e440eedb73\" (UID: \"553d73e1-f14e-4379-b630-38e440eedb73\") "
Mar 20 16:00:53 crc kubenswrapper[4730]: I0320 16:00:53.965044    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/553d73e1-f14e-4379-b630-38e440eedb73-logs" (OuterVolumeSpecName: "logs") pod "553d73e1-f14e-4379-b630-38e440eedb73" (UID: "553d73e1-f14e-4379-b630-38e440eedb73"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:00:53 crc kubenswrapper[4730]: I0320 16:00:53.965595    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/553d73e1-f14e-4379-b630-38e440eedb73-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "553d73e1-f14e-4379-b630-38e440eedb73" (UID: "553d73e1-f14e-4379-b630-38e440eedb73"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 16:00:53 crc kubenswrapper[4730]: I0320 16:00:53.968404    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/553d73e1-f14e-4379-b630-38e440eedb73-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "553d73e1-f14e-4379-b630-38e440eedb73" (UID: "553d73e1-f14e-4379-b630-38e440eedb73"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:53 crc kubenswrapper[4730]: I0320 16:00:53.976625    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/553d73e1-f14e-4379-b630-38e440eedb73-kube-api-access-ptrl6" (OuterVolumeSpecName: "kube-api-access-ptrl6") pod "553d73e1-f14e-4379-b630-38e440eedb73" (UID: "553d73e1-f14e-4379-b630-38e440eedb73"). InnerVolumeSpecName "kube-api-access-ptrl6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:00:53 crc kubenswrapper[4730]: I0320 16:00:53.977451    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/553d73e1-f14e-4379-b630-38e440eedb73-scripts" (OuterVolumeSpecName: "scripts") pod "553d73e1-f14e-4379-b630-38e440eedb73" (UID: "553d73e1-f14e-4379-b630-38e440eedb73"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:53 crc kubenswrapper[4730]: I0320 16:00:53.992572    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe109bf0-70d2-41d2-855c-6eb862e568b6","Type":"ContainerStarted","Data":"8853aa1f17e6388ce020212c8d73958c09bbf6fcc38c4d043313ee458cbde4ad"}
Mar 20 16:00:53 crc kubenswrapper[4730]: I0320 16:00:53.992632    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe109bf0-70d2-41d2-855c-6eb862e568b6","Type":"ContainerStarted","Data":"c73d9cedf11e6a3a273cb136651dcbf0aa00b21555ddca3a8c2b2551a6375a21"}
Mar 20 16:00:53 crc kubenswrapper[4730]: I0320 16:00:53.992646    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe109bf0-70d2-41d2-855c-6eb862e568b6","Type":"ContainerStarted","Data":"4d621236dd54101584c726bca47e76d324d3fb96b4a4404920b5e18a6a4fbb39"}
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.005135    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/553d73e1-f14e-4379-b630-38e440eedb73-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "553d73e1-f14e-4379-b630-38e440eedb73" (UID: "553d73e1-f14e-4379-b630-38e440eedb73"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.042688    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/553d73e1-f14e-4379-b630-38e440eedb73-config-data" (OuterVolumeSpecName: "config-data") pod "553d73e1-f14e-4379-b630-38e440eedb73" (UID: "553d73e1-f14e-4379-b630-38e440eedb73"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.045378    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dc7dd859f-wtxnj" event={"ID":"62339bcb-2edc-4881-a15e-a9387442db89","Type":"ContainerStarted","Data":"e9bcbd3723d2f293fb36e8e3445df05602c0d62eb7e6982ff997cb3f24bf3fdc"}
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.045837    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dc7dd859f-wtxnj" event={"ID":"62339bcb-2edc-4881-a15e-a9387442db89","Type":"ContainerStarted","Data":"b7fcbce1d55c9ecbe53035d1ab156cb679df475f1a8bcf8ecf5b13bc843b652e"}
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.045855    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5dc7dd859f-wtxnj" event={"ID":"62339bcb-2edc-4881-a15e-a9387442db89","Type":"ContainerStarted","Data":"18fb0a41e0b45d2c14e6c163aa9e9e0b6fce377022fcbab6c5e0d14237a1f7c7"}
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.045894    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5dc7dd859f-wtxnj"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.058917    4730 generic.go:334] "Generic (PLEG): container finished" podID="553d73e1-f14e-4379-b630-38e440eedb73" containerID="3644a90df0b76ee2298de84878e1467667a7cf0df46bec7faf290aad4ad0f2a7" exitCode=0
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.058950    4730 generic.go:334] "Generic (PLEG): container finished" podID="553d73e1-f14e-4379-b630-38e440eedb73" containerID="65229dd1742ef08926ef8e9608941aa087b79288d7c286c646e5a92bbdde1bc5" exitCode=143
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.059814    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.063924    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"553d73e1-f14e-4379-b630-38e440eedb73","Type":"ContainerDied","Data":"3644a90df0b76ee2298de84878e1467667a7cf0df46bec7faf290aad4ad0f2a7"}
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.063974    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"553d73e1-f14e-4379-b630-38e440eedb73","Type":"ContainerDied","Data":"65229dd1742ef08926ef8e9608941aa087b79288d7c286c646e5a92bbdde1bc5"}
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.063987    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"553d73e1-f14e-4379-b630-38e440eedb73","Type":"ContainerDied","Data":"fa390ac4060d1a6d7727f5c45c81da714c4fb46717c6a3810b619cf41b677f6f"}
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.064005    4730 scope.go:117] "RemoveContainer" containerID="3644a90df0b76ee2298de84878e1467667a7cf0df46bec7faf290aad4ad0f2a7"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.066594    4730 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/553d73e1-f14e-4379-b630-38e440eedb73-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.066624    4730 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/553d73e1-f14e-4379-b630-38e440eedb73-logs\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.067112    4730 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/553d73e1-f14e-4379-b630-38e440eedb73-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.067130    4730 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/553d73e1-f14e-4379-b630-38e440eedb73-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.067140    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/553d73e1-f14e-4379-b630-38e440eedb73-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.067151    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/553d73e1-f14e-4379-b630-38e440eedb73-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.067162    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptrl6\" (UniqueName: \"kubernetes.io/projected/553d73e1-f14e-4379-b630-38e440eedb73-kube-api-access-ptrl6\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.072530    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5dc7dd859f-wtxnj" podStartSLOduration=2.072511238 podStartE2EDuration="2.072511238s" podCreationTimestamp="2026-03-20 16:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:00:54.066696218 +0000 UTC m=+1313.280067587" watchObservedRunningTime="2026-03-20 16:00:54.072511238 +0000 UTC m=+1313.285882597"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.143384    4730 scope.go:117] "RemoveContainer" containerID="65229dd1742ef08926ef8e9608941aa087b79288d7c286c646e5a92bbdde1bc5"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.243731    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.252599    4730 scope.go:117] "RemoveContainer" containerID="3644a90df0b76ee2298de84878e1467667a7cf0df46bec7faf290aad4ad0f2a7"
Mar 20 16:00:54 crc kubenswrapper[4730]: E0320 16:00:54.253897    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3644a90df0b76ee2298de84878e1467667a7cf0df46bec7faf290aad4ad0f2a7\": container with ID starting with 3644a90df0b76ee2298de84878e1467667a7cf0df46bec7faf290aad4ad0f2a7 not found: ID does not exist" containerID="3644a90df0b76ee2298de84878e1467667a7cf0df46bec7faf290aad4ad0f2a7"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.253932    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3644a90df0b76ee2298de84878e1467667a7cf0df46bec7faf290aad4ad0f2a7"} err="failed to get container status \"3644a90df0b76ee2298de84878e1467667a7cf0df46bec7faf290aad4ad0f2a7\": rpc error: code = NotFound desc = could not find container \"3644a90df0b76ee2298de84878e1467667a7cf0df46bec7faf290aad4ad0f2a7\": container with ID starting with 3644a90df0b76ee2298de84878e1467667a7cf0df46bec7faf290aad4ad0f2a7 not found: ID does not exist"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.253952    4730 scope.go:117] "RemoveContainer" containerID="65229dd1742ef08926ef8e9608941aa087b79288d7c286c646e5a92bbdde1bc5"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.255355    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Mar 20 16:00:54 crc kubenswrapper[4730]: E0320 16:00:54.255697    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65229dd1742ef08926ef8e9608941aa087b79288d7c286c646e5a92bbdde1bc5\": container with ID starting with 65229dd1742ef08926ef8e9608941aa087b79288d7c286c646e5a92bbdde1bc5 not found: ID does not exist" containerID="65229dd1742ef08926ef8e9608941aa087b79288d7c286c646e5a92bbdde1bc5"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.255735    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65229dd1742ef08926ef8e9608941aa087b79288d7c286c646e5a92bbdde1bc5"} err="failed to get container status \"65229dd1742ef08926ef8e9608941aa087b79288d7c286c646e5a92bbdde1bc5\": rpc error: code = NotFound desc = could not find container \"65229dd1742ef08926ef8e9608941aa087b79288d7c286c646e5a92bbdde1bc5\": container with ID starting with 65229dd1742ef08926ef8e9608941aa087b79288d7c286c646e5a92bbdde1bc5 not found: ID does not exist"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.255751    4730 scope.go:117] "RemoveContainer" containerID="3644a90df0b76ee2298de84878e1467667a7cf0df46bec7faf290aad4ad0f2a7"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.259613    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3644a90df0b76ee2298de84878e1467667a7cf0df46bec7faf290aad4ad0f2a7"} err="failed to get container status \"3644a90df0b76ee2298de84878e1467667a7cf0df46bec7faf290aad4ad0f2a7\": rpc error: code = NotFound desc = could not find container \"3644a90df0b76ee2298de84878e1467667a7cf0df46bec7faf290aad4ad0f2a7\": container with ID starting with 3644a90df0b76ee2298de84878e1467667a7cf0df46bec7faf290aad4ad0f2a7 not found: ID does not exist"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.259636    4730 scope.go:117] "RemoveContainer" containerID="65229dd1742ef08926ef8e9608941aa087b79288d7c286c646e5a92bbdde1bc5"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.262363    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65229dd1742ef08926ef8e9608941aa087b79288d7c286c646e5a92bbdde1bc5"} err="failed to get container status \"65229dd1742ef08926ef8e9608941aa087b79288d7c286c646e5a92bbdde1bc5\": rpc error: code = NotFound desc = could not find container \"65229dd1742ef08926ef8e9608941aa087b79288d7c286c646e5a92bbdde1bc5\": container with ID starting with 65229dd1742ef08926ef8e9608941aa087b79288d7c286c646e5a92bbdde1bc5 not found: ID does not exist"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.267348    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Mar 20 16:00:54 crc kubenswrapper[4730]: E0320 16:00:54.267796    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="553d73e1-f14e-4379-b630-38e440eedb73" containerName="cinder-api-log"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.267820    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="553d73e1-f14e-4379-b630-38e440eedb73" containerName="cinder-api-log"
Mar 20 16:00:54 crc kubenswrapper[4730]: E0320 16:00:54.267851    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="553d73e1-f14e-4379-b630-38e440eedb73" containerName="cinder-api"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.267858    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="553d73e1-f14e-4379-b630-38e440eedb73" containerName="cinder-api"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.268042    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="553d73e1-f14e-4379-b630-38e440eedb73" containerName="cinder-api-log"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.268058    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="553d73e1-f14e-4379-b630-38e440eedb73" containerName="cinder-api"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.269099    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.273959    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.274181    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.274517    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.275972    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.377683    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa\") " pod="openstack/cinder-api-0"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.377745    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa-scripts\") pod \"cinder-api-0\" (UID: \"4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa\") " pod="openstack/cinder-api-0"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.377780    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa-logs\") pod \"cinder-api-0\" (UID: \"4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa\") " pod="openstack/cinder-api-0"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.377865    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa\") " pod="openstack/cinder-api-0"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.377931    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbzzr\" (UniqueName: \"kubernetes.io/projected/4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa-kube-api-access-bbzzr\") pod \"cinder-api-0\" (UID: \"4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa\") " pod="openstack/cinder-api-0"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.378028    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa-config-data-custom\") pod \"cinder-api-0\" (UID: \"4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa\") " pod="openstack/cinder-api-0"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.378069    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa\") " pod="openstack/cinder-api-0"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.378092    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa-config-data\") pod \"cinder-api-0\" (UID: \"4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa\") " pod="openstack/cinder-api-0"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.378147    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa\") " pod="openstack/cinder-api-0"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.494918    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbzzr\" (UniqueName: \"kubernetes.io/projected/4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa-kube-api-access-bbzzr\") pod \"cinder-api-0\" (UID: \"4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa\") " pod="openstack/cinder-api-0"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.495031    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa-config-data-custom\") pod \"cinder-api-0\" (UID: \"4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa\") " pod="openstack/cinder-api-0"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.495072    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa\") " pod="openstack/cinder-api-0"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.495102    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa-config-data\") pod \"cinder-api-0\" (UID: \"4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa\") " pod="openstack/cinder-api-0"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.495141    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa\") " pod="openstack/cinder-api-0"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.495240    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa-scripts\") pod \"cinder-api-0\" (UID: \"4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa\") " pod="openstack/cinder-api-0"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.495286    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa\") " pod="openstack/cinder-api-0"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.495322    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa-logs\") pod \"cinder-api-0\" (UID: \"4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa\") " pod="openstack/cinder-api-0"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.495391    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa\") " pod="openstack/cinder-api-0"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.502359    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa\") " pod="openstack/cinder-api-0"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.502652    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa-scripts\") pod \"cinder-api-0\" (UID: \"4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa\") " pod="openstack/cinder-api-0"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.502871    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa-logs\") pod \"cinder-api-0\" (UID: \"4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa\") " pod="openstack/cinder-api-0"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.509532    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa\") " pod="openstack/cinder-api-0"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.512144    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa-config-data\") pod \"cinder-api-0\" (UID: \"4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa\") " pod="openstack/cinder-api-0"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.512706    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa\") " pod="openstack/cinder-api-0"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.513217    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa\") " pod="openstack/cinder-api-0"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.514912    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa-config-data-custom\") pod \"cinder-api-0\" (UID: \"4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa\") " pod="openstack/cinder-api-0"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.531860    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbzzr\" (UniqueName: \"kubernetes.io/projected/4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa-kube-api-access-bbzzr\") pod \"cinder-api-0\" (UID: \"4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa\") " pod="openstack/cinder-api-0"
Mar 20 16:00:54 crc kubenswrapper[4730]: I0320 16:00:54.589167    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 20 16:00:55 crc kubenswrapper[4730]: I0320 16:00:55.100556    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe109bf0-70d2-41d2-855c-6eb862e568b6","Type":"ContainerStarted","Data":"786b95c139b35cbde05b7814738f76287b10b59ef0dc76fe0f5bcee037ab03c4"}
Mar 20 16:00:55 crc kubenswrapper[4730]: I0320 16:00:55.294338    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 20 16:00:55 crc kubenswrapper[4730]: I0320 16:00:55.551796    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="553d73e1-f14e-4379-b630-38e440eedb73" path="/var/lib/kubelet/pods/553d73e1-f14e-4379-b630-38e440eedb73/volumes"
Mar 20 16:00:55 crc kubenswrapper[4730]: I0320 16:00:55.709392    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Mar 20 16:00:55 crc kubenswrapper[4730]: I0320 16:00:55.968802    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Mar 20 16:00:56 crc kubenswrapper[4730]: I0320 16:00:56.078283    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-66f7c676c8-wdfnw"
Mar 20 16:00:56 crc kubenswrapper[4730]: I0320 16:00:56.094380    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76cb94d47c-txmh6"
Mar 20 16:00:56 crc kubenswrapper[4730]: I0320 16:00:56.192229    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-66f7c676c8-wdfnw"
Mar 20 16:00:56 crc kubenswrapper[4730]: I0320 16:00:56.200170    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77878fc4cf-8s7hs"]
Mar 20 16:00:56 crc kubenswrapper[4730]: I0320 16:00:56.200483    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77878fc4cf-8s7hs" podUID="82ffcdbb-cebb-443a-a8af-3c3543bea13d" containerName="dnsmasq-dns" containerID="cri-o://a42cd9a54cab5432c7c6a61bb8a81e8c16b97e8e15d954234ce03c8ac58b65f1" gracePeriod=10
Mar 20 16:00:56 crc kubenswrapper[4730]: I0320 16:00:56.230819    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa","Type":"ContainerStarted","Data":"1edd5d33a486fdcefab106ad673dae51ea84827be13cc96e58ca8a06e7819a03"}
Mar 20 16:00:56 crc kubenswrapper[4730]: I0320 16:00:56.230862    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa","Type":"ContainerStarted","Data":"625fc2c6bb4bee7ecb35eb34bb8b6fbc56f99ceec30755b63f3bc3ef203e90f6"}
Mar 20 16:00:56 crc kubenswrapper[4730]: I0320 16:00:56.354700    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 20 16:00:56 crc kubenswrapper[4730]: I0320 16:00:56.809295    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77878fc4cf-8s7hs"
Mar 20 16:00:56 crc kubenswrapper[4730]: I0320 16:00:56.917090    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0"
Mar 20 16:00:56 crc kubenswrapper[4730]: I0320 16:00:56.928836    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0"
Mar 20 16:00:56 crc kubenswrapper[4730]: I0320 16:00:56.972972    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82ffcdbb-cebb-443a-a8af-3c3543bea13d-ovsdbserver-nb\") pod \"82ffcdbb-cebb-443a-a8af-3c3543bea13d\" (UID: \"82ffcdbb-cebb-443a-a8af-3c3543bea13d\") "
Mar 20 16:00:56 crc kubenswrapper[4730]: I0320 16:00:56.973052    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82ffcdbb-cebb-443a-a8af-3c3543bea13d-ovsdbserver-sb\") pod \"82ffcdbb-cebb-443a-a8af-3c3543bea13d\" (UID: \"82ffcdbb-cebb-443a-a8af-3c3543bea13d\") "
Mar 20 16:00:56 crc kubenswrapper[4730]: I0320 16:00:56.973078    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82ffcdbb-cebb-443a-a8af-3c3543bea13d-dns-svc\") pod \"82ffcdbb-cebb-443a-a8af-3c3543bea13d\" (UID: \"82ffcdbb-cebb-443a-a8af-3c3543bea13d\") "
Mar 20 16:00:56 crc kubenswrapper[4730]: I0320 16:00:56.973196    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82ffcdbb-cebb-443a-a8af-3c3543bea13d-config\") pod \"82ffcdbb-cebb-443a-a8af-3c3543bea13d\" (UID: \"82ffcdbb-cebb-443a-a8af-3c3543bea13d\") "
Mar 20 16:00:56 crc kubenswrapper[4730]: I0320 16:00:56.973290    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82ffcdbb-cebb-443a-a8af-3c3543bea13d-dns-swift-storage-0\") pod \"82ffcdbb-cebb-443a-a8af-3c3543bea13d\" (UID: \"82ffcdbb-cebb-443a-a8af-3c3543bea13d\") "
Mar 20 16:00:56 crc kubenswrapper[4730]: I0320 16:00:56.973307    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5dq8\" (UniqueName: \"kubernetes.io/projected/82ffcdbb-cebb-443a-a8af-3c3543bea13d-kube-api-access-m5dq8\") pod \"82ffcdbb-cebb-443a-a8af-3c3543bea13d\" (UID: \"82ffcdbb-cebb-443a-a8af-3c3543bea13d\") "
Mar 20 16:00:56 crc kubenswrapper[4730]: I0320 16:00:56.981434    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82ffcdbb-cebb-443a-a8af-3c3543bea13d-kube-api-access-m5dq8" (OuterVolumeSpecName: "kube-api-access-m5dq8") pod "82ffcdbb-cebb-443a-a8af-3c3543bea13d" (UID: "82ffcdbb-cebb-443a-a8af-3c3543bea13d"). InnerVolumeSpecName "kube-api-access-m5dq8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:00:57 crc kubenswrapper[4730]: I0320 16:00:57.046895    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82ffcdbb-cebb-443a-a8af-3c3543bea13d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "82ffcdbb-cebb-443a-a8af-3c3543bea13d" (UID: "82ffcdbb-cebb-443a-a8af-3c3543bea13d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:00:57 crc kubenswrapper[4730]: I0320 16:00:57.076113    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82ffcdbb-cebb-443a-a8af-3c3543bea13d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "82ffcdbb-cebb-443a-a8af-3c3543bea13d" (UID: "82ffcdbb-cebb-443a-a8af-3c3543bea13d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:00:57 crc kubenswrapper[4730]: I0320 16:00:57.076472    4730 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/82ffcdbb-cebb-443a-a8af-3c3543bea13d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:57 crc kubenswrapper[4730]: I0320 16:00:57.076497    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5dq8\" (UniqueName: \"kubernetes.io/projected/82ffcdbb-cebb-443a-a8af-3c3543bea13d-kube-api-access-m5dq8\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:57 crc kubenswrapper[4730]: I0320 16:00:57.076507    4730 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82ffcdbb-cebb-443a-a8af-3c3543bea13d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:57 crc kubenswrapper[4730]: I0320 16:00:57.080199    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82ffcdbb-cebb-443a-a8af-3c3543bea13d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "82ffcdbb-cebb-443a-a8af-3c3543bea13d" (UID: "82ffcdbb-cebb-443a-a8af-3c3543bea13d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:00:57 crc kubenswrapper[4730]: I0320 16:00:57.082365    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82ffcdbb-cebb-443a-a8af-3c3543bea13d-config" (OuterVolumeSpecName: "config") pod "82ffcdbb-cebb-443a-a8af-3c3543bea13d" (UID: "82ffcdbb-cebb-443a-a8af-3c3543bea13d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:00:57 crc kubenswrapper[4730]: I0320 16:00:57.105297    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82ffcdbb-cebb-443a-a8af-3c3543bea13d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "82ffcdbb-cebb-443a-a8af-3c3543bea13d" (UID: "82ffcdbb-cebb-443a-a8af-3c3543bea13d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:00:57 crc kubenswrapper[4730]: I0320 16:00:57.178110    4730 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82ffcdbb-cebb-443a-a8af-3c3543bea13d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:57 crc kubenswrapper[4730]: I0320 16:00:57.178150    4730 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82ffcdbb-cebb-443a-a8af-3c3543bea13d-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:57 crc kubenswrapper[4730]: I0320 16:00:57.178161    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82ffcdbb-cebb-443a-a8af-3c3543bea13d-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:57 crc kubenswrapper[4730]: I0320 16:00:57.258708    4730 generic.go:334] "Generic (PLEG): container finished" podID="82ffcdbb-cebb-443a-a8af-3c3543bea13d" containerID="a42cd9a54cab5432c7c6a61bb8a81e8c16b97e8e15d954234ce03c8ac58b65f1" exitCode=0
Mar 20 16:00:57 crc kubenswrapper[4730]: I0320 16:00:57.258749    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77878fc4cf-8s7hs" event={"ID":"82ffcdbb-cebb-443a-a8af-3c3543bea13d","Type":"ContainerDied","Data":"a42cd9a54cab5432c7c6a61bb8a81e8c16b97e8e15d954234ce03c8ac58b65f1"}
Mar 20 16:00:57 crc kubenswrapper[4730]: I0320 16:00:57.259042    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77878fc4cf-8s7hs" event={"ID":"82ffcdbb-cebb-443a-a8af-3c3543bea13d","Type":"ContainerDied","Data":"7a2cb82ca156020b21392ad78c85cb4ddbdb0874dad3d5fcc0112cae0cde0511"}
Mar 20 16:00:57 crc kubenswrapper[4730]: I0320 16:00:57.258814    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77878fc4cf-8s7hs"
Mar 20 16:00:57 crc kubenswrapper[4730]: I0320 16:00:57.259076    4730 scope.go:117] "RemoveContainer" containerID="a42cd9a54cab5432c7c6a61bb8a81e8c16b97e8e15d954234ce03c8ac58b65f1"
Mar 20 16:00:57 crc kubenswrapper[4730]: I0320 16:00:57.259364    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="211c7770-64fe-4943-becb-bc02113fd867" containerName="cinder-scheduler" containerID="cri-o://6d6a4235c5c9e8bc75e4b872bc2c20836032212d513ddc23145b97813f0b4e12" gracePeriod=30
Mar 20 16:00:57 crc kubenswrapper[4730]: I0320 16:00:57.259532    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="211c7770-64fe-4943-becb-bc02113fd867" containerName="probe" containerID="cri-o://4f88e22c1abf31db26b2de1e0622d6921372e0cee954fd4aea4d817c4e543d05" gracePeriod=30
Mar 20 16:00:57 crc kubenswrapper[4730]: I0320 16:00:57.266519    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Mar 20 16:00:57 crc kubenswrapper[4730]: I0320 16:00:57.298664    4730 scope.go:117] "RemoveContainer" containerID="8048a1935689b83c55f9f97ca86b535cb03200a9107add7dbb79c33ad2385a52"
Mar 20 16:00:57 crc kubenswrapper[4730]: I0320 16:00:57.525408    4730 scope.go:117] "RemoveContainer" containerID="a42cd9a54cab5432c7c6a61bb8a81e8c16b97e8e15d954234ce03c8ac58b65f1"
Mar 20 16:00:57 crc kubenswrapper[4730]: E0320 16:00:57.525933    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a42cd9a54cab5432c7c6a61bb8a81e8c16b97e8e15d954234ce03c8ac58b65f1\": container with ID starting with a42cd9a54cab5432c7c6a61bb8a81e8c16b97e8e15d954234ce03c8ac58b65f1 not found: ID does not exist" containerID="a42cd9a54cab5432c7c6a61bb8a81e8c16b97e8e15d954234ce03c8ac58b65f1"
Mar 20 16:00:57 crc kubenswrapper[4730]: I0320 16:00:57.525985    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a42cd9a54cab5432c7c6a61bb8a81e8c16b97e8e15d954234ce03c8ac58b65f1"} err="failed to get container status \"a42cd9a54cab5432c7c6a61bb8a81e8c16b97e8e15d954234ce03c8ac58b65f1\": rpc error: code = NotFound desc = could not find container \"a42cd9a54cab5432c7c6a61bb8a81e8c16b97e8e15d954234ce03c8ac58b65f1\": container with ID starting with a42cd9a54cab5432c7c6a61bb8a81e8c16b97e8e15d954234ce03c8ac58b65f1 not found: ID does not exist"
Mar 20 16:00:57 crc kubenswrapper[4730]: I0320 16:00:57.526017    4730 scope.go:117] "RemoveContainer" containerID="8048a1935689b83c55f9f97ca86b535cb03200a9107add7dbb79c33ad2385a52"
Mar 20 16:00:57 crc kubenswrapper[4730]: E0320 16:00:57.528232    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8048a1935689b83c55f9f97ca86b535cb03200a9107add7dbb79c33ad2385a52\": container with ID starting with 8048a1935689b83c55f9f97ca86b535cb03200a9107add7dbb79c33ad2385a52 not found: ID does not exist" containerID="8048a1935689b83c55f9f97ca86b535cb03200a9107add7dbb79c33ad2385a52"
Mar 20 16:00:57 crc kubenswrapper[4730]: I0320 16:00:57.528283    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8048a1935689b83c55f9f97ca86b535cb03200a9107add7dbb79c33ad2385a52"} err="failed to get container status \"8048a1935689b83c55f9f97ca86b535cb03200a9107add7dbb79c33ad2385a52\": rpc error: code = NotFound desc = could not find container \"8048a1935689b83c55f9f97ca86b535cb03200a9107add7dbb79c33ad2385a52\": container with ID starting with 8048a1935689b83c55f9f97ca86b535cb03200a9107add7dbb79c33ad2385a52 not found: ID does not exist"
Mar 20 16:00:57 crc kubenswrapper[4730]: I0320 16:00:57.563141    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77878fc4cf-8s7hs"]
Mar 20 16:00:57 crc kubenswrapper[4730]: I0320 16:00:57.585190    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77878fc4cf-8s7hs"]
Mar 20 16:00:58 crc kubenswrapper[4730]: I0320 16:00:58.100762    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-86947bcbc8-94hl8"
Mar 20 16:00:58 crc kubenswrapper[4730]: I0320 16:00:58.268240    4730 generic.go:334] "Generic (PLEG): container finished" podID="211c7770-64fe-4943-becb-bc02113fd867" containerID="4f88e22c1abf31db26b2de1e0622d6921372e0cee954fd4aea4d817c4e543d05" exitCode=0
Mar 20 16:00:58 crc kubenswrapper[4730]: I0320 16:00:58.268293    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"211c7770-64fe-4943-becb-bc02113fd867","Type":"ContainerDied","Data":"4f88e22c1abf31db26b2de1e0622d6921372e0cee954fd4aea4d817c4e543d05"}
Mar 20 16:00:58 crc kubenswrapper[4730]: I0320 16:00:58.271793    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe109bf0-70d2-41d2-855c-6eb862e568b6","Type":"ContainerStarted","Data":"73bfac78764299ab2b8b9302430bca90ecedaa10df4b93eb8c976fe0582a6af1"}
Mar 20 16:00:58 crc kubenswrapper[4730]: I0320 16:00:58.272983    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 20 16:00:58 crc kubenswrapper[4730]: I0320 16:00:58.274842    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa","Type":"ContainerStarted","Data":"dddf7ad95d1d0c0afeeb8c41ccf7fe37f8c38c3a24f4285f2ff9b97dfed77c66"}
Mar 20 16:00:58 crc kubenswrapper[4730]: I0320 16:00:58.275316    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Mar 20 16:00:58 crc kubenswrapper[4730]: I0320 16:00:58.298565    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.936650938 podStartE2EDuration="7.298531405s" podCreationTimestamp="2026-03-20 16:00:51 +0000 UTC" firstStartedPulling="2026-03-20 16:00:52.909621661 +0000 UTC m=+1312.122993030" lastFinishedPulling="2026-03-20 16:00:57.271502138 +0000 UTC m=+1316.484873497" observedRunningTime="2026-03-20 16:00:58.291296353 +0000 UTC m=+1317.504667712" watchObservedRunningTime="2026-03-20 16:00:58.298531405 +0000 UTC m=+1317.511902774"
Mar 20 16:00:58 crc kubenswrapper[4730]: I0320 16:00:58.318745    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.318729167 podStartE2EDuration="4.318729167s" podCreationTimestamp="2026-03-20 16:00:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:00:58.310202876 +0000 UTC m=+1317.523574245" watchObservedRunningTime="2026-03-20 16:00:58.318729167 +0000 UTC m=+1317.532100536"
Mar 20 16:00:58 crc kubenswrapper[4730]: I0320 16:00:58.358808    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-86947bcbc8-94hl8"
Mar 20 16:00:58 crc kubenswrapper[4730]: I0320 16:00:58.489475    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-66f7c676c8-wdfnw"]
Mar 20 16:00:58 crc kubenswrapper[4730]: I0320 16:00:58.489965    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-66f7c676c8-wdfnw" podUID="625a25c3-e585-4848-bbe1-0bdd4be731a9" containerName="barbican-api-log" containerID="cri-o://4b1a913b558b33eb77f5f4bce1a49632877936d63ae7264946c7f559a4317add" gracePeriod=30
Mar 20 16:00:58 crc kubenswrapper[4730]: I0320 16:00:58.490216    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-66f7c676c8-wdfnw" podUID="625a25c3-e585-4848-bbe1-0bdd4be731a9" containerName="barbican-api" containerID="cri-o://36d68edd252a5f941948a382970d79860f0b64a17a4a5f197bbb8ab68f618711" gracePeriod=30
Mar 20 16:00:58 crc kubenswrapper[4730]: I0320 16:00:58.496484    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-66f7c676c8-wdfnw" podUID="625a25c3-e585-4848-bbe1-0bdd4be731a9" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.176:9311/healthcheck\": EOF"
Mar 20 16:00:58 crc kubenswrapper[4730]: I0320 16:00:58.496633    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-66f7c676c8-wdfnw" podUID="625a25c3-e585-4848-bbe1-0bdd4be731a9" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.176:9311/healthcheck\": EOF"
Mar 20 16:00:58 crc kubenswrapper[4730]: I0320 16:00:58.496726    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-66f7c676c8-wdfnw" podUID="625a25c3-e585-4848-bbe1-0bdd4be731a9" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.176:9311/healthcheck\": EOF"
Mar 20 16:00:58 crc kubenswrapper[4730]: I0320 16:00:58.496819    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-66f7c676c8-wdfnw" podUID="625a25c3-e585-4848-bbe1-0bdd4be731a9" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.176:9311/healthcheck\": EOF"
Mar 20 16:00:59 crc kubenswrapper[4730]: I0320 16:00:59.313614    4730 generic.go:334] "Generic (PLEG): container finished" podID="211c7770-64fe-4943-becb-bc02113fd867" containerID="6d6a4235c5c9e8bc75e4b872bc2c20836032212d513ddc23145b97813f0b4e12" exitCode=0
Mar 20 16:00:59 crc kubenswrapper[4730]: I0320 16:00:59.313900    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"211c7770-64fe-4943-becb-bc02113fd867","Type":"ContainerDied","Data":"6d6a4235c5c9e8bc75e4b872bc2c20836032212d513ddc23145b97813f0b4e12"}
Mar 20 16:00:59 crc kubenswrapper[4730]: I0320 16:00:59.318914    4730 generic.go:334] "Generic (PLEG): container finished" podID="625a25c3-e585-4848-bbe1-0bdd4be731a9" containerID="4b1a913b558b33eb77f5f4bce1a49632877936d63ae7264946c7f559a4317add" exitCode=143
Mar 20 16:00:59 crc kubenswrapper[4730]: I0320 16:00:59.319888    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-66f7c676c8-wdfnw" event={"ID":"625a25c3-e585-4848-bbe1-0bdd4be731a9","Type":"ContainerDied","Data":"4b1a913b558b33eb77f5f4bce1a49632877936d63ae7264946c7f559a4317add"}
Mar 20 16:00:59 crc kubenswrapper[4730]: I0320 16:00:59.554680    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82ffcdbb-cebb-443a-a8af-3c3543bea13d" path="/var/lib/kubelet/pods/82ffcdbb-cebb-443a-a8af-3c3543bea13d/volumes"
Mar 20 16:00:59 crc kubenswrapper[4730]: I0320 16:00:59.577911    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 20 16:00:59 crc kubenswrapper[4730]: I0320 16:00:59.745234    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srhzx\" (UniqueName: \"kubernetes.io/projected/211c7770-64fe-4943-becb-bc02113fd867-kube-api-access-srhzx\") pod \"211c7770-64fe-4943-becb-bc02113fd867\" (UID: \"211c7770-64fe-4943-becb-bc02113fd867\") "
Mar 20 16:00:59 crc kubenswrapper[4730]: I0320 16:00:59.745636    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/211c7770-64fe-4943-becb-bc02113fd867-etc-machine-id\") pod \"211c7770-64fe-4943-becb-bc02113fd867\" (UID: \"211c7770-64fe-4943-becb-bc02113fd867\") "
Mar 20 16:00:59 crc kubenswrapper[4730]: I0320 16:00:59.745715    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/211c7770-64fe-4943-becb-bc02113fd867-config-data-custom\") pod \"211c7770-64fe-4943-becb-bc02113fd867\" (UID: \"211c7770-64fe-4943-becb-bc02113fd867\") "
Mar 20 16:00:59 crc kubenswrapper[4730]: I0320 16:00:59.745788    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/211c7770-64fe-4943-becb-bc02113fd867-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "211c7770-64fe-4943-becb-bc02113fd867" (UID: "211c7770-64fe-4943-becb-bc02113fd867"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 16:00:59 crc kubenswrapper[4730]: I0320 16:00:59.745872    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/211c7770-64fe-4943-becb-bc02113fd867-config-data\") pod \"211c7770-64fe-4943-becb-bc02113fd867\" (UID: \"211c7770-64fe-4943-becb-bc02113fd867\") "
Mar 20 16:00:59 crc kubenswrapper[4730]: I0320 16:00:59.745947    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/211c7770-64fe-4943-becb-bc02113fd867-combined-ca-bundle\") pod \"211c7770-64fe-4943-becb-bc02113fd867\" (UID: \"211c7770-64fe-4943-becb-bc02113fd867\") "
Mar 20 16:00:59 crc kubenswrapper[4730]: I0320 16:00:59.745968    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/211c7770-64fe-4943-becb-bc02113fd867-scripts\") pod \"211c7770-64fe-4943-becb-bc02113fd867\" (UID: \"211c7770-64fe-4943-becb-bc02113fd867\") "
Mar 20 16:00:59 crc kubenswrapper[4730]: I0320 16:00:59.746340    4730 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/211c7770-64fe-4943-becb-bc02113fd867-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:59 crc kubenswrapper[4730]: I0320 16:00:59.768521    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/211c7770-64fe-4943-becb-bc02113fd867-kube-api-access-srhzx" (OuterVolumeSpecName: "kube-api-access-srhzx") pod "211c7770-64fe-4943-becb-bc02113fd867" (UID: "211c7770-64fe-4943-becb-bc02113fd867"). InnerVolumeSpecName "kube-api-access-srhzx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:00:59 crc kubenswrapper[4730]: I0320 16:00:59.769448    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/211c7770-64fe-4943-becb-bc02113fd867-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "211c7770-64fe-4943-becb-bc02113fd867" (UID: "211c7770-64fe-4943-becb-bc02113fd867"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:59 crc kubenswrapper[4730]: I0320 16:00:59.774422    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/211c7770-64fe-4943-becb-bc02113fd867-scripts" (OuterVolumeSpecName: "scripts") pod "211c7770-64fe-4943-becb-bc02113fd867" (UID: "211c7770-64fe-4943-becb-bc02113fd867"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:59 crc kubenswrapper[4730]: I0320 16:00:59.805390    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/211c7770-64fe-4943-becb-bc02113fd867-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "211c7770-64fe-4943-becb-bc02113fd867" (UID: "211c7770-64fe-4943-becb-bc02113fd867"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:59 crc kubenswrapper[4730]: I0320 16:00:59.847721    4730 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/211c7770-64fe-4943-becb-bc02113fd867-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:59 crc kubenswrapper[4730]: I0320 16:00:59.847751    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/211c7770-64fe-4943-becb-bc02113fd867-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:59 crc kubenswrapper[4730]: I0320 16:00:59.847763    4730 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/211c7770-64fe-4943-becb-bc02113fd867-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:59 crc kubenswrapper[4730]: I0320 16:00:59.847771    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srhzx\" (UniqueName: \"kubernetes.io/projected/211c7770-64fe-4943-becb-bc02113fd867-kube-api-access-srhzx\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:59 crc kubenswrapper[4730]: I0320 16:00:59.876110    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/211c7770-64fe-4943-becb-bc02113fd867-config-data" (OuterVolumeSpecName: "config-data") pod "211c7770-64fe-4943-becb-bc02113fd867" (UID: "211c7770-64fe-4943-becb-bc02113fd867"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:00:59 crc kubenswrapper[4730]: I0320 16:00:59.949726    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/211c7770-64fe-4943-becb-bc02113fd867-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:00:59 crc kubenswrapper[4730]: I0320 16:00:59.974071    4730 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Mar 20 16:00:59 crc kubenswrapper[4730]: I0320 16:00:59.974110    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Mar 20 16:00:59 crc kubenswrapper[4730]: I0320 16:00:59.974121    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0"
Mar 20 16:00:59 crc kubenswrapper[4730]: I0320 16:00:59.974836    4730 scope.go:117] "RemoveContainer" containerID="22c1fa447a9712d22a3477c2b5b4f81ffbfd58601afde3e8b272d15c3b1ac1ce"
Mar 20 16:00:59 crc kubenswrapper[4730]: E0320 16:00:59.975049    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(3f6c808e-d523-48bd-8ec2-28b625834317)\"" pod="openstack/watcher-decision-engine-0" podUID="3f6c808e-d523-48bd-8ec2-28b625834317"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.137948    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29567041-zmx9n"]
Mar 20 16:01:00 crc kubenswrapper[4730]: E0320 16:01:00.138354    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82ffcdbb-cebb-443a-a8af-3c3543bea13d" containerName="init"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.138371    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="82ffcdbb-cebb-443a-a8af-3c3543bea13d" containerName="init"
Mar 20 16:01:00 crc kubenswrapper[4730]: E0320 16:01:00.138395    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82ffcdbb-cebb-443a-a8af-3c3543bea13d" containerName="dnsmasq-dns"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.138403    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="82ffcdbb-cebb-443a-a8af-3c3543bea13d" containerName="dnsmasq-dns"
Mar 20 16:01:00 crc kubenswrapper[4730]: E0320 16:01:00.138421    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="211c7770-64fe-4943-becb-bc02113fd867" containerName="probe"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.138428    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="211c7770-64fe-4943-becb-bc02113fd867" containerName="probe"
Mar 20 16:01:00 crc kubenswrapper[4730]: E0320 16:01:00.138449    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="211c7770-64fe-4943-becb-bc02113fd867" containerName="cinder-scheduler"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.138454    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="211c7770-64fe-4943-becb-bc02113fd867" containerName="cinder-scheduler"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.138643    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="211c7770-64fe-4943-becb-bc02113fd867" containerName="cinder-scheduler"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.138655    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="82ffcdbb-cebb-443a-a8af-3c3543bea13d" containerName="dnsmasq-dns"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.138671    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="211c7770-64fe-4943-becb-bc02113fd867" containerName="probe"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.139336    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29567041-zmx9n"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.149701    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29567041-zmx9n"]
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.265803    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkrlg\" (UniqueName: \"kubernetes.io/projected/d3747d18-1b1e-4c43-ac1a-efeeb453b1ae-kube-api-access-rkrlg\") pod \"keystone-cron-29567041-zmx9n\" (UID: \"d3747d18-1b1e-4c43-ac1a-efeeb453b1ae\") " pod="openstack/keystone-cron-29567041-zmx9n"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.265866    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3747d18-1b1e-4c43-ac1a-efeeb453b1ae-config-data\") pod \"keystone-cron-29567041-zmx9n\" (UID: \"d3747d18-1b1e-4c43-ac1a-efeeb453b1ae\") " pod="openstack/keystone-cron-29567041-zmx9n"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.265891    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3747d18-1b1e-4c43-ac1a-efeeb453b1ae-combined-ca-bundle\") pod \"keystone-cron-29567041-zmx9n\" (UID: \"d3747d18-1b1e-4c43-ac1a-efeeb453b1ae\") " pod="openstack/keystone-cron-29567041-zmx9n"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.265980    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d3747d18-1b1e-4c43-ac1a-efeeb453b1ae-fernet-keys\") pod \"keystone-cron-29567041-zmx9n\" (UID: \"d3747d18-1b1e-4c43-ac1a-efeeb453b1ae\") " pod="openstack/keystone-cron-29567041-zmx9n"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.330095    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.330697    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"211c7770-64fe-4943-becb-bc02113fd867","Type":"ContainerDied","Data":"4e67bee5c04834a723f337ad60eefc215ef11d3a4fa1f8a1f937528124d8a7cd"}
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.330734    4730 scope.go:117] "RemoveContainer" containerID="4f88e22c1abf31db26b2de1e0622d6921372e0cee954fd4aea4d817c4e543d05"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.354231    4730 scope.go:117] "RemoveContainer" containerID="6d6a4235c5c9e8bc75e4b872bc2c20836032212d513ddc23145b97813f0b4e12"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.370740    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkrlg\" (UniqueName: \"kubernetes.io/projected/d3747d18-1b1e-4c43-ac1a-efeeb453b1ae-kube-api-access-rkrlg\") pod \"keystone-cron-29567041-zmx9n\" (UID: \"d3747d18-1b1e-4c43-ac1a-efeeb453b1ae\") " pod="openstack/keystone-cron-29567041-zmx9n"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.370814    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3747d18-1b1e-4c43-ac1a-efeeb453b1ae-config-data\") pod \"keystone-cron-29567041-zmx9n\" (UID: \"d3747d18-1b1e-4c43-ac1a-efeeb453b1ae\") " pod="openstack/keystone-cron-29567041-zmx9n"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.370845    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3747d18-1b1e-4c43-ac1a-efeeb453b1ae-combined-ca-bundle\") pod \"keystone-cron-29567041-zmx9n\" (UID: \"d3747d18-1b1e-4c43-ac1a-efeeb453b1ae\") " pod="openstack/keystone-cron-29567041-zmx9n"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.370943    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d3747d18-1b1e-4c43-ac1a-efeeb453b1ae-fernet-keys\") pod \"keystone-cron-29567041-zmx9n\" (UID: \"d3747d18-1b1e-4c43-ac1a-efeeb453b1ae\") " pod="openstack/keystone-cron-29567041-zmx9n"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.375029    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d3747d18-1b1e-4c43-ac1a-efeeb453b1ae-fernet-keys\") pod \"keystone-cron-29567041-zmx9n\" (UID: \"d3747d18-1b1e-4c43-ac1a-efeeb453b1ae\") " pod="openstack/keystone-cron-29567041-zmx9n"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.378144    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3747d18-1b1e-4c43-ac1a-efeeb453b1ae-combined-ca-bundle\") pod \"keystone-cron-29567041-zmx9n\" (UID: \"d3747d18-1b1e-4c43-ac1a-efeeb453b1ae\") " pod="openstack/keystone-cron-29567041-zmx9n"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.388331    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3747d18-1b1e-4c43-ac1a-efeeb453b1ae-config-data\") pod \"keystone-cron-29567041-zmx9n\" (UID: \"d3747d18-1b1e-4c43-ac1a-efeeb453b1ae\") " pod="openstack/keystone-cron-29567041-zmx9n"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.388558    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.401007    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkrlg\" (UniqueName: \"kubernetes.io/projected/d3747d18-1b1e-4c43-ac1a-efeeb453b1ae-kube-api-access-rkrlg\") pod \"keystone-cron-29567041-zmx9n\" (UID: \"d3747d18-1b1e-4c43-ac1a-efeeb453b1ae\") " pod="openstack/keystone-cron-29567041-zmx9n"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.405936    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.416317    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.418088    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.420687    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.426597    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.493164    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29567041-zmx9n"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.584290    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ff07e31-53ad-49da-941d-607115f965e0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8ff07e31-53ad-49da-941d-607115f965e0\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.584449    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ff07e31-53ad-49da-941d-607115f965e0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8ff07e31-53ad-49da-941d-607115f965e0\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.584489    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ff07e31-53ad-49da-941d-607115f965e0-config-data\") pod \"cinder-scheduler-0\" (UID: \"8ff07e31-53ad-49da-941d-607115f965e0\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.584510    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5rql\" (UniqueName: \"kubernetes.io/projected/8ff07e31-53ad-49da-941d-607115f965e0-kube-api-access-v5rql\") pod \"cinder-scheduler-0\" (UID: \"8ff07e31-53ad-49da-941d-607115f965e0\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.584590    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ff07e31-53ad-49da-941d-607115f965e0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8ff07e31-53ad-49da-941d-607115f965e0\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.584623    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ff07e31-53ad-49da-941d-607115f965e0-scripts\") pod \"cinder-scheduler-0\" (UID: \"8ff07e31-53ad-49da-941d-607115f965e0\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.601310    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"]
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.601609    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="aabd3bd6-2cee-47b8-9174-ad9ea1415e82" containerName="watcher-api-log" containerID="cri-o://b9554ee31e1f63a199256bec2bb1d2359bd31822a3145d6062ef5270512d5078" gracePeriod=30
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.601840    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="aabd3bd6-2cee-47b8-9174-ad9ea1415e82" containerName="watcher-api" containerID="cri-o://1b54d6706fe3b9e0df5e79e3bf8df496baa00381f4d47713ff0c38d0b131d0ef" gracePeriod=30
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.687583    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ff07e31-53ad-49da-941d-607115f965e0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8ff07e31-53ad-49da-941d-607115f965e0\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.687629    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ff07e31-53ad-49da-941d-607115f965e0-config-data\") pod \"cinder-scheduler-0\" (UID: \"8ff07e31-53ad-49da-941d-607115f965e0\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.687649    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5rql\" (UniqueName: \"kubernetes.io/projected/8ff07e31-53ad-49da-941d-607115f965e0-kube-api-access-v5rql\") pod \"cinder-scheduler-0\" (UID: \"8ff07e31-53ad-49da-941d-607115f965e0\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.687718    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ff07e31-53ad-49da-941d-607115f965e0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8ff07e31-53ad-49da-941d-607115f965e0\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.687739    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ff07e31-53ad-49da-941d-607115f965e0-scripts\") pod \"cinder-scheduler-0\" (UID: \"8ff07e31-53ad-49da-941d-607115f965e0\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.687802    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ff07e31-53ad-49da-941d-607115f965e0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8ff07e31-53ad-49da-941d-607115f965e0\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.688087    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ff07e31-53ad-49da-941d-607115f965e0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8ff07e31-53ad-49da-941d-607115f965e0\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.692808    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ff07e31-53ad-49da-941d-607115f965e0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8ff07e31-53ad-49da-941d-607115f965e0\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.694144    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ff07e31-53ad-49da-941d-607115f965e0-config-data\") pod \"cinder-scheduler-0\" (UID: \"8ff07e31-53ad-49da-941d-607115f965e0\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.694333    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ff07e31-53ad-49da-941d-607115f965e0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8ff07e31-53ad-49da-941d-607115f965e0\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.704756    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ff07e31-53ad-49da-941d-607115f965e0-scripts\") pod \"cinder-scheduler-0\" (UID: \"8ff07e31-53ad-49da-941d-607115f965e0\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.709177    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5rql\" (UniqueName: \"kubernetes.io/projected/8ff07e31-53ad-49da-941d-607115f965e0-kube-api-access-v5rql\") pod \"cinder-scheduler-0\" (UID: \"8ff07e31-53ad-49da-941d-607115f965e0\") " pod="openstack/cinder-scheduler-0"
Mar 20 16:01:00 crc kubenswrapper[4730]: I0320 16:01:00.805709    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 20 16:01:01 crc kubenswrapper[4730]: I0320 16:01:01.100921    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29567041-zmx9n"]
Mar 20 16:01:01 crc kubenswrapper[4730]: I0320 16:01:01.359633    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567041-zmx9n" event={"ID":"d3747d18-1b1e-4c43-ac1a-efeeb453b1ae","Type":"ContainerStarted","Data":"a19bcb5b8c66a4a747a916cf2427a3e88a6ef33424161955f95912089a76ca70"}
Mar 20 16:01:01 crc kubenswrapper[4730]: I0320 16:01:01.392512    4730 generic.go:334] "Generic (PLEG): container finished" podID="aabd3bd6-2cee-47b8-9174-ad9ea1415e82" containerID="b9554ee31e1f63a199256bec2bb1d2359bd31822a3145d6062ef5270512d5078" exitCode=143
Mar 20 16:01:01 crc kubenswrapper[4730]: I0320 16:01:01.392666    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"aabd3bd6-2cee-47b8-9174-ad9ea1415e82","Type":"ContainerDied","Data":"b9554ee31e1f63a199256bec2bb1d2359bd31822a3145d6062ef5270512d5078"}
Mar 20 16:01:01 crc kubenswrapper[4730]: I0320 16:01:01.415654    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 20 16:01:01 crc kubenswrapper[4730]: I0320 16:01:01.568101    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="211c7770-64fe-4943-becb-bc02113fd867" path="/var/lib/kubelet/pods/211c7770-64fe-4943-becb-bc02113fd867/volumes"
Mar 20 16:01:02 crc kubenswrapper[4730]: I0320 16:01:02.244725    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="aabd3bd6-2cee-47b8-9174-ad9ea1415e82" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.172:9322/\": read tcp 10.217.0.2:40200->10.217.0.172:9322: read: connection reset by peer"
Mar 20 16:01:02 crc kubenswrapper[4730]: I0320 16:01:02.244808    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="aabd3bd6-2cee-47b8-9174-ad9ea1415e82" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.172:9322/\": read tcp 10.217.0.2:40198->10.217.0.172:9322: read: connection reset by peer"
Mar 20 16:01:02 crc kubenswrapper[4730]: I0320 16:01:02.425753    4730 generic.go:334] "Generic (PLEG): container finished" podID="aabd3bd6-2cee-47b8-9174-ad9ea1415e82" containerID="1b54d6706fe3b9e0df5e79e3bf8df496baa00381f4d47713ff0c38d0b131d0ef" exitCode=0
Mar 20 16:01:02 crc kubenswrapper[4730]: I0320 16:01:02.426123    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"aabd3bd6-2cee-47b8-9174-ad9ea1415e82","Type":"ContainerDied","Data":"1b54d6706fe3b9e0df5e79e3bf8df496baa00381f4d47713ff0c38d0b131d0ef"}
Mar 20 16:01:02 crc kubenswrapper[4730]: I0320 16:01:02.435559    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8ff07e31-53ad-49da-941d-607115f965e0","Type":"ContainerStarted","Data":"54e760b727beebcf73820b5976bbc47da91de66eac2b9c3e4c9962bc51dcc205"}
Mar 20 16:01:02 crc kubenswrapper[4730]: I0320 16:01:02.435606    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8ff07e31-53ad-49da-941d-607115f965e0","Type":"ContainerStarted","Data":"eae88a4e3b6fca9302c6691c9973e47a8b0ce9904684def6fd84b8f298a43bec"}
Mar 20 16:01:02 crc kubenswrapper[4730]: I0320 16:01:02.446848    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567041-zmx9n" event={"ID":"d3747d18-1b1e-4c43-ac1a-efeeb453b1ae","Type":"ContainerStarted","Data":"f176bdad8c7c8b3ae0a8f361e9d7c67c47a36840f5e1630151c5c006f2c36ca0"}
Mar 20 16:01:02 crc kubenswrapper[4730]: I0320 16:01:02.469839    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29567041-zmx9n" podStartSLOduration=2.46982058 podStartE2EDuration="2.46982058s" podCreationTimestamp="2026-03-20 16:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:01:02.460976362 +0000 UTC m=+1321.674347741" watchObservedRunningTime="2026-03-20 16:01:02.46982058 +0000 UTC m=+1321.683191949"
Mar 20 16:01:02 crc kubenswrapper[4730]: I0320 16:01:02.617726    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-66f7c676c8-wdfnw" podUID="625a25c3-e585-4848-bbe1-0bdd4be731a9" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.176:9311/healthcheck\": read tcp 10.217.0.2:48578->10.217.0.176:9311: read: connection reset by peer"
Mar 20 16:01:02 crc kubenswrapper[4730]: I0320 16:01:02.767413    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Mar 20 16:01:02 crc kubenswrapper[4730]: I0320 16:01:02.957066    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aabd3bd6-2cee-47b8-9174-ad9ea1415e82-config-data\") pod \"aabd3bd6-2cee-47b8-9174-ad9ea1415e82\" (UID: \"aabd3bd6-2cee-47b8-9174-ad9ea1415e82\") "
Mar 20 16:01:02 crc kubenswrapper[4730]: I0320 16:01:02.957487    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aabd3bd6-2cee-47b8-9174-ad9ea1415e82-combined-ca-bundle\") pod \"aabd3bd6-2cee-47b8-9174-ad9ea1415e82\" (UID: \"aabd3bd6-2cee-47b8-9174-ad9ea1415e82\") "
Mar 20 16:01:02 crc kubenswrapper[4730]: I0320 16:01:02.957566    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aabd3bd6-2cee-47b8-9174-ad9ea1415e82-logs\") pod \"aabd3bd6-2cee-47b8-9174-ad9ea1415e82\" (UID: \"aabd3bd6-2cee-47b8-9174-ad9ea1415e82\") "
Mar 20 16:01:02 crc kubenswrapper[4730]: I0320 16:01:02.957586    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2lq6\" (UniqueName: \"kubernetes.io/projected/aabd3bd6-2cee-47b8-9174-ad9ea1415e82-kube-api-access-m2lq6\") pod \"aabd3bd6-2cee-47b8-9174-ad9ea1415e82\" (UID: \"aabd3bd6-2cee-47b8-9174-ad9ea1415e82\") "
Mar 20 16:01:02 crc kubenswrapper[4730]: I0320 16:01:02.957625    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aabd3bd6-2cee-47b8-9174-ad9ea1415e82-custom-prometheus-ca\") pod \"aabd3bd6-2cee-47b8-9174-ad9ea1415e82\" (UID: \"aabd3bd6-2cee-47b8-9174-ad9ea1415e82\") "
Mar 20 16:01:02 crc kubenswrapper[4730]: I0320 16:01:02.957872    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aabd3bd6-2cee-47b8-9174-ad9ea1415e82-logs" (OuterVolumeSpecName: "logs") pod "aabd3bd6-2cee-47b8-9174-ad9ea1415e82" (UID: "aabd3bd6-2cee-47b8-9174-ad9ea1415e82"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:01:02 crc kubenswrapper[4730]: I0320 16:01:02.958517    4730 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aabd3bd6-2cee-47b8-9174-ad9ea1415e82-logs\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:02 crc kubenswrapper[4730]: I0320 16:01:02.964790    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aabd3bd6-2cee-47b8-9174-ad9ea1415e82-kube-api-access-m2lq6" (OuterVolumeSpecName: "kube-api-access-m2lq6") pod "aabd3bd6-2cee-47b8-9174-ad9ea1415e82" (UID: "aabd3bd6-2cee-47b8-9174-ad9ea1415e82"). InnerVolumeSpecName "kube-api-access-m2lq6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.007906    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aabd3bd6-2cee-47b8-9174-ad9ea1415e82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aabd3bd6-2cee-47b8-9174-ad9ea1415e82" (UID: "aabd3bd6-2cee-47b8-9174-ad9ea1415e82"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.013767    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aabd3bd6-2cee-47b8-9174-ad9ea1415e82-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "aabd3bd6-2cee-47b8-9174-ad9ea1415e82" (UID: "aabd3bd6-2cee-47b8-9174-ad9ea1415e82"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.055759    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aabd3bd6-2cee-47b8-9174-ad9ea1415e82-config-data" (OuterVolumeSpecName: "config-data") pod "aabd3bd6-2cee-47b8-9174-ad9ea1415e82" (UID: "aabd3bd6-2cee-47b8-9174-ad9ea1415e82"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.060062    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aabd3bd6-2cee-47b8-9174-ad9ea1415e82-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.060090    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aabd3bd6-2cee-47b8-9174-ad9ea1415e82-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.060101    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2lq6\" (UniqueName: \"kubernetes.io/projected/aabd3bd6-2cee-47b8-9174-ad9ea1415e82-kube-api-access-m2lq6\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.060111    4730 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aabd3bd6-2cee-47b8-9174-ad9ea1415e82-custom-prometheus-ca\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.326066    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-66f7c676c8-wdfnw"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.458006    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.458033    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"aabd3bd6-2cee-47b8-9174-ad9ea1415e82","Type":"ContainerDied","Data":"6d15d2ce82d18fec66394661182780c55d2eb065e0325660570454876023436b"}
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.458095    4730 scope.go:117] "RemoveContainer" containerID="1b54d6706fe3b9e0df5e79e3bf8df496baa00381f4d47713ff0c38d0b131d0ef"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.467756    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8ff07e31-53ad-49da-941d-607115f965e0","Type":"ContainerStarted","Data":"26b8db254782a9a742d6cabf7c1b04ee4c13e5ad7e13bfc014ba09e458ab0f58"}
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.473912    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/625a25c3-e585-4848-bbe1-0bdd4be731a9-config-data-custom\") pod \"625a25c3-e585-4848-bbe1-0bdd4be731a9\" (UID: \"625a25c3-e585-4848-bbe1-0bdd4be731a9\") "
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.474207    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/625a25c3-e585-4848-bbe1-0bdd4be731a9-logs\") pod \"625a25c3-e585-4848-bbe1-0bdd4be731a9\" (UID: \"625a25c3-e585-4848-bbe1-0bdd4be731a9\") "
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.474339    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/625a25c3-e585-4848-bbe1-0bdd4be731a9-combined-ca-bundle\") pod \"625a25c3-e585-4848-bbe1-0bdd4be731a9\" (UID: \"625a25c3-e585-4848-bbe1-0bdd4be731a9\") "
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.474379    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/625a25c3-e585-4848-bbe1-0bdd4be731a9-config-data\") pod \"625a25c3-e585-4848-bbe1-0bdd4be731a9\" (UID: \"625a25c3-e585-4848-bbe1-0bdd4be731a9\") "
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.474648    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfcrj\" (UniqueName: \"kubernetes.io/projected/625a25c3-e585-4848-bbe1-0bdd4be731a9-kube-api-access-nfcrj\") pod \"625a25c3-e585-4848-bbe1-0bdd4be731a9\" (UID: \"625a25c3-e585-4848-bbe1-0bdd4be731a9\") "
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.474951    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/625a25c3-e585-4848-bbe1-0bdd4be731a9-logs" (OuterVolumeSpecName: "logs") pod "625a25c3-e585-4848-bbe1-0bdd4be731a9" (UID: "625a25c3-e585-4848-bbe1-0bdd4be731a9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.488586    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/625a25c3-e585-4848-bbe1-0bdd4be731a9-kube-api-access-nfcrj" (OuterVolumeSpecName: "kube-api-access-nfcrj") pod "625a25c3-e585-4848-bbe1-0bdd4be731a9" (UID: "625a25c3-e585-4848-bbe1-0bdd4be731a9"). InnerVolumeSpecName "kube-api-access-nfcrj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.488962    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/625a25c3-e585-4848-bbe1-0bdd4be731a9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "625a25c3-e585-4848-bbe1-0bdd4be731a9" (UID: "625a25c3-e585-4848-bbe1-0bdd4be731a9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.495135    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfcrj\" (UniqueName: \"kubernetes.io/projected/625a25c3-e585-4848-bbe1-0bdd4be731a9-kube-api-access-nfcrj\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.495172    4730 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/625a25c3-e585-4848-bbe1-0bdd4be731a9-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.495182    4730 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/625a25c3-e585-4848-bbe1-0bdd4be731a9-logs\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.506941    4730 scope.go:117] "RemoveContainer" containerID="b9554ee31e1f63a199256bec2bb1d2359bd31822a3145d6062ef5270512d5078"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.507038    4730 generic.go:334] "Generic (PLEG): container finished" podID="625a25c3-e585-4848-bbe1-0bdd4be731a9" containerID="36d68edd252a5f941948a382970d79860f0b64a17a4a5f197bbb8ab68f618711" exitCode=0
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.507191    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-66f7c676c8-wdfnw"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.507192    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-66f7c676c8-wdfnw" event={"ID":"625a25c3-e585-4848-bbe1-0bdd4be731a9","Type":"ContainerDied","Data":"36d68edd252a5f941948a382970d79860f0b64a17a4a5f197bbb8ab68f618711"}
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.507263    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-66f7c676c8-wdfnw" event={"ID":"625a25c3-e585-4848-bbe1-0bdd4be731a9","Type":"ContainerDied","Data":"4051b918d71242b8ca2e41190a654e8dbaba0ff078f4749ad9adbb01f4288ed1"}
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.507317    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.507303632 podStartE2EDuration="3.507303632s" podCreationTimestamp="2026-03-20 16:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:01:03.489119255 +0000 UTC m=+1322.702490634" watchObservedRunningTime="2026-03-20 16:01:03.507303632 +0000 UTC m=+1322.720675001"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.515688    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/625a25c3-e585-4848-bbe1-0bdd4be731a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "625a25c3-e585-4848-bbe1-0bdd4be731a9" (UID: "625a25c3-e585-4848-bbe1-0bdd4be731a9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.598100    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/625a25c3-e585-4848-bbe1-0bdd4be731a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.605168    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/625a25c3-e585-4848-bbe1-0bdd4be731a9-config-data" (OuterVolumeSpecName: "config-data") pod "625a25c3-e585-4848-bbe1-0bdd4be731a9" (UID: "625a25c3-e585-4848-bbe1-0bdd4be731a9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.611450    4730 scope.go:117] "RemoveContainer" containerID="36d68edd252a5f941948a382970d79860f0b64a17a4a5f197bbb8ab68f618711"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.615219    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"]
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.628352    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"]
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.635900    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"]
Mar 20 16:01:03 crc kubenswrapper[4730]: E0320 16:01:03.636293    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="625a25c3-e585-4848-bbe1-0bdd4be731a9" containerName="barbican-api-log"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.636311    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="625a25c3-e585-4848-bbe1-0bdd4be731a9" containerName="barbican-api-log"
Mar 20 16:01:03 crc kubenswrapper[4730]: E0320 16:01:03.636329    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aabd3bd6-2cee-47b8-9174-ad9ea1415e82" containerName="watcher-api"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.636335    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="aabd3bd6-2cee-47b8-9174-ad9ea1415e82" containerName="watcher-api"
Mar 20 16:01:03 crc kubenswrapper[4730]: E0320 16:01:03.636349    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aabd3bd6-2cee-47b8-9174-ad9ea1415e82" containerName="watcher-api-log"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.636355    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="aabd3bd6-2cee-47b8-9174-ad9ea1415e82" containerName="watcher-api-log"
Mar 20 16:01:03 crc kubenswrapper[4730]: E0320 16:01:03.636375    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="625a25c3-e585-4848-bbe1-0bdd4be731a9" containerName="barbican-api"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.636383    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="625a25c3-e585-4848-bbe1-0bdd4be731a9" containerName="barbican-api"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.636589    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="625a25c3-e585-4848-bbe1-0bdd4be731a9" containerName="barbican-api-log"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.636615    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="aabd3bd6-2cee-47b8-9174-ad9ea1415e82" containerName="watcher-api"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.636626    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="625a25c3-e585-4848-bbe1-0bdd4be731a9" containerName="barbican-api"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.636643    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="aabd3bd6-2cee-47b8-9174-ad9ea1415e82" containerName="watcher-api-log"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.637611    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.641102    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.641269    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.644377    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.659350    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.660604    4730 scope.go:117] "RemoveContainer" containerID="4b1a913b558b33eb77f5f4bce1a49632877936d63ae7264946c7f559a4317add"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.699761    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/625a25c3-e585-4848-bbe1-0bdd4be731a9-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.700767    4730 scope.go:117] "RemoveContainer" containerID="36d68edd252a5f941948a382970d79860f0b64a17a4a5f197bbb8ab68f618711"
Mar 20 16:01:03 crc kubenswrapper[4730]: E0320 16:01:03.704659    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36d68edd252a5f941948a382970d79860f0b64a17a4a5f197bbb8ab68f618711\": container with ID starting with 36d68edd252a5f941948a382970d79860f0b64a17a4a5f197bbb8ab68f618711 not found: ID does not exist" containerID="36d68edd252a5f941948a382970d79860f0b64a17a4a5f197bbb8ab68f618711"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.704710    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36d68edd252a5f941948a382970d79860f0b64a17a4a5f197bbb8ab68f618711"} err="failed to get container status \"36d68edd252a5f941948a382970d79860f0b64a17a4a5f197bbb8ab68f618711\": rpc error: code = NotFound desc = could not find container \"36d68edd252a5f941948a382970d79860f0b64a17a4a5f197bbb8ab68f618711\": container with ID starting with 36d68edd252a5f941948a382970d79860f0b64a17a4a5f197bbb8ab68f618711 not found: ID does not exist"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.704739    4730 scope.go:117] "RemoveContainer" containerID="4b1a913b558b33eb77f5f4bce1a49632877936d63ae7264946c7f559a4317add"
Mar 20 16:01:03 crc kubenswrapper[4730]: E0320 16:01:03.711451    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b1a913b558b33eb77f5f4bce1a49632877936d63ae7264946c7f559a4317add\": container with ID starting with 4b1a913b558b33eb77f5f4bce1a49632877936d63ae7264946c7f559a4317add not found: ID does not exist" containerID="4b1a913b558b33eb77f5f4bce1a49632877936d63ae7264946c7f559a4317add"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.711506    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b1a913b558b33eb77f5f4bce1a49632877936d63ae7264946c7f559a4317add"} err="failed to get container status \"4b1a913b558b33eb77f5f4bce1a49632877936d63ae7264946c7f559a4317add\": rpc error: code = NotFound desc = could not find container \"4b1a913b558b33eb77f5f4bce1a49632877936d63ae7264946c7f559a4317add\": container with ID starting with 4b1a913b558b33eb77f5f4bce1a49632877936d63ae7264946c7f559a4317add not found: ID does not exist"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.800761    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-config-data\") pod \"watcher-api-0\" (UID: \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\") " pod="openstack/watcher-api-0"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.800793    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-logs\") pod \"watcher-api-0\" (UID: \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\") " pod="openstack/watcher-api-0"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.800818    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-public-tls-certs\") pod \"watcher-api-0\" (UID: \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\") " pod="openstack/watcher-api-0"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.801597    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\") " pod="openstack/watcher-api-0"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.801654    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk5wd\" (UniqueName: \"kubernetes.io/projected/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-kube-api-access-fk5wd\") pod \"watcher-api-0\" (UID: \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\") " pod="openstack/watcher-api-0"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.801821    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\") " pod="openstack/watcher-api-0"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.801915    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\") " pod="openstack/watcher-api-0"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.854180    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-66f7c676c8-wdfnw"]
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.863554    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-66f7c676c8-wdfnw"]
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.903639    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-public-tls-certs\") pod \"watcher-api-0\" (UID: \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\") " pod="openstack/watcher-api-0"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.903711    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\") " pod="openstack/watcher-api-0"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.903746    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk5wd\" (UniqueName: \"kubernetes.io/projected/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-kube-api-access-fk5wd\") pod \"watcher-api-0\" (UID: \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\") " pod="openstack/watcher-api-0"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.903825    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\") " pod="openstack/watcher-api-0"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.903880    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\") " pod="openstack/watcher-api-0"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.903962    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-config-data\") pod \"watcher-api-0\" (UID: \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\") " pod="openstack/watcher-api-0"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.904202    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-logs\") pod \"watcher-api-0\" (UID: \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\") " pod="openstack/watcher-api-0"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.904421    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-logs\") pod \"watcher-api-0\" (UID: \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\") " pod="openstack/watcher-api-0"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.908801    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-config-data\") pod \"watcher-api-0\" (UID: \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\") " pod="openstack/watcher-api-0"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.908959    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\") " pod="openstack/watcher-api-0"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.912580    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-public-tls-certs\") pod \"watcher-api-0\" (UID: \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\") " pod="openstack/watcher-api-0"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.912652    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\") " pod="openstack/watcher-api-0"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.922920    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\") " pod="openstack/watcher-api-0"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.926155    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk5wd\" (UniqueName: \"kubernetes.io/projected/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-kube-api-access-fk5wd\") pod \"watcher-api-0\" (UID: \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\") " pod="openstack/watcher-api-0"
Mar 20 16:01:03 crc kubenswrapper[4730]: I0320 16:01:03.981717    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Mar 20 16:01:04 crc kubenswrapper[4730]: I0320 16:01:04.555754    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Mar 20 16:01:05 crc kubenswrapper[4730]: I0320 16:01:05.528517    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a","Type":"ContainerStarted","Data":"cdc802e9a5716d718ebb13c2b13971164edec56068203d16169206897ba03b6a"}
Mar 20 16:01:05 crc kubenswrapper[4730]: I0320 16:01:05.529126    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Mar 20 16:01:05 crc kubenswrapper[4730]: I0320 16:01:05.529143    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a","Type":"ContainerStarted","Data":"c743230ba78e3e74fccce83315455e43e5c471c2e21f3d4ac91af747ed0b8301"}
Mar 20 16:01:05 crc kubenswrapper[4730]: I0320 16:01:05.529165    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a","Type":"ContainerStarted","Data":"4c2c6a630c73d61868e537787c0ffc30dd54b3a6ece7e73f02c494b3afd3c924"}
Mar 20 16:01:05 crc kubenswrapper[4730]: I0320 16:01:05.559333    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=2.5593102009999997 podStartE2EDuration="2.559310201s" podCreationTimestamp="2026-03-20 16:01:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:01:05.558074234 +0000 UTC m=+1324.771445613" watchObservedRunningTime="2026-03-20 16:01:05.559310201 +0000 UTC m=+1324.772681570"
Mar 20 16:01:05 crc kubenswrapper[4730]: I0320 16:01:05.559826    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="625a25c3-e585-4848-bbe1-0bdd4be731a9" path="/var/lib/kubelet/pods/625a25c3-e585-4848-bbe1-0bdd4be731a9/volumes"
Mar 20 16:01:05 crc kubenswrapper[4730]: I0320 16:01:05.560613    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aabd3bd6-2cee-47b8-9174-ad9ea1415e82" path="/var/lib/kubelet/pods/aabd3bd6-2cee-47b8-9174-ad9ea1415e82/volumes"
Mar 20 16:01:05 crc kubenswrapper[4730]: I0320 16:01:05.806735    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Mar 20 16:01:05 crc kubenswrapper[4730]: I0320 16:01:05.986771    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-78b446cdb6-zs6nw"
Mar 20 16:01:06 crc kubenswrapper[4730]: I0320 16:01:06.097069    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-78b446cdb6-zs6nw"
Mar 20 16:01:06 crc kubenswrapper[4730]: I0320 16:01:06.373851    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6fb7949f77-2l9t7"
Mar 20 16:01:06 crc kubenswrapper[4730]: I0320 16:01:06.540289    4730 generic.go:334] "Generic (PLEG): container finished" podID="d3747d18-1b1e-4c43-ac1a-efeeb453b1ae" containerID="f176bdad8c7c8b3ae0a8f361e9d7c67c47a36840f5e1630151c5c006f2c36ca0" exitCode=0
Mar 20 16:01:06 crc kubenswrapper[4730]: I0320 16:01:06.540376    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567041-zmx9n" event={"ID":"d3747d18-1b1e-4c43-ac1a-efeeb453b1ae","Type":"ContainerDied","Data":"f176bdad8c7c8b3ae0a8f361e9d7c67c47a36840f5e1630151c5c006f2c36ca0"}
Mar 20 16:01:07 crc kubenswrapper[4730]: I0320 16:01:07.377659    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Mar 20 16:01:07 crc kubenswrapper[4730]: I0320 16:01:07.956034    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29567041-zmx9n"
Mar 20 16:01:08 crc kubenswrapper[4730]: I0320 16:01:08.088535    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3747d18-1b1e-4c43-ac1a-efeeb453b1ae-config-data\") pod \"d3747d18-1b1e-4c43-ac1a-efeeb453b1ae\" (UID: \"d3747d18-1b1e-4c43-ac1a-efeeb453b1ae\") "
Mar 20 16:01:08 crc kubenswrapper[4730]: I0320 16:01:08.088593    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3747d18-1b1e-4c43-ac1a-efeeb453b1ae-combined-ca-bundle\") pod \"d3747d18-1b1e-4c43-ac1a-efeeb453b1ae\" (UID: \"d3747d18-1b1e-4c43-ac1a-efeeb453b1ae\") "
Mar 20 16:01:08 crc kubenswrapper[4730]: I0320 16:01:08.088728    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkrlg\" (UniqueName: \"kubernetes.io/projected/d3747d18-1b1e-4c43-ac1a-efeeb453b1ae-kube-api-access-rkrlg\") pod \"d3747d18-1b1e-4c43-ac1a-efeeb453b1ae\" (UID: \"d3747d18-1b1e-4c43-ac1a-efeeb453b1ae\") "
Mar 20 16:01:08 crc kubenswrapper[4730]: I0320 16:01:08.088811    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d3747d18-1b1e-4c43-ac1a-efeeb453b1ae-fernet-keys\") pod \"d3747d18-1b1e-4c43-ac1a-efeeb453b1ae\" (UID: \"d3747d18-1b1e-4c43-ac1a-efeeb453b1ae\") "
Mar 20 16:01:08 crc kubenswrapper[4730]: I0320 16:01:08.106935    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3747d18-1b1e-4c43-ac1a-efeeb453b1ae-kube-api-access-rkrlg" (OuterVolumeSpecName: "kube-api-access-rkrlg") pod "d3747d18-1b1e-4c43-ac1a-efeeb453b1ae" (UID: "d3747d18-1b1e-4c43-ac1a-efeeb453b1ae"). InnerVolumeSpecName "kube-api-access-rkrlg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:01:08 crc kubenswrapper[4730]: I0320 16:01:08.112177    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3747d18-1b1e-4c43-ac1a-efeeb453b1ae-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d3747d18-1b1e-4c43-ac1a-efeeb453b1ae" (UID: "d3747d18-1b1e-4c43-ac1a-efeeb453b1ae"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:01:08 crc kubenswrapper[4730]: I0320 16:01:08.133489    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3747d18-1b1e-4c43-ac1a-efeeb453b1ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3747d18-1b1e-4c43-ac1a-efeeb453b1ae" (UID: "d3747d18-1b1e-4c43-ac1a-efeeb453b1ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:01:08 crc kubenswrapper[4730]: I0320 16:01:08.162354    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-66f7c676c8-wdfnw" podUID="625a25c3-e585-4848-bbe1-0bdd4be731a9" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.176:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 20 16:01:08 crc kubenswrapper[4730]: I0320 16:01:08.162808    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-66f7c676c8-wdfnw" podUID="625a25c3-e585-4848-bbe1-0bdd4be731a9" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.176:9311/healthcheck\": dial tcp 10.217.0.176:9311: i/o timeout (Client.Timeout exceeded while awaiting headers)"
Mar 20 16:01:08 crc kubenswrapper[4730]: I0320 16:01:08.187400    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3747d18-1b1e-4c43-ac1a-efeeb453b1ae-config-data" (OuterVolumeSpecName: "config-data") pod "d3747d18-1b1e-4c43-ac1a-efeeb453b1ae" (UID: "d3747d18-1b1e-4c43-ac1a-efeeb453b1ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:01:08 crc kubenswrapper[4730]: I0320 16:01:08.190515    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkrlg\" (UniqueName: \"kubernetes.io/projected/d3747d18-1b1e-4c43-ac1a-efeeb453b1ae-kube-api-access-rkrlg\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:08 crc kubenswrapper[4730]: I0320 16:01:08.190547    4730 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d3747d18-1b1e-4c43-ac1a-efeeb453b1ae-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:08 crc kubenswrapper[4730]: I0320 16:01:08.190558    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3747d18-1b1e-4c43-ac1a-efeeb453b1ae-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:08 crc kubenswrapper[4730]: I0320 16:01:08.190572    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3747d18-1b1e-4c43-ac1a-efeeb453b1ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:08 crc kubenswrapper[4730]: I0320 16:01:08.399789    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Mar 20 16:01:08 crc kubenswrapper[4730]: I0320 16:01:08.563033    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567041-zmx9n" event={"ID":"d3747d18-1b1e-4c43-ac1a-efeeb453b1ae","Type":"ContainerDied","Data":"a19bcb5b8c66a4a747a916cf2427a3e88a6ef33424161955f95912089a76ca70"}
Mar 20 16:01:08 crc kubenswrapper[4730]: I0320 16:01:08.563097    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a19bcb5b8c66a4a747a916cf2427a3e88a6ef33424161955f95912089a76ca70"
Mar 20 16:01:08 crc kubenswrapper[4730]: I0320 16:01:08.563104    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29567041-zmx9n"
Mar 20 16:01:08 crc kubenswrapper[4730]: I0320 16:01:08.982557    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Mar 20 16:01:09 crc kubenswrapper[4730]: I0320 16:01:09.400095    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Mar 20 16:01:09 crc kubenswrapper[4730]: E0320 16:01:09.400782    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3747d18-1b1e-4c43-ac1a-efeeb453b1ae" containerName="keystone-cron"
Mar 20 16:01:09 crc kubenswrapper[4730]: I0320 16:01:09.400794    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3747d18-1b1e-4c43-ac1a-efeeb453b1ae" containerName="keystone-cron"
Mar 20 16:01:09 crc kubenswrapper[4730]: I0320 16:01:09.400975    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3747d18-1b1e-4c43-ac1a-efeeb453b1ae" containerName="keystone-cron"
Mar 20 16:01:09 crc kubenswrapper[4730]: I0320 16:01:09.401666    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 20 16:01:09 crc kubenswrapper[4730]: I0320 16:01:09.403562    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-6tzwx"
Mar 20 16:01:09 crc kubenswrapper[4730]: I0320 16:01:09.404627    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Mar 20 16:01:09 crc kubenswrapper[4730]: I0320 16:01:09.405650    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Mar 20 16:01:09 crc kubenswrapper[4730]: I0320 16:01:09.418404    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 20 16:01:09 crc kubenswrapper[4730]: I0320 16:01:09.510612    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a893eba7-9715-4599-93c2-0365a45134e9-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a893eba7-9715-4599-93c2-0365a45134e9\") " pod="openstack/openstackclient"
Mar 20 16:01:09 crc kubenswrapper[4730]: I0320 16:01:09.510704    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a893eba7-9715-4599-93c2-0365a45134e9-openstack-config\") pod \"openstackclient\" (UID: \"a893eba7-9715-4599-93c2-0365a45134e9\") " pod="openstack/openstackclient"
Mar 20 16:01:09 crc kubenswrapper[4730]: I0320 16:01:09.510805    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a893eba7-9715-4599-93c2-0365a45134e9-openstack-config-secret\") pod \"openstackclient\" (UID: \"a893eba7-9715-4599-93c2-0365a45134e9\") " pod="openstack/openstackclient"
Mar 20 16:01:09 crc kubenswrapper[4730]: I0320 16:01:09.510838    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khvd7\" (UniqueName: \"kubernetes.io/projected/a893eba7-9715-4599-93c2-0365a45134e9-kube-api-access-khvd7\") pod \"openstackclient\" (UID: \"a893eba7-9715-4599-93c2-0365a45134e9\") " pod="openstack/openstackclient"
Mar 20 16:01:09 crc kubenswrapper[4730]: I0320 16:01:09.612326    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a893eba7-9715-4599-93c2-0365a45134e9-openstack-config-secret\") pod \"openstackclient\" (UID: \"a893eba7-9715-4599-93c2-0365a45134e9\") " pod="openstack/openstackclient"
Mar 20 16:01:09 crc kubenswrapper[4730]: I0320 16:01:09.612641    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khvd7\" (UniqueName: \"kubernetes.io/projected/a893eba7-9715-4599-93c2-0365a45134e9-kube-api-access-khvd7\") pod \"openstackclient\" (UID: \"a893eba7-9715-4599-93c2-0365a45134e9\") " pod="openstack/openstackclient"
Mar 20 16:01:09 crc kubenswrapper[4730]: I0320 16:01:09.612809    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a893eba7-9715-4599-93c2-0365a45134e9-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a893eba7-9715-4599-93c2-0365a45134e9\") " pod="openstack/openstackclient"
Mar 20 16:01:09 crc kubenswrapper[4730]: I0320 16:01:09.612972    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a893eba7-9715-4599-93c2-0365a45134e9-openstack-config\") pod \"openstackclient\" (UID: \"a893eba7-9715-4599-93c2-0365a45134e9\") " pod="openstack/openstackclient"
Mar 20 16:01:09 crc kubenswrapper[4730]: I0320 16:01:09.614041    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a893eba7-9715-4599-93c2-0365a45134e9-openstack-config\") pod \"openstackclient\" (UID: \"a893eba7-9715-4599-93c2-0365a45134e9\") " pod="openstack/openstackclient"
Mar 20 16:01:09 crc kubenswrapper[4730]: I0320 16:01:09.622664    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a893eba7-9715-4599-93c2-0365a45134e9-openstack-config-secret\") pod \"openstackclient\" (UID: \"a893eba7-9715-4599-93c2-0365a45134e9\") " pod="openstack/openstackclient"
Mar 20 16:01:09 crc kubenswrapper[4730]: I0320 16:01:09.624062    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a893eba7-9715-4599-93c2-0365a45134e9-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a893eba7-9715-4599-93c2-0365a45134e9\") " pod="openstack/openstackclient"
Mar 20 16:01:09 crc kubenswrapper[4730]: I0320 16:01:09.634652    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khvd7\" (UniqueName: \"kubernetes.io/projected/a893eba7-9715-4599-93c2-0365a45134e9-kube-api-access-khvd7\") pod \"openstackclient\" (UID: \"a893eba7-9715-4599-93c2-0365a45134e9\") " pod="openstack/openstackclient"
Mar 20 16:01:09 crc kubenswrapper[4730]: I0320 16:01:09.720204    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 20 16:01:10 crc kubenswrapper[4730]: I0320 16:01:10.206336    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 20 16:01:10 crc kubenswrapper[4730]: W0320 16:01:10.210440    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda893eba7_9715_4599_93c2_0365a45134e9.slice/crio-b3e08788439002bcf967b00fb1086c9348f760c4292faa4ae4e4280cc7aca24d WatchSource:0}: Error finding container b3e08788439002bcf967b00fb1086c9348f760c4292faa4ae4e4280cc7aca24d: Status 404 returned error can't find the container with id b3e08788439002bcf967b00fb1086c9348f760c4292faa4ae4e4280cc7aca24d
Mar 20 16:01:10 crc kubenswrapper[4730]: I0320 16:01:10.586174    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a893eba7-9715-4599-93c2-0365a45134e9","Type":"ContainerStarted","Data":"b3e08788439002bcf967b00fb1086c9348f760c4292faa4ae4e4280cc7aca24d"}
Mar 20 16:01:11 crc kubenswrapper[4730]: I0320 16:01:11.002447    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Mar 20 16:01:12 crc kubenswrapper[4730]: I0320 16:01:12.533503    4730 scope.go:117] "RemoveContainer" containerID="22c1fa447a9712d22a3477c2b5b4f81ffbfd58601afde3e8b272d15c3b1ac1ce"
Mar 20 16:01:12 crc kubenswrapper[4730]: I0320 16:01:12.880599    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:01:12 crc kubenswrapper[4730]: I0320 16:01:12.880674    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:01:13 crc kubenswrapper[4730]: I0320 16:01:13.629391    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3f6c808e-d523-48bd-8ec2-28b625834317","Type":"ContainerStarted","Data":"ca3ac5b513d25322badcc2bf19b245d687c9ccf8bff6c35cf5794c95ec2ab964"}
Mar 20 16:01:13 crc kubenswrapper[4730]: I0320 16:01:13.982585    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0"
Mar 20 16:01:13 crc kubenswrapper[4730]: I0320 16:01:13.996184    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 16:01:13 crc kubenswrapper[4730]: I0320 16:01:13.996528    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="14cdd4b7-7a81-469f-ae2f-104b054cc583" containerName="glance-log" containerID="cri-o://1eef3a4c57a302c49282046a25ce9b6b686742d1068d8539a0c4f898222c31dc" gracePeriod=30
Mar 20 16:01:13 crc kubenswrapper[4730]: I0320 16:01:13.996556    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="14cdd4b7-7a81-469f-ae2f-104b054cc583" containerName="glance-httpd" containerID="cri-o://8f1291a8157b5fac6e0adc6431d157cb584ae26777485369c5a717b5d22da62c" gracePeriod=30
Mar 20 16:01:14 crc kubenswrapper[4730]: I0320 16:01:14.030025    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0"
Mar 20 16:01:14 crc kubenswrapper[4730]: I0320 16:01:14.642312    4730 generic.go:334] "Generic (PLEG): container finished" podID="14cdd4b7-7a81-469f-ae2f-104b054cc583" containerID="1eef3a4c57a302c49282046a25ce9b6b686742d1068d8539a0c4f898222c31dc" exitCode=143
Mar 20 16:01:14 crc kubenswrapper[4730]: I0320 16:01:14.642360    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"14cdd4b7-7a81-469f-ae2f-104b054cc583","Type":"ContainerDied","Data":"1eef3a4c57a302c49282046a25ce9b6b686742d1068d8539a0c4f898222c31dc"}
Mar 20 16:01:14 crc kubenswrapper[4730]: I0320 16:01:14.654492    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Mar 20 16:01:14 crc kubenswrapper[4730]: I0320 16:01:14.896683    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:01:14 crc kubenswrapper[4730]: I0320 16:01:14.896974    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe109bf0-70d2-41d2-855c-6eb862e568b6" containerName="ceilometer-central-agent" containerID="cri-o://c73d9cedf11e6a3a273cb136651dcbf0aa00b21555ddca3a8c2b2551a6375a21" gracePeriod=30
Mar 20 16:01:14 crc kubenswrapper[4730]: I0320 16:01:14.897351    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe109bf0-70d2-41d2-855c-6eb862e568b6" containerName="proxy-httpd" containerID="cri-o://73bfac78764299ab2b8b9302430bca90ecedaa10df4b93eb8c976fe0582a6af1" gracePeriod=30
Mar 20 16:01:14 crc kubenswrapper[4730]: I0320 16:01:14.897459    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe109bf0-70d2-41d2-855c-6eb862e568b6" containerName="sg-core" containerID="cri-o://786b95c139b35cbde05b7814738f76287b10b59ef0dc76fe0f5bcee037ab03c4" gracePeriod=30
Mar 20 16:01:14 crc kubenswrapper[4730]: I0320 16:01:14.897516    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe109bf0-70d2-41d2-855c-6eb862e568b6" containerName="ceilometer-notification-agent" containerID="cri-o://8853aa1f17e6388ce020212c8d73958c09bbf6fcc38c4d043313ee458cbde4ad" gracePeriod=30
Mar 20 16:01:14 crc kubenswrapper[4730]: I0320 16:01:14.907231    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="fe109bf0-70d2-41d2-855c-6eb862e568b6" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.183:3000/\": EOF"
Mar 20 16:01:15 crc kubenswrapper[4730]: I0320 16:01:15.654922    4730 generic.go:334] "Generic (PLEG): container finished" podID="fe109bf0-70d2-41d2-855c-6eb862e568b6" containerID="73bfac78764299ab2b8b9302430bca90ecedaa10df4b93eb8c976fe0582a6af1" exitCode=0
Mar 20 16:01:15 crc kubenswrapper[4730]: I0320 16:01:15.655292    4730 generic.go:334] "Generic (PLEG): container finished" podID="fe109bf0-70d2-41d2-855c-6eb862e568b6" containerID="786b95c139b35cbde05b7814738f76287b10b59ef0dc76fe0f5bcee037ab03c4" exitCode=2
Mar 20 16:01:15 crc kubenswrapper[4730]: I0320 16:01:15.654998    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe109bf0-70d2-41d2-855c-6eb862e568b6","Type":"ContainerDied","Data":"73bfac78764299ab2b8b9302430bca90ecedaa10df4b93eb8c976fe0582a6af1"}
Mar 20 16:01:15 crc kubenswrapper[4730]: I0320 16:01:15.655337    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe109bf0-70d2-41d2-855c-6eb862e568b6","Type":"ContainerDied","Data":"786b95c139b35cbde05b7814738f76287b10b59ef0dc76fe0f5bcee037ab03c4"}
Mar 20 16:01:15 crc kubenswrapper[4730]: I0320 16:01:15.655352    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe109bf0-70d2-41d2-855c-6eb862e568b6","Type":"ContainerDied","Data":"c73d9cedf11e6a3a273cb136651dcbf0aa00b21555ddca3a8c2b2551a6375a21"}
Mar 20 16:01:15 crc kubenswrapper[4730]: I0320 16:01:15.655304    4730 generic.go:334] "Generic (PLEG): container finished" podID="fe109bf0-70d2-41d2-855c-6eb862e568b6" containerID="c73d9cedf11e6a3a273cb136651dcbf0aa00b21555ddca3a8c2b2551a6375a21" exitCode=0
Mar 20 16:01:15 crc kubenswrapper[4730]: I0320 16:01:15.657885    4730 generic.go:334] "Generic (PLEG): container finished" podID="14cdd4b7-7a81-469f-ae2f-104b054cc583" containerID="8f1291a8157b5fac6e0adc6431d157cb584ae26777485369c5a717b5d22da62c" exitCode=0
Mar 20 16:01:15 crc kubenswrapper[4730]: I0320 16:01:15.657910    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"14cdd4b7-7a81-469f-ae2f-104b054cc583","Type":"ContainerDied","Data":"8f1291a8157b5fac6e0adc6431d157cb584ae26777485369c5a717b5d22da62c"}
Mar 20 16:01:15 crc kubenswrapper[4730]: I0320 16:01:15.934540    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6858c8d8f6-k4smz"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.246520    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-tv4tn"]
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.247841    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tv4tn"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.259027    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-tv4tn"]
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.267699    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b68m2\" (UniqueName: \"kubernetes.io/projected/dac41622-7c80-4fce-a5ac-8a04d301669d-kube-api-access-b68m2\") pod \"nova-api-db-create-tv4tn\" (UID: \"dac41622-7c80-4fce-a5ac-8a04d301669d\") " pod="openstack/nova-api-db-create-tv4tn"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.267772    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dac41622-7c80-4fce-a5ac-8a04d301669d-operator-scripts\") pod \"nova-api-db-create-tv4tn\" (UID: \"dac41622-7c80-4fce-a5ac-8a04d301669d\") " pod="openstack/nova-api-db-create-tv4tn"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.332039    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-rlp9c"]
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.333211    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-rlp9c"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.355429    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-rlp9c"]
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.369466    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dac41622-7c80-4fce-a5ac-8a04d301669d-operator-scripts\") pod \"nova-api-db-create-tv4tn\" (UID: \"dac41622-7c80-4fce-a5ac-8a04d301669d\") " pod="openstack/nova-api-db-create-tv4tn"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.369790    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rq8d\" (UniqueName: \"kubernetes.io/projected/475a52ba-bc8d-4c7b-ae99-330d6ec2b358-kube-api-access-7rq8d\") pod \"nova-cell0-db-create-rlp9c\" (UID: \"475a52ba-bc8d-4c7b-ae99-330d6ec2b358\") " pod="openstack/nova-cell0-db-create-rlp9c"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.369943    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/475a52ba-bc8d-4c7b-ae99-330d6ec2b358-operator-scripts\") pod \"nova-cell0-db-create-rlp9c\" (UID: \"475a52ba-bc8d-4c7b-ae99-330d6ec2b358\") " pod="openstack/nova-cell0-db-create-rlp9c"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.369985    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b68m2\" (UniqueName: \"kubernetes.io/projected/dac41622-7c80-4fce-a5ac-8a04d301669d-kube-api-access-b68m2\") pod \"nova-api-db-create-tv4tn\" (UID: \"dac41622-7c80-4fce-a5ac-8a04d301669d\") " pod="openstack/nova-api-db-create-tv4tn"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.370880    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dac41622-7c80-4fce-a5ac-8a04d301669d-operator-scripts\") pod \"nova-api-db-create-tv4tn\" (UID: \"dac41622-7c80-4fce-a5ac-8a04d301669d\") " pod="openstack/nova-api-db-create-tv4tn"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.400973    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b68m2\" (UniqueName: \"kubernetes.io/projected/dac41622-7c80-4fce-a5ac-8a04d301669d-kube-api-access-b68m2\") pod \"nova-api-db-create-tv4tn\" (UID: \"dac41622-7c80-4fce-a5ac-8a04d301669d\") " pod="openstack/nova-api-db-create-tv4tn"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.434557    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-qt4mz"]
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.435866    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-qt4mz"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.460541    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-qt4mz"]
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.473410    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1-operator-scripts\") pod \"nova-cell1-db-create-qt4mz\" (UID: \"6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1\") " pod="openstack/nova-cell1-db-create-qt4mz"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.473492    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rq8d\" (UniqueName: \"kubernetes.io/projected/475a52ba-bc8d-4c7b-ae99-330d6ec2b358-kube-api-access-7rq8d\") pod \"nova-cell0-db-create-rlp9c\" (UID: \"475a52ba-bc8d-4c7b-ae99-330d6ec2b358\") " pod="openstack/nova-cell0-db-create-rlp9c"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.473555    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/475a52ba-bc8d-4c7b-ae99-330d6ec2b358-operator-scripts\") pod \"nova-cell0-db-create-rlp9c\" (UID: \"475a52ba-bc8d-4c7b-ae99-330d6ec2b358\") " pod="openstack/nova-cell0-db-create-rlp9c"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.473604    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkb5p\" (UniqueName: \"kubernetes.io/projected/6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1-kube-api-access-fkb5p\") pod \"nova-cell1-db-create-qt4mz\" (UID: \"6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1\") " pod="openstack/nova-cell1-db-create-qt4mz"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.474467    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/475a52ba-bc8d-4c7b-ae99-330d6ec2b358-operator-scripts\") pod \"nova-cell0-db-create-rlp9c\" (UID: \"475a52ba-bc8d-4c7b-ae99-330d6ec2b358\") " pod="openstack/nova-cell0-db-create-rlp9c"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.475998    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-d0d2-account-create-update-z6v46"]
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.477225    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d0d2-account-create-update-z6v46"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.479012    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.493439    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rq8d\" (UniqueName: \"kubernetes.io/projected/475a52ba-bc8d-4c7b-ae99-330d6ec2b358-kube-api-access-7rq8d\") pod \"nova-cell0-db-create-rlp9c\" (UID: \"475a52ba-bc8d-4c7b-ae99-330d6ec2b358\") " pod="openstack/nova-cell0-db-create-rlp9c"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.507017    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d0d2-account-create-update-z6v46"]
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.575468    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/383cf79a-0636-4175-bcf8-7e369f101901-operator-scripts\") pod \"nova-api-d0d2-account-create-update-z6v46\" (UID: \"383cf79a-0636-4175-bcf8-7e369f101901\") " pod="openstack/nova-api-d0d2-account-create-update-z6v46"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.575577    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkb5p\" (UniqueName: \"kubernetes.io/projected/6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1-kube-api-access-fkb5p\") pod \"nova-cell1-db-create-qt4mz\" (UID: \"6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1\") " pod="openstack/nova-cell1-db-create-qt4mz"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.575700    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1-operator-scripts\") pod \"nova-cell1-db-create-qt4mz\" (UID: \"6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1\") " pod="openstack/nova-cell1-db-create-qt4mz"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.575792    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkr7g\" (UniqueName: \"kubernetes.io/projected/383cf79a-0636-4175-bcf8-7e369f101901-kube-api-access-fkr7g\") pod \"nova-api-d0d2-account-create-update-z6v46\" (UID: \"383cf79a-0636-4175-bcf8-7e369f101901\") " pod="openstack/nova-api-d0d2-account-create-update-z6v46"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.577029    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1-operator-scripts\") pod \"nova-cell1-db-create-qt4mz\" (UID: \"6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1\") " pod="openstack/nova-cell1-db-create-qt4mz"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.589053    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tv4tn"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.605773    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkb5p\" (UniqueName: \"kubernetes.io/projected/6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1-kube-api-access-fkb5p\") pod \"nova-cell1-db-create-qt4mz\" (UID: \"6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1\") " pod="openstack/nova-cell1-db-create-qt4mz"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.644637    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-2850-account-create-update-4lrrq"]
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.645897    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2850-account-create-update-4lrrq"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.651526    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.656068    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2850-account-create-update-4lrrq"]
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.657682    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-rlp9c"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.682922    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f625a9e-a940-476b-85b2-ff54c5e87785-operator-scripts\") pod \"nova-cell0-2850-account-create-update-4lrrq\" (UID: \"3f625a9e-a940-476b-85b2-ff54c5e87785\") " pod="openstack/nova-cell0-2850-account-create-update-4lrrq"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.683071    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkr7g\" (UniqueName: \"kubernetes.io/projected/383cf79a-0636-4175-bcf8-7e369f101901-kube-api-access-fkr7g\") pod \"nova-api-d0d2-account-create-update-z6v46\" (UID: \"383cf79a-0636-4175-bcf8-7e369f101901\") " pod="openstack/nova-api-d0d2-account-create-update-z6v46"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.683163    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/383cf79a-0636-4175-bcf8-7e369f101901-operator-scripts\") pod \"nova-api-d0d2-account-create-update-z6v46\" (UID: \"383cf79a-0636-4175-bcf8-7e369f101901\") " pod="openstack/nova-api-d0d2-account-create-update-z6v46"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.683263    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25qhp\" (UniqueName: \"kubernetes.io/projected/3f625a9e-a940-476b-85b2-ff54c5e87785-kube-api-access-25qhp\") pod \"nova-cell0-2850-account-create-update-4lrrq\" (UID: \"3f625a9e-a940-476b-85b2-ff54c5e87785\") " pod="openstack/nova-cell0-2850-account-create-update-4lrrq"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.684121    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/383cf79a-0636-4175-bcf8-7e369f101901-operator-scripts\") pod \"nova-api-d0d2-account-create-update-z6v46\" (UID: \"383cf79a-0636-4175-bcf8-7e369f101901\") " pod="openstack/nova-api-d0d2-account-create-update-z6v46"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.688680    4730 generic.go:334] "Generic (PLEG): container finished" podID="3f6c808e-d523-48bd-8ec2-28b625834317" containerID="ca3ac5b513d25322badcc2bf19b245d687c9ccf8bff6c35cf5794c95ec2ab964" exitCode=1
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.688721    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3f6c808e-d523-48bd-8ec2-28b625834317","Type":"ContainerDied","Data":"ca3ac5b513d25322badcc2bf19b245d687c9ccf8bff6c35cf5794c95ec2ab964"}
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.688753    4730 scope.go:117] "RemoveContainer" containerID="22c1fa447a9712d22a3477c2b5b4f81ffbfd58601afde3e8b272d15c3b1ac1ce"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.689797    4730 scope.go:117] "RemoveContainer" containerID="ca3ac5b513d25322badcc2bf19b245d687c9ccf8bff6c35cf5794c95ec2ab964"
Mar 20 16:01:16 crc kubenswrapper[4730]: E0320 16:01:16.690015    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(3f6c808e-d523-48bd-8ec2-28b625834317)\"" pod="openstack/watcher-decision-engine-0" podUID="3f6c808e-d523-48bd-8ec2-28b625834317"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.706432    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkr7g\" (UniqueName: \"kubernetes.io/projected/383cf79a-0636-4175-bcf8-7e369f101901-kube-api-access-fkr7g\") pod \"nova-api-d0d2-account-create-update-z6v46\" (UID: \"383cf79a-0636-4175-bcf8-7e369f101901\") " pod="openstack/nova-api-d0d2-account-create-update-z6v46"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.782964    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-qt4mz"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.784350    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25qhp\" (UniqueName: \"kubernetes.io/projected/3f625a9e-a940-476b-85b2-ff54c5e87785-kube-api-access-25qhp\") pod \"nova-cell0-2850-account-create-update-4lrrq\" (UID: \"3f625a9e-a940-476b-85b2-ff54c5e87785\") " pod="openstack/nova-cell0-2850-account-create-update-4lrrq"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.784441    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f625a9e-a940-476b-85b2-ff54c5e87785-operator-scripts\") pod \"nova-cell0-2850-account-create-update-4lrrq\" (UID: \"3f625a9e-a940-476b-85b2-ff54c5e87785\") " pod="openstack/nova-cell0-2850-account-create-update-4lrrq"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.786055    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f625a9e-a940-476b-85b2-ff54c5e87785-operator-scripts\") pod \"nova-cell0-2850-account-create-update-4lrrq\" (UID: \"3f625a9e-a940-476b-85b2-ff54c5e87785\") " pod="openstack/nova-cell0-2850-account-create-update-4lrrq"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.814880    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25qhp\" (UniqueName: \"kubernetes.io/projected/3f625a9e-a940-476b-85b2-ff54c5e87785-kube-api-access-25qhp\") pod \"nova-cell0-2850-account-create-update-4lrrq\" (UID: \"3f625a9e-a940-476b-85b2-ff54c5e87785\") " pod="openstack/nova-cell0-2850-account-create-update-4lrrq"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.832393    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-4a43-account-create-update-cj4kg"]
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.833595    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4a43-account-create-update-cj4kg"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.836237    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d0d2-account-create-update-z6v46"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.836855    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.842193    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4a43-account-create-update-cj4kg"]
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.886087    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdzxp\" (UniqueName: \"kubernetes.io/projected/7de61c5d-53ba-4d26-9a79-b82c2bc3b779-kube-api-access-rdzxp\") pod \"nova-cell1-4a43-account-create-update-cj4kg\" (UID: \"7de61c5d-53ba-4d26-9a79-b82c2bc3b779\") " pod="openstack/nova-cell1-4a43-account-create-update-cj4kg"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.886125    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7de61c5d-53ba-4d26-9a79-b82c2bc3b779-operator-scripts\") pod \"nova-cell1-4a43-account-create-update-cj4kg\" (UID: \"7de61c5d-53ba-4d26-9a79-b82c2bc3b779\") " pod="openstack/nova-cell1-4a43-account-create-update-cj4kg"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.949234    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7c5c8ffdd9-xpfhf"]
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.951053    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.953992    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.954549    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.955122    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.967974    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7c5c8ffdd9-xpfhf"]
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.989332    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdzxp\" (UniqueName: \"kubernetes.io/projected/7de61c5d-53ba-4d26-9a79-b82c2bc3b779-kube-api-access-rdzxp\") pod \"nova-cell1-4a43-account-create-update-cj4kg\" (UID: \"7de61c5d-53ba-4d26-9a79-b82c2bc3b779\") " pod="openstack/nova-cell1-4a43-account-create-update-cj4kg"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.989387    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7de61c5d-53ba-4d26-9a79-b82c2bc3b779-operator-scripts\") pod \"nova-cell1-4a43-account-create-update-cj4kg\" (UID: \"7de61c5d-53ba-4d26-9a79-b82c2bc3b779\") " pod="openstack/nova-cell1-4a43-account-create-update-cj4kg"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.990119    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7de61c5d-53ba-4d26-9a79-b82c2bc3b779-operator-scripts\") pod \"nova-cell1-4a43-account-create-update-cj4kg\" (UID: \"7de61c5d-53ba-4d26-9a79-b82c2bc3b779\") " pod="openstack/nova-cell1-4a43-account-create-update-cj4kg"
Mar 20 16:01:16 crc kubenswrapper[4730]: I0320 16:01:16.992363    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2850-account-create-update-4lrrq"
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.007062    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdzxp\" (UniqueName: \"kubernetes.io/projected/7de61c5d-53ba-4d26-9a79-b82c2bc3b779-kube-api-access-rdzxp\") pod \"nova-cell1-4a43-account-create-update-cj4kg\" (UID: \"7de61c5d-53ba-4d26-9a79-b82c2bc3b779\") " pod="openstack/nova-cell1-4a43-account-create-update-cj4kg"
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.091355    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9780622-27f3-4339-8107-321feed5e25b-internal-tls-certs\") pod \"swift-proxy-7c5c8ffdd9-xpfhf\" (UID: \"b9780622-27f3-4339-8107-321feed5e25b\") " pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf"
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.091422    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9780622-27f3-4339-8107-321feed5e25b-config-data\") pod \"swift-proxy-7c5c8ffdd9-xpfhf\" (UID: \"b9780622-27f3-4339-8107-321feed5e25b\") " pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf"
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.091441    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sfs6\" (UniqueName: \"kubernetes.io/projected/b9780622-27f3-4339-8107-321feed5e25b-kube-api-access-8sfs6\") pod \"swift-proxy-7c5c8ffdd9-xpfhf\" (UID: \"b9780622-27f3-4339-8107-321feed5e25b\") " pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf"
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.091460    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9780622-27f3-4339-8107-321feed5e25b-public-tls-certs\") pod \"swift-proxy-7c5c8ffdd9-xpfhf\" (UID: \"b9780622-27f3-4339-8107-321feed5e25b\") " pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf"
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.092420    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9780622-27f3-4339-8107-321feed5e25b-combined-ca-bundle\") pod \"swift-proxy-7c5c8ffdd9-xpfhf\" (UID: \"b9780622-27f3-4339-8107-321feed5e25b\") " pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf"
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.092665    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b9780622-27f3-4339-8107-321feed5e25b-log-httpd\") pod \"swift-proxy-7c5c8ffdd9-xpfhf\" (UID: \"b9780622-27f3-4339-8107-321feed5e25b\") " pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf"
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.092732    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b9780622-27f3-4339-8107-321feed5e25b-etc-swift\") pod \"swift-proxy-7c5c8ffdd9-xpfhf\" (UID: \"b9780622-27f3-4339-8107-321feed5e25b\") " pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf"
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.092771    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b9780622-27f3-4339-8107-321feed5e25b-run-httpd\") pod \"swift-proxy-7c5c8ffdd9-xpfhf\" (UID: \"b9780622-27f3-4339-8107-321feed5e25b\") " pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf"
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.164953    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.165239    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="82401c9f-f5f9-4bc6-a085-c89d3632493e" containerName="glance-log" containerID="cri-o://d342f4f373c3a46fb291e5160cd15525a5be1018e68f010945cfba9de11fd3fe" gracePeriod=30
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.165347    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="82401c9f-f5f9-4bc6-a085-c89d3632493e" containerName="glance-httpd" containerID="cri-o://68b0b2752749a64e7ce292cfa6aabcc6400dcc4552e165b090d093ce63fe5a35" gracePeriod=30
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.193496    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4a43-account-create-update-cj4kg"
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.194008    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9780622-27f3-4339-8107-321feed5e25b-combined-ca-bundle\") pod \"swift-proxy-7c5c8ffdd9-xpfhf\" (UID: \"b9780622-27f3-4339-8107-321feed5e25b\") " pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf"
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.194081    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b9780622-27f3-4339-8107-321feed5e25b-log-httpd\") pod \"swift-proxy-7c5c8ffdd9-xpfhf\" (UID: \"b9780622-27f3-4339-8107-321feed5e25b\") " pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf"
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.194104    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b9780622-27f3-4339-8107-321feed5e25b-etc-swift\") pod \"swift-proxy-7c5c8ffdd9-xpfhf\" (UID: \"b9780622-27f3-4339-8107-321feed5e25b\") " pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf"
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.194126    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b9780622-27f3-4339-8107-321feed5e25b-run-httpd\") pod \"swift-proxy-7c5c8ffdd9-xpfhf\" (UID: \"b9780622-27f3-4339-8107-321feed5e25b\") " pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf"
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.194179    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9780622-27f3-4339-8107-321feed5e25b-internal-tls-certs\") pod \"swift-proxy-7c5c8ffdd9-xpfhf\" (UID: \"b9780622-27f3-4339-8107-321feed5e25b\") " pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf"
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.194216    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9780622-27f3-4339-8107-321feed5e25b-config-data\") pod \"swift-proxy-7c5c8ffdd9-xpfhf\" (UID: \"b9780622-27f3-4339-8107-321feed5e25b\") " pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf"
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.194233    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sfs6\" (UniqueName: \"kubernetes.io/projected/b9780622-27f3-4339-8107-321feed5e25b-kube-api-access-8sfs6\") pod \"swift-proxy-7c5c8ffdd9-xpfhf\" (UID: \"b9780622-27f3-4339-8107-321feed5e25b\") " pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf"
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.194453    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9780622-27f3-4339-8107-321feed5e25b-public-tls-certs\") pod \"swift-proxy-7c5c8ffdd9-xpfhf\" (UID: \"b9780622-27f3-4339-8107-321feed5e25b\") " pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf"
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.194631    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b9780622-27f3-4339-8107-321feed5e25b-log-httpd\") pod \"swift-proxy-7c5c8ffdd9-xpfhf\" (UID: \"b9780622-27f3-4339-8107-321feed5e25b\") " pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf"
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.194737    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b9780622-27f3-4339-8107-321feed5e25b-run-httpd\") pod \"swift-proxy-7c5c8ffdd9-xpfhf\" (UID: \"b9780622-27f3-4339-8107-321feed5e25b\") " pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf"
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.197865    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9780622-27f3-4339-8107-321feed5e25b-combined-ca-bundle\") pod \"swift-proxy-7c5c8ffdd9-xpfhf\" (UID: \"b9780622-27f3-4339-8107-321feed5e25b\") " pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf"
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.197957    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b9780622-27f3-4339-8107-321feed5e25b-etc-swift\") pod \"swift-proxy-7c5c8ffdd9-xpfhf\" (UID: \"b9780622-27f3-4339-8107-321feed5e25b\") " pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf"
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.199430    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9780622-27f3-4339-8107-321feed5e25b-public-tls-certs\") pod \"swift-proxy-7c5c8ffdd9-xpfhf\" (UID: \"b9780622-27f3-4339-8107-321feed5e25b\") " pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf"
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.200231    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9780622-27f3-4339-8107-321feed5e25b-config-data\") pod \"swift-proxy-7c5c8ffdd9-xpfhf\" (UID: \"b9780622-27f3-4339-8107-321feed5e25b\") " pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf"
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.200733    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9780622-27f3-4339-8107-321feed5e25b-internal-tls-certs\") pod \"swift-proxy-7c5c8ffdd9-xpfhf\" (UID: \"b9780622-27f3-4339-8107-321feed5e25b\") " pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf"
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.232136    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sfs6\" (UniqueName: \"kubernetes.io/projected/b9780622-27f3-4339-8107-321feed5e25b-kube-api-access-8sfs6\") pod \"swift-proxy-7c5c8ffdd9-xpfhf\" (UID: \"b9780622-27f3-4339-8107-321feed5e25b\") " pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf"
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.266180    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf"
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.711008    4730 generic.go:334] "Generic (PLEG): container finished" podID="82401c9f-f5f9-4bc6-a085-c89d3632493e" containerID="d342f4f373c3a46fb291e5160cd15525a5be1018e68f010945cfba9de11fd3fe" exitCode=143
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.711094    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"82401c9f-f5f9-4bc6-a085-c89d3632493e","Type":"ContainerDied","Data":"d342f4f373c3a46fb291e5160cd15525a5be1018e68f010945cfba9de11fd3fe"}
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.722522    4730 generic.go:334] "Generic (PLEG): container finished" podID="fe109bf0-70d2-41d2-855c-6eb862e568b6" containerID="8853aa1f17e6388ce020212c8d73958c09bbf6fcc38c4d043313ee458cbde4ad" exitCode=0
Mar 20 16:01:17 crc kubenswrapper[4730]: I0320 16:01:17.722567    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe109bf0-70d2-41d2-855c-6eb862e568b6","Type":"ContainerDied","Data":"8853aa1f17e6388ce020212c8d73958c09bbf6fcc38c4d043313ee458cbde4ad"}
Mar 20 16:01:18 crc kubenswrapper[4730]: I0320 16:01:18.744991    4730 generic.go:334] "Generic (PLEG): container finished" podID="82401c9f-f5f9-4bc6-a085-c89d3632493e" containerID="68b0b2752749a64e7ce292cfa6aabcc6400dcc4552e165b090d093ce63fe5a35" exitCode=0
Mar 20 16:01:18 crc kubenswrapper[4730]: I0320 16:01:18.745165    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"82401c9f-f5f9-4bc6-a085-c89d3632493e","Type":"ContainerDied","Data":"68b0b2752749a64e7ce292cfa6aabcc6400dcc4552e165b090d093ce63fe5a35"}
Mar 20 16:01:19 crc kubenswrapper[4730]: I0320 16:01:19.974325    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Mar 20 16:01:19 crc kubenswrapper[4730]: I0320 16:01:19.974382    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Mar 20 16:01:19 crc kubenswrapper[4730]: I0320 16:01:19.975151    4730 scope.go:117] "RemoveContainer" containerID="ca3ac5b513d25322badcc2bf19b245d687c9ccf8bff6c35cf5794c95ec2ab964"
Mar 20 16:01:19 crc kubenswrapper[4730]: E0320 16:01:19.975518    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(3f6c808e-d523-48bd-8ec2-28b625834317)\"" pod="openstack/watcher-decision-engine-0" podUID="3f6c808e-d523-48bd-8ec2-28b625834317"
Mar 20 16:01:21 crc kubenswrapper[4730]: I0320 16:01:21.962059    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.005629    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14cdd4b7-7a81-469f-ae2f-104b054cc583-combined-ca-bundle\") pod \"14cdd4b7-7a81-469f-ae2f-104b054cc583\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") "
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.005912    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/14cdd4b7-7a81-469f-ae2f-104b054cc583-httpd-run\") pod \"14cdd4b7-7a81-469f-ae2f-104b054cc583\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") "
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.005963    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14cdd4b7-7a81-469f-ae2f-104b054cc583-logs\") pod \"14cdd4b7-7a81-469f-ae2f-104b054cc583\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") "
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.005987    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14cdd4b7-7a81-469f-ae2f-104b054cc583-internal-tls-certs\") pod \"14cdd4b7-7a81-469f-ae2f-104b054cc583\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") "
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.006013    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14cdd4b7-7a81-469f-ae2f-104b054cc583-scripts\") pod \"14cdd4b7-7a81-469f-ae2f-104b054cc583\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") "
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.006035    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-566vl\" (UniqueName: \"kubernetes.io/projected/14cdd4b7-7a81-469f-ae2f-104b054cc583-kube-api-access-566vl\") pod \"14cdd4b7-7a81-469f-ae2f-104b054cc583\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") "
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.006081    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"14cdd4b7-7a81-469f-ae2f-104b054cc583\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") "
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.006113    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14cdd4b7-7a81-469f-ae2f-104b054cc583-config-data\") pod \"14cdd4b7-7a81-469f-ae2f-104b054cc583\" (UID: \"14cdd4b7-7a81-469f-ae2f-104b054cc583\") "
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.008485    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14cdd4b7-7a81-469f-ae2f-104b054cc583-logs" (OuterVolumeSpecName: "logs") pod "14cdd4b7-7a81-469f-ae2f-104b054cc583" (UID: "14cdd4b7-7a81-469f-ae2f-104b054cc583"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.009356    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14cdd4b7-7a81-469f-ae2f-104b054cc583-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "14cdd4b7-7a81-469f-ae2f-104b054cc583" (UID: "14cdd4b7-7a81-469f-ae2f-104b054cc583"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.021111    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14cdd4b7-7a81-469f-ae2f-104b054cc583-scripts" (OuterVolumeSpecName: "scripts") pod "14cdd4b7-7a81-469f-ae2f-104b054cc583" (UID: "14cdd4b7-7a81-469f-ae2f-104b054cc583"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.029526    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14cdd4b7-7a81-469f-ae2f-104b054cc583-kube-api-access-566vl" (OuterVolumeSpecName: "kube-api-access-566vl") pod "14cdd4b7-7a81-469f-ae2f-104b054cc583" (UID: "14cdd4b7-7a81-469f-ae2f-104b054cc583"). InnerVolumeSpecName "kube-api-access-566vl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.030846    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "14cdd4b7-7a81-469f-ae2f-104b054cc583" (UID: "14cdd4b7-7a81-469f-ae2f-104b054cc583"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.046232    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14cdd4b7-7a81-469f-ae2f-104b054cc583-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14cdd4b7-7a81-469f-ae2f-104b054cc583" (UID: "14cdd4b7-7a81-469f-ae2f-104b054cc583"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.066673    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14cdd4b7-7a81-469f-ae2f-104b054cc583-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "14cdd4b7-7a81-469f-ae2f-104b054cc583" (UID: "14cdd4b7-7a81-469f-ae2f-104b054cc583"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.103369    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14cdd4b7-7a81-469f-ae2f-104b054cc583-config-data" (OuterVolumeSpecName: "config-data") pod "14cdd4b7-7a81-469f-ae2f-104b054cc583" (UID: "14cdd4b7-7a81-469f-ae2f-104b054cc583"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.108883    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14cdd4b7-7a81-469f-ae2f-104b054cc583-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.108922    4730 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/14cdd4b7-7a81-469f-ae2f-104b054cc583-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.108933    4730 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14cdd4b7-7a81-469f-ae2f-104b054cc583-logs\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.108943    4730 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14cdd4b7-7a81-469f-ae2f-104b054cc583-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.108952    4730 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14cdd4b7-7a81-469f-ae2f-104b054cc583-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.108964    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-566vl\" (UniqueName: \"kubernetes.io/projected/14cdd4b7-7a81-469f-ae2f-104b054cc583-kube-api-access-566vl\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.108997    4730 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" "
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.109007    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14cdd4b7-7a81-469f-ae2f-104b054cc583-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.135962    4730 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc"
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.211402    4730 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.448640    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.465023    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.482508    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4a43-account-create-update-cj4kg"]
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.511658    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5dc7dd859f-wtxnj"
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.618365    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe109bf0-70d2-41d2-855c-6eb862e568b6-config-data\") pod \"fe109bf0-70d2-41d2-855c-6eb862e568b6\" (UID: \"fe109bf0-70d2-41d2-855c-6eb862e568b6\") "
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.618430    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe109bf0-70d2-41d2-855c-6eb862e568b6-combined-ca-bundle\") pod \"fe109bf0-70d2-41d2-855c-6eb862e568b6\" (UID: \"fe109bf0-70d2-41d2-855c-6eb862e568b6\") "
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.618481    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82401c9f-f5f9-4bc6-a085-c89d3632493e-combined-ca-bundle\") pod \"82401c9f-f5f9-4bc6-a085-c89d3632493e\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") "
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.618512    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/82401c9f-f5f9-4bc6-a085-c89d3632493e-httpd-run\") pod \"82401c9f-f5f9-4bc6-a085-c89d3632493e\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") "
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.618539    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdtvc\" (UniqueName: \"kubernetes.io/projected/fe109bf0-70d2-41d2-855c-6eb862e568b6-kube-api-access-vdtvc\") pod \"fe109bf0-70d2-41d2-855c-6eb862e568b6\" (UID: \"fe109bf0-70d2-41d2-855c-6eb862e568b6\") "
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.618559    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"82401c9f-f5f9-4bc6-a085-c89d3632493e\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") "
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.618578    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/82401c9f-f5f9-4bc6-a085-c89d3632493e-public-tls-certs\") pod \"82401c9f-f5f9-4bc6-a085-c89d3632493e\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") "
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.618599    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82401c9f-f5f9-4bc6-a085-c89d3632493e-logs\") pod \"82401c9f-f5f9-4bc6-a085-c89d3632493e\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") "
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.618630    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe109bf0-70d2-41d2-855c-6eb862e568b6-scripts\") pod \"fe109bf0-70d2-41d2-855c-6eb862e568b6\" (UID: \"fe109bf0-70d2-41d2-855c-6eb862e568b6\") "
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.618651    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe109bf0-70d2-41d2-855c-6eb862e568b6-sg-core-conf-yaml\") pod \"fe109bf0-70d2-41d2-855c-6eb862e568b6\" (UID: \"fe109bf0-70d2-41d2-855c-6eb862e568b6\") "
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.618686    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe109bf0-70d2-41d2-855c-6eb862e568b6-run-httpd\") pod \"fe109bf0-70d2-41d2-855c-6eb862e568b6\" (UID: \"fe109bf0-70d2-41d2-855c-6eb862e568b6\") "
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.618707    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe109bf0-70d2-41d2-855c-6eb862e568b6-log-httpd\") pod \"fe109bf0-70d2-41d2-855c-6eb862e568b6\" (UID: \"fe109bf0-70d2-41d2-855c-6eb862e568b6\") "
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.618803    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82401c9f-f5f9-4bc6-a085-c89d3632493e-scripts\") pod \"82401c9f-f5f9-4bc6-a085-c89d3632493e\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") "
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.618822    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw75x\" (UniqueName: \"kubernetes.io/projected/82401c9f-f5f9-4bc6-a085-c89d3632493e-kube-api-access-pw75x\") pod \"82401c9f-f5f9-4bc6-a085-c89d3632493e\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") "
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.618854    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82401c9f-f5f9-4bc6-a085-c89d3632493e-config-data\") pod \"82401c9f-f5f9-4bc6-a085-c89d3632493e\" (UID: \"82401c9f-f5f9-4bc6-a085-c89d3632493e\") "
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.626575    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82401c9f-f5f9-4bc6-a085-c89d3632493e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "82401c9f-f5f9-4bc6-a085-c89d3632493e" (UID: "82401c9f-f5f9-4bc6-a085-c89d3632493e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.632307    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6858c8d8f6-k4smz"]
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.632772    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6858c8d8f6-k4smz" podUID="4ed9fad7-284f-40b4-9c3b-7a213aff010a" containerName="neutron-api" containerID="cri-o://2eff1617c29a34da6021d776b5bc5c6695025819add4af253986837526af0f15" gracePeriod=30
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.633880    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6858c8d8f6-k4smz" podUID="4ed9fad7-284f-40b4-9c3b-7a213aff010a" containerName="neutron-httpd" containerID="cri-o://3635b09696560454a28e6d666babdb61696ccff059aecec39acea6122546c8aa" gracePeriod=30
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.635042    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe109bf0-70d2-41d2-855c-6eb862e568b6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fe109bf0-70d2-41d2-855c-6eb862e568b6" (UID: "fe109bf0-70d2-41d2-855c-6eb862e568b6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.635601    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82401c9f-f5f9-4bc6-a085-c89d3632493e-logs" (OuterVolumeSpecName: "logs") pod "82401c9f-f5f9-4bc6-a085-c89d3632493e" (UID: "82401c9f-f5f9-4bc6-a085-c89d3632493e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.641793    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe109bf0-70d2-41d2-855c-6eb862e568b6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fe109bf0-70d2-41d2-855c-6eb862e568b6" (UID: "fe109bf0-70d2-41d2-855c-6eb862e568b6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.663880    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe109bf0-70d2-41d2-855c-6eb862e568b6-kube-api-access-vdtvc" (OuterVolumeSpecName: "kube-api-access-vdtvc") pod "fe109bf0-70d2-41d2-855c-6eb862e568b6" (UID: "fe109bf0-70d2-41d2-855c-6eb862e568b6"). InnerVolumeSpecName "kube-api-access-vdtvc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.664197    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe109bf0-70d2-41d2-855c-6eb862e568b6-scripts" (OuterVolumeSpecName: "scripts") pod "fe109bf0-70d2-41d2-855c-6eb862e568b6" (UID: "fe109bf0-70d2-41d2-855c-6eb862e568b6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.664456    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82401c9f-f5f9-4bc6-a085-c89d3632493e-kube-api-access-pw75x" (OuterVolumeSpecName: "kube-api-access-pw75x") pod "82401c9f-f5f9-4bc6-a085-c89d3632493e" (UID: "82401c9f-f5f9-4bc6-a085-c89d3632493e"). InnerVolumeSpecName "kube-api-access-pw75x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.664883    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82401c9f-f5f9-4bc6-a085-c89d3632493e-scripts" (OuterVolumeSpecName: "scripts") pod "82401c9f-f5f9-4bc6-a085-c89d3632493e" (UID: "82401c9f-f5f9-4bc6-a085-c89d3632493e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.683428    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "82401c9f-f5f9-4bc6-a085-c89d3632493e" (UID: "82401c9f-f5f9-4bc6-a085-c89d3632493e"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.721809    4730 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/82401c9f-f5f9-4bc6-a085-c89d3632493e-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.721842    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdtvc\" (UniqueName: \"kubernetes.io/projected/fe109bf0-70d2-41d2-855c-6eb862e568b6-kube-api-access-vdtvc\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.721867    4730 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" "
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.721876    4730 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82401c9f-f5f9-4bc6-a085-c89d3632493e-logs\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.721886    4730 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe109bf0-70d2-41d2-855c-6eb862e568b6-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.722055    4730 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe109bf0-70d2-41d2-855c-6eb862e568b6-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.723490    4730 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe109bf0-70d2-41d2-855c-6eb862e568b6-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.723521    4730 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82401c9f-f5f9-4bc6-a085-c89d3632493e-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.723534    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw75x\" (UniqueName: \"kubernetes.io/projected/82401c9f-f5f9-4bc6-a085-c89d3632493e-kube-api-access-pw75x\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.726647    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82401c9f-f5f9-4bc6-a085-c89d3632493e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82401c9f-f5f9-4bc6-a085-c89d3632493e" (UID: "82401c9f-f5f9-4bc6-a085-c89d3632493e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.752111    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe109bf0-70d2-41d2-855c-6eb862e568b6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fe109bf0-70d2-41d2-855c-6eb862e568b6" (UID: "fe109bf0-70d2-41d2-855c-6eb862e568b6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.809899    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-tv4tn"]
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.823659    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-qt4mz"]
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.826223    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82401c9f-f5f9-4bc6-a085-c89d3632493e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.826293    4730 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe109bf0-70d2-41d2-855c-6eb862e568b6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.839452    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-rlp9c"]
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.850802    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2850-account-create-update-4lrrq"]
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.887001    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d0d2-account-create-update-z6v46"]
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.894222    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a893eba7-9715-4599-93c2-0365a45134e9","Type":"ContainerStarted","Data":"a9986e5fc496f5b7dd403b81f494104bc1b23a77020a7cc79e3b92ba315ed5a9"}
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.899778    4730 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc"
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.912045    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"14cdd4b7-7a81-469f-ae2f-104b054cc583","Type":"ContainerDied","Data":"656641958e552f93316749e193bcc1be09eafd97508856daa5fede226eb204fa"}
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.912122    4730 scope.go:117] "RemoveContainer" containerID="8f1291a8157b5fac6e0adc6431d157cb584ae26777485369c5a717b5d22da62c"
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.912338    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.915455    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4a43-account-create-update-cj4kg" event={"ID":"7de61c5d-53ba-4d26-9a79-b82c2bc3b779","Type":"ContainerStarted","Data":"37c4808f93e82a0a4d50cb5905a0dc855da5adcf2e92087cb8a7b2e04fe7f5b5"}
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.928265    4730 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.931357    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7c5c8ffdd9-xpfhf"]
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.940241    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.650580293 podStartE2EDuration="13.940218388s" podCreationTimestamp="2026-03-20 16:01:09 +0000 UTC" firstStartedPulling="2026-03-20 16:01:10.215893499 +0000 UTC m=+1329.429264868" lastFinishedPulling="2026-03-20 16:01:21.505531594 +0000 UTC m=+1340.718902963" observedRunningTime="2026-03-20 16:01:22.911296991 +0000 UTC m=+1342.124668360" watchObservedRunningTime="2026-03-20 16:01:22.940218388 +0000 UTC m=+1342.153589757"
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.942374    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"82401c9f-f5f9-4bc6-a085-c89d3632493e","Type":"ContainerDied","Data":"81ccba513b601e62e75f217051576c1c723201231f3534205e6403a320c44aa9"}
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.942460    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.955465    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe109bf0-70d2-41d2-855c-6eb862e568b6","Type":"ContainerDied","Data":"4d621236dd54101584c726bca47e76d324d3fb96b4a4404920b5e18a6a4fbb39"}
Mar 20 16:01:22 crc kubenswrapper[4730]: I0320 16:01:22.955780    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.032052    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.051124    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.091146    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82401c9f-f5f9-4bc6-a085-c89d3632493e-config-data" (OuterVolumeSpecName: "config-data") pod "82401c9f-f5f9-4bc6-a085-c89d3632493e" (UID: "82401c9f-f5f9-4bc6-a085-c89d3632493e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:01:23 crc kubenswrapper[4730]: W0320 16:01:23.099526    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod383cf79a_0636_4175_bcf8_7e369f101901.slice/crio-e6bf2254b89d522962de84d44ec16f569df85b053c9d15be0ca828bf3c222139 WatchSource:0}: Error finding container e6bf2254b89d522962de84d44ec16f569df85b053c9d15be0ca828bf3c222139: Status 404 returned error can't find the container with id e6bf2254b89d522962de84d44ec16f569df85b053c9d15be0ca828bf3c222139
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.099556    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82401c9f-f5f9-4bc6-a085-c89d3632493e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "82401c9f-f5f9-4bc6-a085-c89d3632493e" (UID: "82401c9f-f5f9-4bc6-a085-c89d3632493e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.116058    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 16:01:23 crc kubenswrapper[4730]: E0320 16:01:23.116698    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14cdd4b7-7a81-469f-ae2f-104b054cc583" containerName="glance-httpd"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.116771    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="14cdd4b7-7a81-469f-ae2f-104b054cc583" containerName="glance-httpd"
Mar 20 16:01:23 crc kubenswrapper[4730]: E0320 16:01:23.116783    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82401c9f-f5f9-4bc6-a085-c89d3632493e" containerName="glance-httpd"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.116827    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="82401c9f-f5f9-4bc6-a085-c89d3632493e" containerName="glance-httpd"
Mar 20 16:01:23 crc kubenswrapper[4730]: E0320 16:01:23.116856    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe109bf0-70d2-41d2-855c-6eb862e568b6" containerName="ceilometer-central-agent"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.116864    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe109bf0-70d2-41d2-855c-6eb862e568b6" containerName="ceilometer-central-agent"
Mar 20 16:01:23 crc kubenswrapper[4730]: E0320 16:01:23.116918    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe109bf0-70d2-41d2-855c-6eb862e568b6" containerName="proxy-httpd"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.116926    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe109bf0-70d2-41d2-855c-6eb862e568b6" containerName="proxy-httpd"
Mar 20 16:01:23 crc kubenswrapper[4730]: E0320 16:01:23.116953    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14cdd4b7-7a81-469f-ae2f-104b054cc583" containerName="glance-log"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.116960    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="14cdd4b7-7a81-469f-ae2f-104b054cc583" containerName="glance-log"
Mar 20 16:01:23 crc kubenswrapper[4730]: E0320 16:01:23.116968    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe109bf0-70d2-41d2-855c-6eb862e568b6" containerName="sg-core"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.116974    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe109bf0-70d2-41d2-855c-6eb862e568b6" containerName="sg-core"
Mar 20 16:01:23 crc kubenswrapper[4730]: E0320 16:01:23.116988    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82401c9f-f5f9-4bc6-a085-c89d3632493e" containerName="glance-log"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.116994    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="82401c9f-f5f9-4bc6-a085-c89d3632493e" containerName="glance-log"
Mar 20 16:01:23 crc kubenswrapper[4730]: E0320 16:01:23.117004    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe109bf0-70d2-41d2-855c-6eb862e568b6" containerName="ceilometer-notification-agent"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.117010    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe109bf0-70d2-41d2-855c-6eb862e568b6" containerName="ceilometer-notification-agent"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.117175    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="82401c9f-f5f9-4bc6-a085-c89d3632493e" containerName="glance-log"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.117187    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="14cdd4b7-7a81-469f-ae2f-104b054cc583" containerName="glance-httpd"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.117196    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe109bf0-70d2-41d2-855c-6eb862e568b6" containerName="ceilometer-central-agent"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.117209    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="14cdd4b7-7a81-469f-ae2f-104b054cc583" containerName="glance-log"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.117223    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe109bf0-70d2-41d2-855c-6eb862e568b6" containerName="sg-core"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.117237    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="82401c9f-f5f9-4bc6-a085-c89d3632493e" containerName="glance-httpd"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.117258    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe109bf0-70d2-41d2-855c-6eb862e568b6" containerName="proxy-httpd"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.117269    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe109bf0-70d2-41d2-855c-6eb862e568b6" containerName="ceilometer-notification-agent"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.118396    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.120993    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.121195    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.125391    4730 scope.go:117] "RemoveContainer" containerID="1eef3a4c57a302c49282046a25ce9b6b686742d1068d8539a0c4f898222c31dc"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.133403    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82401c9f-f5f9-4bc6-a085-c89d3632493e-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.133425    4730 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/82401c9f-f5f9-4bc6-a085-c89d3632493e-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.156013    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.156909    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe109bf0-70d2-41d2-855c-6eb862e568b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe109bf0-70d2-41d2-855c-6eb862e568b6" (UID: "fe109bf0-70d2-41d2-855c-6eb862e568b6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.236198    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84366eea-e5f9-43da-ac65-8e79cb659c0a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"84366eea-e5f9-43da-ac65-8e79cb659c0a\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.237061    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84366eea-e5f9-43da-ac65-8e79cb659c0a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"84366eea-e5f9-43da-ac65-8e79cb659c0a\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.237091    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84366eea-e5f9-43da-ac65-8e79cb659c0a-logs\") pod \"glance-default-internal-api-0\" (UID: \"84366eea-e5f9-43da-ac65-8e79cb659c0a\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.237137    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84366eea-e5f9-43da-ac65-8e79cb659c0a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"84366eea-e5f9-43da-ac65-8e79cb659c0a\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.237292    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/84366eea-e5f9-43da-ac65-8e79cb659c0a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"84366eea-e5f9-43da-ac65-8e79cb659c0a\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.237370    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j82ft\" (UniqueName: \"kubernetes.io/projected/84366eea-e5f9-43da-ac65-8e79cb659c0a-kube-api-access-j82ft\") pod \"glance-default-internal-api-0\" (UID: \"84366eea-e5f9-43da-ac65-8e79cb659c0a\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.237450    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"84366eea-e5f9-43da-ac65-8e79cb659c0a\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.237497    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84366eea-e5f9-43da-ac65-8e79cb659c0a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"84366eea-e5f9-43da-ac65-8e79cb659c0a\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.237645    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe109bf0-70d2-41d2-855c-6eb862e568b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.318337    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe109bf0-70d2-41d2-855c-6eb862e568b6-config-data" (OuterVolumeSpecName: "config-data") pod "fe109bf0-70d2-41d2-855c-6eb862e568b6" (UID: "fe109bf0-70d2-41d2-855c-6eb862e568b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.341207    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84366eea-e5f9-43da-ac65-8e79cb659c0a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"84366eea-e5f9-43da-ac65-8e79cb659c0a\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.341293    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84366eea-e5f9-43da-ac65-8e79cb659c0a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"84366eea-e5f9-43da-ac65-8e79cb659c0a\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.341320    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84366eea-e5f9-43da-ac65-8e79cb659c0a-logs\") pod \"glance-default-internal-api-0\" (UID: \"84366eea-e5f9-43da-ac65-8e79cb659c0a\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.341364    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84366eea-e5f9-43da-ac65-8e79cb659c0a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"84366eea-e5f9-43da-ac65-8e79cb659c0a\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.341479    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/84366eea-e5f9-43da-ac65-8e79cb659c0a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"84366eea-e5f9-43da-ac65-8e79cb659c0a\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.341544    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j82ft\" (UniqueName: \"kubernetes.io/projected/84366eea-e5f9-43da-ac65-8e79cb659c0a-kube-api-access-j82ft\") pod \"glance-default-internal-api-0\" (UID: \"84366eea-e5f9-43da-ac65-8e79cb659c0a\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.341585    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"84366eea-e5f9-43da-ac65-8e79cb659c0a\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.341626    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84366eea-e5f9-43da-ac65-8e79cb659c0a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"84366eea-e5f9-43da-ac65-8e79cb659c0a\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.341704    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe109bf0-70d2-41d2-855c-6eb862e568b6-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.345034    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/84366eea-e5f9-43da-ac65-8e79cb659c0a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"84366eea-e5f9-43da-ac65-8e79cb659c0a\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.345463    4730 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"84366eea-e5f9-43da-ac65-8e79cb659c0a\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.348052    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84366eea-e5f9-43da-ac65-8e79cb659c0a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"84366eea-e5f9-43da-ac65-8e79cb659c0a\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.348311    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84366eea-e5f9-43da-ac65-8e79cb659c0a-logs\") pod \"glance-default-internal-api-0\" (UID: \"84366eea-e5f9-43da-ac65-8e79cb659c0a\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.352000    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84366eea-e5f9-43da-ac65-8e79cb659c0a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"84366eea-e5f9-43da-ac65-8e79cb659c0a\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.352236    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84366eea-e5f9-43da-ac65-8e79cb659c0a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"84366eea-e5f9-43da-ac65-8e79cb659c0a\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.363915    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84366eea-e5f9-43da-ac65-8e79cb659c0a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"84366eea-e5f9-43da-ac65-8e79cb659c0a\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.369726    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j82ft\" (UniqueName: \"kubernetes.io/projected/84366eea-e5f9-43da-ac65-8e79cb659c0a-kube-api-access-j82ft\") pod \"glance-default-internal-api-0\" (UID: \"84366eea-e5f9-43da-ac65-8e79cb659c0a\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.397302    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"84366eea-e5f9-43da-ac65-8e79cb659c0a\") " pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.515137    4730 scope.go:117] "RemoveContainer" containerID="68b0b2752749a64e7ce292cfa6aabcc6400dcc4552e165b090d093ce63fe5a35"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.516858    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.556406    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14cdd4b7-7a81-469f-ae2f-104b054cc583" path="/var/lib/kubelet/pods/14cdd4b7-7a81-469f-ae2f-104b054cc583/volumes"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.567740    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.591184    4730 scope.go:117] "RemoveContainer" containerID="d342f4f373c3a46fb291e5160cd15525a5be1018e68f010945cfba9de11fd3fe"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.597703    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.622801    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.675176    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.692352    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.698679    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.698907    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.773581    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pq6z\" (UniqueName: \"kubernetes.io/projected/47ed5bd7-7aa8-4f16-98de-f09e21218ae6-kube-api-access-4pq6z\") pod \"glance-default-external-api-0\" (UID: \"47ed5bd7-7aa8-4f16-98de-f09e21218ae6\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.773659    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47ed5bd7-7aa8-4f16-98de-f09e21218ae6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"47ed5bd7-7aa8-4f16-98de-f09e21218ae6\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.773680    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47ed5bd7-7aa8-4f16-98de-f09e21218ae6-config-data\") pod \"glance-default-external-api-0\" (UID: \"47ed5bd7-7aa8-4f16-98de-f09e21218ae6\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.773725    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47ed5bd7-7aa8-4f16-98de-f09e21218ae6-scripts\") pod \"glance-default-external-api-0\" (UID: \"47ed5bd7-7aa8-4f16-98de-f09e21218ae6\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.773784    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/47ed5bd7-7aa8-4f16-98de-f09e21218ae6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"47ed5bd7-7aa8-4f16-98de-f09e21218ae6\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.773812    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47ed5bd7-7aa8-4f16-98de-f09e21218ae6-logs\") pod \"glance-default-external-api-0\" (UID: \"47ed5bd7-7aa8-4f16-98de-f09e21218ae6\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.773855    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"47ed5bd7-7aa8-4f16-98de-f09e21218ae6\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.773881    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47ed5bd7-7aa8-4f16-98de-f09e21218ae6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"47ed5bd7-7aa8-4f16-98de-f09e21218ae6\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.775429    4730 scope.go:117] "RemoveContainer" containerID="73bfac78764299ab2b8b9302430bca90ecedaa10df4b93eb8c976fe0582a6af1"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.878387    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/47ed5bd7-7aa8-4f16-98de-f09e21218ae6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"47ed5bd7-7aa8-4f16-98de-f09e21218ae6\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.878450    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47ed5bd7-7aa8-4f16-98de-f09e21218ae6-logs\") pod \"glance-default-external-api-0\" (UID: \"47ed5bd7-7aa8-4f16-98de-f09e21218ae6\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.878519    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"47ed5bd7-7aa8-4f16-98de-f09e21218ae6\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.878556    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47ed5bd7-7aa8-4f16-98de-f09e21218ae6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"47ed5bd7-7aa8-4f16-98de-f09e21218ae6\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.878584    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pq6z\" (UniqueName: \"kubernetes.io/projected/47ed5bd7-7aa8-4f16-98de-f09e21218ae6-kube-api-access-4pq6z\") pod \"glance-default-external-api-0\" (UID: \"47ed5bd7-7aa8-4f16-98de-f09e21218ae6\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.878645    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47ed5bd7-7aa8-4f16-98de-f09e21218ae6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"47ed5bd7-7aa8-4f16-98de-f09e21218ae6\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.878669    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47ed5bd7-7aa8-4f16-98de-f09e21218ae6-config-data\") pod \"glance-default-external-api-0\" (UID: \"47ed5bd7-7aa8-4f16-98de-f09e21218ae6\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.878727    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47ed5bd7-7aa8-4f16-98de-f09e21218ae6-scripts\") pod \"glance-default-external-api-0\" (UID: \"47ed5bd7-7aa8-4f16-98de-f09e21218ae6\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.879824    4730 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"47ed5bd7-7aa8-4f16-98de-f09e21218ae6\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.881032    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47ed5bd7-7aa8-4f16-98de-f09e21218ae6-logs\") pod \"glance-default-external-api-0\" (UID: \"47ed5bd7-7aa8-4f16-98de-f09e21218ae6\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.881278    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/47ed5bd7-7aa8-4f16-98de-f09e21218ae6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"47ed5bd7-7aa8-4f16-98de-f09e21218ae6\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.905947    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47ed5bd7-7aa8-4f16-98de-f09e21218ae6-config-data\") pod \"glance-default-external-api-0\" (UID: \"47ed5bd7-7aa8-4f16-98de-f09e21218ae6\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.907882    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pq6z\" (UniqueName: \"kubernetes.io/projected/47ed5bd7-7aa8-4f16-98de-f09e21218ae6-kube-api-access-4pq6z\") pod \"glance-default-external-api-0\" (UID: \"47ed5bd7-7aa8-4f16-98de-f09e21218ae6\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.908329    4730 scope.go:117] "RemoveContainer" containerID="786b95c139b35cbde05b7814738f76287b10b59ef0dc76fe0f5bcee037ab03c4"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.915978    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47ed5bd7-7aa8-4f16-98de-f09e21218ae6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"47ed5bd7-7aa8-4f16-98de-f09e21218ae6\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.917160    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47ed5bd7-7aa8-4f16-98de-f09e21218ae6-scripts\") pod \"glance-default-external-api-0\" (UID: \"47ed5bd7-7aa8-4f16-98de-f09e21218ae6\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.918367    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47ed5bd7-7aa8-4f16-98de-f09e21218ae6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"47ed5bd7-7aa8-4f16-98de-f09e21218ae6\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.959383    4730 scope.go:117] "RemoveContainer" containerID="8853aa1f17e6388ce020212c8d73958c09bbf6fcc38c4d043313ee458cbde4ad"
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.984838    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf" event={"ID":"b9780622-27f3-4339-8107-321feed5e25b","Type":"ContainerStarted","Data":"6737220c431a730c25e8b5fa82bece6085b864fef5ef8bc86899d62afa13f2b7"}
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.995426    4730 generic.go:334] "Generic (PLEG): container finished" podID="4ed9fad7-284f-40b4-9c3b-7a213aff010a" containerID="3635b09696560454a28e6d666babdb61696ccff059aecec39acea6122546c8aa" exitCode=0
Mar 20 16:01:23 crc kubenswrapper[4730]: I0320 16:01:23.995727    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6858c8d8f6-k4smz" event={"ID":"4ed9fad7-284f-40b4-9c3b-7a213aff010a","Type":"ContainerDied","Data":"3635b09696560454a28e6d666babdb61696ccff059aecec39acea6122546c8aa"}
Mar 20 16:01:24 crc kubenswrapper[4730]: I0320 16:01:24.001297    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4a43-account-create-update-cj4kg" event={"ID":"7de61c5d-53ba-4d26-9a79-b82c2bc3b779","Type":"ContainerStarted","Data":"b8f3077acd6da12cfe1f43474ad395781d9175a1c666f3763b8d16340af465ed"}
Mar 20 16:01:24 crc kubenswrapper[4730]: I0320 16:01:24.006756    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d0d2-account-create-update-z6v46" event={"ID":"383cf79a-0636-4175-bcf8-7e369f101901","Type":"ContainerStarted","Data":"e6bf2254b89d522962de84d44ec16f569df85b053c9d15be0ca828bf3c222139"}
Mar 20 16:01:24 crc kubenswrapper[4730]: I0320 16:01:24.015771    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tv4tn" event={"ID":"dac41622-7c80-4fce-a5ac-8a04d301669d","Type":"ContainerStarted","Data":"0bc2c745fdc03e2ebaee7279abc3bfbcbbdcf758f441e201c8581df04a4b242e"}
Mar 20 16:01:24 crc kubenswrapper[4730]: I0320 16:01:24.033731    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-qt4mz" event={"ID":"6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1","Type":"ContainerStarted","Data":"355a75fe7ba277cb711131a013e942d383afbd409506d81ea0e5c0f0045fafaa"}
Mar 20 16:01:24 crc kubenswrapper[4730]: I0320 16:01:24.040459    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rlp9c" event={"ID":"475a52ba-bc8d-4c7b-ae99-330d6ec2b358","Type":"ContainerStarted","Data":"ce528fba4f5580657123f969c759921b847c901bd39a64e8b6b06d44553bfc4a"}
Mar 20 16:01:24 crc kubenswrapper[4730]: I0320 16:01:24.048266    4730 scope.go:117] "RemoveContainer" containerID="c73d9cedf11e6a3a273cb136651dcbf0aa00b21555ddca3a8c2b2551a6375a21"
Mar 20 16:01:24 crc kubenswrapper[4730]: I0320 16:01:24.054658    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2850-account-create-update-4lrrq" event={"ID":"3f625a9e-a940-476b-85b2-ff54c5e87785","Type":"ContainerStarted","Data":"3216fe865866f7179d8f43203a7e55e15d5d0647158c4d882b6846c9c28b49d1"}
Mar 20 16:01:24 crc kubenswrapper[4730]: I0320 16:01:24.344492    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"47ed5bd7-7aa8-4f16-98de-f09e21218ae6\") " pod="openstack/glance-default-external-api-0"
Mar 20 16:01:24 crc kubenswrapper[4730]: I0320 16:01:24.429011    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 16:01:24 crc kubenswrapper[4730]: W0320 16:01:24.450501    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84366eea_e5f9_43da_ac65_8e79cb659c0a.slice/crio-f66b063c9a11d8bc88c8ceb8816f91a77a1c4b1fc4469fc7bcec47c61420ab84 WatchSource:0}: Error finding container f66b063c9a11d8bc88c8ceb8816f91a77a1c4b1fc4469fc7bcec47c61420ab84: Status 404 returned error can't find the container with id f66b063c9a11d8bc88c8ceb8816f91a77a1c4b1fc4469fc7bcec47c61420ab84
Mar 20 16:01:24 crc kubenswrapper[4730]: I0320 16:01:24.509353    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 16:01:25 crc kubenswrapper[4730]: I0320 16:01:25.069355    4730 generic.go:334] "Generic (PLEG): container finished" podID="dac41622-7c80-4fce-a5ac-8a04d301669d" containerID="fa7fa1c1a12965d7d645639e89c04d923a3d343c7729667128d205eaaba9942e" exitCode=0
Mar 20 16:01:25 crc kubenswrapper[4730]: I0320 16:01:25.069646    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tv4tn" event={"ID":"dac41622-7c80-4fce-a5ac-8a04d301669d","Type":"ContainerDied","Data":"fa7fa1c1a12965d7d645639e89c04d923a3d343c7729667128d205eaaba9942e"}
Mar 20 16:01:25 crc kubenswrapper[4730]: I0320 16:01:25.072546    4730 generic.go:334] "Generic (PLEG): container finished" podID="475a52ba-bc8d-4c7b-ae99-330d6ec2b358" containerID="be1153307c9e28a344ac73169445af77d8ee3c7d9c2256c03916bd83fc0e8437" exitCode=0
Mar 20 16:01:25 crc kubenswrapper[4730]: I0320 16:01:25.072591    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rlp9c" event={"ID":"475a52ba-bc8d-4c7b-ae99-330d6ec2b358","Type":"ContainerDied","Data":"be1153307c9e28a344ac73169445af77d8ee3c7d9c2256c03916bd83fc0e8437"}
Mar 20 16:01:25 crc kubenswrapper[4730]: I0320 16:01:25.074922    4730 generic.go:334] "Generic (PLEG): container finished" podID="383cf79a-0636-4175-bcf8-7e369f101901" containerID="f2c396b5999dcbacb34f0cb38c776c483344cb3d6a6925954ac69d2fbac35de7" exitCode=0
Mar 20 16:01:25 crc kubenswrapper[4730]: I0320 16:01:25.075032    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d0d2-account-create-update-z6v46" event={"ID":"383cf79a-0636-4175-bcf8-7e369f101901","Type":"ContainerDied","Data":"f2c396b5999dcbacb34f0cb38c776c483344cb3d6a6925954ac69d2fbac35de7"}
Mar 20 16:01:25 crc kubenswrapper[4730]: I0320 16:01:25.077068    4730 generic.go:334] "Generic (PLEG): container finished" podID="6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1" containerID="0d50b068c846deeebd08139bc4d49513ed414820309b36a4108d6ffa43871b84" exitCode=0
Mar 20 16:01:25 crc kubenswrapper[4730]: I0320 16:01:25.077143    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-qt4mz" event={"ID":"6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1","Type":"ContainerDied","Data":"0d50b068c846deeebd08139bc4d49513ed414820309b36a4108d6ffa43871b84"}
Mar 20 16:01:25 crc kubenswrapper[4730]: I0320 16:01:25.080227    4730 generic.go:334] "Generic (PLEG): container finished" podID="3f625a9e-a940-476b-85b2-ff54c5e87785" containerID="3a26f3f5793abd65e69907fa90ac71e2abaa4ea13397a929de033f2bbf59a251" exitCode=0
Mar 20 16:01:25 crc kubenswrapper[4730]: I0320 16:01:25.080286    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2850-account-create-update-4lrrq" event={"ID":"3f625a9e-a940-476b-85b2-ff54c5e87785","Type":"ContainerDied","Data":"3a26f3f5793abd65e69907fa90ac71e2abaa4ea13397a929de033f2bbf59a251"}
Mar 20 16:01:25 crc kubenswrapper[4730]: I0320 16:01:25.098340    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf" event={"ID":"b9780622-27f3-4339-8107-321feed5e25b","Type":"ContainerStarted","Data":"4fcf792521814c19293591e5848d25f6975cd1c658448c02e673fd07255574a6"}
Mar 20 16:01:25 crc kubenswrapper[4730]: I0320 16:01:25.098381    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf" event={"ID":"b9780622-27f3-4339-8107-321feed5e25b","Type":"ContainerStarted","Data":"d59a1350a5c641d443411375443c080b305a5acd7fdd08f8f8e72ad5b37fd568"}
Mar 20 16:01:25 crc kubenswrapper[4730]: I0320 16:01:25.098791    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf"
Mar 20 16:01:25 crc kubenswrapper[4730]: I0320 16:01:25.098944    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf"
Mar 20 16:01:25 crc kubenswrapper[4730]: I0320 16:01:25.100316    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"84366eea-e5f9-43da-ac65-8e79cb659c0a","Type":"ContainerStarted","Data":"f66b063c9a11d8bc88c8ceb8816f91a77a1c4b1fc4469fc7bcec47c61420ab84"}
Mar 20 16:01:25 crc kubenswrapper[4730]: I0320 16:01:25.106725    4730 generic.go:334] "Generic (PLEG): container finished" podID="7de61c5d-53ba-4d26-9a79-b82c2bc3b779" containerID="b8f3077acd6da12cfe1f43474ad395781d9175a1c666f3763b8d16340af465ed" exitCode=0
Mar 20 16:01:25 crc kubenswrapper[4730]: I0320 16:01:25.106779    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4a43-account-create-update-cj4kg" event={"ID":"7de61c5d-53ba-4d26-9a79-b82c2bc3b779","Type":"ContainerDied","Data":"b8f3077acd6da12cfe1f43474ad395781d9175a1c666f3763b8d16340af465ed"}
Mar 20 16:01:25 crc kubenswrapper[4730]: I0320 16:01:25.238583    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf" podStartSLOduration=9.238560098 podStartE2EDuration="9.238560098s" podCreationTimestamp="2026-03-20 16:01:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:01:25.219757047 +0000 UTC m=+1344.433128416" watchObservedRunningTime="2026-03-20 16:01:25.238560098 +0000 UTC m=+1344.451931467"
Mar 20 16:01:25 crc kubenswrapper[4730]: I0320 16:01:25.283692    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 16:01:25 crc kubenswrapper[4730]: I0320 16:01:25.561474    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82401c9f-f5f9-4bc6-a085-c89d3632493e" path="/var/lib/kubelet/pods/82401c9f-f5f9-4bc6-a085-c89d3632493e/volumes"
Mar 20 16:01:25 crc kubenswrapper[4730]: I0320 16:01:25.677833    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4a43-account-create-update-cj4kg"
Mar 20 16:01:25 crc kubenswrapper[4730]: I0320 16:01:25.825147    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdzxp\" (UniqueName: \"kubernetes.io/projected/7de61c5d-53ba-4d26-9a79-b82c2bc3b779-kube-api-access-rdzxp\") pod \"7de61c5d-53ba-4d26-9a79-b82c2bc3b779\" (UID: \"7de61c5d-53ba-4d26-9a79-b82c2bc3b779\") "
Mar 20 16:01:25 crc kubenswrapper[4730]: I0320 16:01:25.825318    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7de61c5d-53ba-4d26-9a79-b82c2bc3b779-operator-scripts\") pod \"7de61c5d-53ba-4d26-9a79-b82c2bc3b779\" (UID: \"7de61c5d-53ba-4d26-9a79-b82c2bc3b779\") "
Mar 20 16:01:25 crc kubenswrapper[4730]: I0320 16:01:25.826513    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7de61c5d-53ba-4d26-9a79-b82c2bc3b779-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7de61c5d-53ba-4d26-9a79-b82c2bc3b779" (UID: "7de61c5d-53ba-4d26-9a79-b82c2bc3b779"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:01:25 crc kubenswrapper[4730]: I0320 16:01:25.833061    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7de61c5d-53ba-4d26-9a79-b82c2bc3b779-kube-api-access-rdzxp" (OuterVolumeSpecName: "kube-api-access-rdzxp") pod "7de61c5d-53ba-4d26-9a79-b82c2bc3b779" (UID: "7de61c5d-53ba-4d26-9a79-b82c2bc3b779"). InnerVolumeSpecName "kube-api-access-rdzxp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:01:25 crc kubenswrapper[4730]: I0320 16:01:25.929484    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdzxp\" (UniqueName: \"kubernetes.io/projected/7de61c5d-53ba-4d26-9a79-b82c2bc3b779-kube-api-access-rdzxp\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:25 crc kubenswrapper[4730]: I0320 16:01:25.929513    4730 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7de61c5d-53ba-4d26-9a79-b82c2bc3b779-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:26 crc kubenswrapper[4730]: I0320 16:01:26.118289    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"84366eea-e5f9-43da-ac65-8e79cb659c0a","Type":"ContainerStarted","Data":"779e7477c07bb8fb52d96b4fbf4853ffaaf30fb37f63820812b07120ae5f29d8"}
Mar 20 16:01:26 crc kubenswrapper[4730]: I0320 16:01:26.119853    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4a43-account-create-update-cj4kg" event={"ID":"7de61c5d-53ba-4d26-9a79-b82c2bc3b779","Type":"ContainerDied","Data":"37c4808f93e82a0a4d50cb5905a0dc855da5adcf2e92087cb8a7b2e04fe7f5b5"}
Mar 20 16:01:26 crc kubenswrapper[4730]: I0320 16:01:26.120180    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37c4808f93e82a0a4d50cb5905a0dc855da5adcf2e92087cb8a7b2e04fe7f5b5"
Mar 20 16:01:26 crc kubenswrapper[4730]: I0320 16:01:26.120233    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4a43-account-create-update-cj4kg"
Mar 20 16:01:26 crc kubenswrapper[4730]: I0320 16:01:26.121943    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"47ed5bd7-7aa8-4f16-98de-f09e21218ae6","Type":"ContainerStarted","Data":"fd58701a403de7bf0ddedef18266cb425b3027c016fae3e50be9328963693994"}
Mar 20 16:01:26 crc kubenswrapper[4730]: I0320 16:01:26.671796    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d0d2-account-create-update-z6v46"
Mar 20 16:01:26 crc kubenswrapper[4730]: I0320 16:01:26.865865    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/383cf79a-0636-4175-bcf8-7e369f101901-operator-scripts\") pod \"383cf79a-0636-4175-bcf8-7e369f101901\" (UID: \"383cf79a-0636-4175-bcf8-7e369f101901\") "
Mar 20 16:01:26 crc kubenswrapper[4730]: I0320 16:01:26.866010    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkr7g\" (UniqueName: \"kubernetes.io/projected/383cf79a-0636-4175-bcf8-7e369f101901-kube-api-access-fkr7g\") pod \"383cf79a-0636-4175-bcf8-7e369f101901\" (UID: \"383cf79a-0636-4175-bcf8-7e369f101901\") "
Mar 20 16:01:26 crc kubenswrapper[4730]: I0320 16:01:26.866567    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/383cf79a-0636-4175-bcf8-7e369f101901-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "383cf79a-0636-4175-bcf8-7e369f101901" (UID: "383cf79a-0636-4175-bcf8-7e369f101901"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:01:26 crc kubenswrapper[4730]: I0320 16:01:26.866869    4730 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/383cf79a-0636-4175-bcf8-7e369f101901-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:26 crc kubenswrapper[4730]: I0320 16:01:26.891090    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/383cf79a-0636-4175-bcf8-7e369f101901-kube-api-access-fkr7g" (OuterVolumeSpecName: "kube-api-access-fkr7g") pod "383cf79a-0636-4175-bcf8-7e369f101901" (UID: "383cf79a-0636-4175-bcf8-7e369f101901"). InnerVolumeSpecName "kube-api-access-fkr7g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:01:26 crc kubenswrapper[4730]: I0320 16:01:26.968484    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkr7g\" (UniqueName: \"kubernetes.io/projected/383cf79a-0636-4175-bcf8-7e369f101901-kube-api-access-fkr7g\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.040743    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tv4tn"
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.082063    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b68m2\" (UniqueName: \"kubernetes.io/projected/dac41622-7c80-4fce-a5ac-8a04d301669d-kube-api-access-b68m2\") pod \"dac41622-7c80-4fce-a5ac-8a04d301669d\" (UID: \"dac41622-7c80-4fce-a5ac-8a04d301669d\") "
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.082643    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dac41622-7c80-4fce-a5ac-8a04d301669d-operator-scripts\") pod \"dac41622-7c80-4fce-a5ac-8a04d301669d\" (UID: \"dac41622-7c80-4fce-a5ac-8a04d301669d\") "
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.083582    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dac41622-7c80-4fce-a5ac-8a04d301669d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dac41622-7c80-4fce-a5ac-8a04d301669d" (UID: "dac41622-7c80-4fce-a5ac-8a04d301669d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.098428    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dac41622-7c80-4fce-a5ac-8a04d301669d-kube-api-access-b68m2" (OuterVolumeSpecName: "kube-api-access-b68m2") pod "dac41622-7c80-4fce-a5ac-8a04d301669d" (UID: "dac41622-7c80-4fce-a5ac-8a04d301669d"). InnerVolumeSpecName "kube-api-access-b68m2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.183869    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b68m2\" (UniqueName: \"kubernetes.io/projected/dac41622-7c80-4fce-a5ac-8a04d301669d-kube-api-access-b68m2\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.183905    4730 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dac41622-7c80-4fce-a5ac-8a04d301669d-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.193564    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-rlp9c"
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.205927    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tv4tn" event={"ID":"dac41622-7c80-4fce-a5ac-8a04d301669d","Type":"ContainerDied","Data":"0bc2c745fdc03e2ebaee7279abc3bfbcbbdcf758f441e201c8581df04a4b242e"}
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.205965    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bc2c745fdc03e2ebaee7279abc3bfbcbbdcf758f441e201c8581df04a4b242e"
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.206025    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tv4tn"
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.212574    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rlp9c" event={"ID":"475a52ba-bc8d-4c7b-ae99-330d6ec2b358","Type":"ContainerDied","Data":"ce528fba4f5580657123f969c759921b847c901bd39a64e8b6b06d44553bfc4a"}
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.212631    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce528fba4f5580657123f969c759921b847c901bd39a64e8b6b06d44553bfc4a"
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.212708    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-rlp9c"
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.220234    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"84366eea-e5f9-43da-ac65-8e79cb659c0a","Type":"ContainerStarted","Data":"996f0b7f4e3a654b9192e645695b74025c495451221958d0da443f5f7189c82a"}
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.262522    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"47ed5bd7-7aa8-4f16-98de-f09e21218ae6","Type":"ContainerStarted","Data":"ce34938fef5613dae9cf7087ec34e9ba6875fd8831873c932828c58d94c6b2e6"}
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.265773    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d0d2-account-create-update-z6v46" event={"ID":"383cf79a-0636-4175-bcf8-7e369f101901","Type":"ContainerDied","Data":"e6bf2254b89d522962de84d44ec16f569df85b053c9d15be0ca828bf3c222139"}
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.265807    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6bf2254b89d522962de84d44ec16f569df85b053c9d15be0ca828bf3c222139"
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.265864    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d0d2-account-create-update-z6v46"
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.278691    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.278673941 podStartE2EDuration="5.278673941s" podCreationTimestamp="2026-03-20 16:01:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:01:27.271942771 +0000 UTC m=+1346.485314130" watchObservedRunningTime="2026-03-20 16:01:27.278673941 +0000 UTC m=+1346.492045310"
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.282006    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2850-account-create-update-4lrrq" event={"ID":"3f625a9e-a940-476b-85b2-ff54c5e87785","Type":"ContainerDied","Data":"3216fe865866f7179d8f43203a7e55e15d5d0647158c4d882b6846c9c28b49d1"}
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.282046    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3216fe865866f7179d8f43203a7e55e15d5d0647158c4d882b6846c9c28b49d1"
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.296840    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-qt4mz" event={"ID":"6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1","Type":"ContainerDied","Data":"355a75fe7ba277cb711131a013e942d383afbd409506d81ea0e5c0f0045fafaa"}
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.296883    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="355a75fe7ba277cb711131a013e942d383afbd409506d81ea0e5c0f0045fafaa"
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.299799    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2850-account-create-update-4lrrq"
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.314221    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-qt4mz"
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.389922    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/475a52ba-bc8d-4c7b-ae99-330d6ec2b358-operator-scripts\") pod \"475a52ba-bc8d-4c7b-ae99-330d6ec2b358\" (UID: \"475a52ba-bc8d-4c7b-ae99-330d6ec2b358\") "
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.390021    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rq8d\" (UniqueName: \"kubernetes.io/projected/475a52ba-bc8d-4c7b-ae99-330d6ec2b358-kube-api-access-7rq8d\") pod \"475a52ba-bc8d-4c7b-ae99-330d6ec2b358\" (UID: \"475a52ba-bc8d-4c7b-ae99-330d6ec2b358\") "
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.390804    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/475a52ba-bc8d-4c7b-ae99-330d6ec2b358-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "475a52ba-bc8d-4c7b-ae99-330d6ec2b358" (UID: "475a52ba-bc8d-4c7b-ae99-330d6ec2b358"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.406843    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/475a52ba-bc8d-4c7b-ae99-330d6ec2b358-kube-api-access-7rq8d" (OuterVolumeSpecName: "kube-api-access-7rq8d") pod "475a52ba-bc8d-4c7b-ae99-330d6ec2b358" (UID: "475a52ba-bc8d-4c7b-ae99-330d6ec2b358"). InnerVolumeSpecName "kube-api-access-7rq8d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.491362    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f625a9e-a940-476b-85b2-ff54c5e87785-operator-scripts\") pod \"3f625a9e-a940-476b-85b2-ff54c5e87785\" (UID: \"3f625a9e-a940-476b-85b2-ff54c5e87785\") "
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.491466    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1-operator-scripts\") pod \"6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1\" (UID: \"6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1\") "
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.491614    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25qhp\" (UniqueName: \"kubernetes.io/projected/3f625a9e-a940-476b-85b2-ff54c5e87785-kube-api-access-25qhp\") pod \"3f625a9e-a940-476b-85b2-ff54c5e87785\" (UID: \"3f625a9e-a940-476b-85b2-ff54c5e87785\") "
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.491641    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkb5p\" (UniqueName: \"kubernetes.io/projected/6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1-kube-api-access-fkb5p\") pod \"6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1\" (UID: \"6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1\") "
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.492043    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f625a9e-a940-476b-85b2-ff54c5e87785-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3f625a9e-a940-476b-85b2-ff54c5e87785" (UID: "3f625a9e-a940-476b-85b2-ff54c5e87785"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.492043    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1" (UID: "6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.492137    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rq8d\" (UniqueName: \"kubernetes.io/projected/475a52ba-bc8d-4c7b-ae99-330d6ec2b358-kube-api-access-7rq8d\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.492155    4730 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/475a52ba-bc8d-4c7b-ae99-330d6ec2b358-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.502538    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1-kube-api-access-fkb5p" (OuterVolumeSpecName: "kube-api-access-fkb5p") pod "6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1" (UID: "6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1"). InnerVolumeSpecName "kube-api-access-fkb5p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.502635    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f625a9e-a940-476b-85b2-ff54c5e87785-kube-api-access-25qhp" (OuterVolumeSpecName: "kube-api-access-25qhp") pod "3f625a9e-a940-476b-85b2-ff54c5e87785" (UID: "3f625a9e-a940-476b-85b2-ff54c5e87785"). InnerVolumeSpecName "kube-api-access-25qhp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.593398    4730 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f625a9e-a940-476b-85b2-ff54c5e87785-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.593433    4730 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.593444    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25qhp\" (UniqueName: \"kubernetes.io/projected/3f625a9e-a940-476b-85b2-ff54c5e87785-kube-api-access-25qhp\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.593454    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkb5p\" (UniqueName: \"kubernetes.io/projected/6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1-kube-api-access-fkb5p\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:27 crc kubenswrapper[4730]: I0320 16:01:27.988432    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6858c8d8f6-k4smz"
Mar 20 16:01:28 crc kubenswrapper[4730]: I0320 16:01:28.102239    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4ed9fad7-284f-40b4-9c3b-7a213aff010a-config\") pod \"4ed9fad7-284f-40b4-9c3b-7a213aff010a\" (UID: \"4ed9fad7-284f-40b4-9c3b-7a213aff010a\") "
Mar 20 16:01:28 crc kubenswrapper[4730]: I0320 16:01:28.102323    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ed9fad7-284f-40b4-9c3b-7a213aff010a-ovndb-tls-certs\") pod \"4ed9fad7-284f-40b4-9c3b-7a213aff010a\" (UID: \"4ed9fad7-284f-40b4-9c3b-7a213aff010a\") "
Mar 20 16:01:28 crc kubenswrapper[4730]: I0320 16:01:28.102388    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97ld6\" (UniqueName: \"kubernetes.io/projected/4ed9fad7-284f-40b4-9c3b-7a213aff010a-kube-api-access-97ld6\") pod \"4ed9fad7-284f-40b4-9c3b-7a213aff010a\" (UID: \"4ed9fad7-284f-40b4-9c3b-7a213aff010a\") "
Mar 20 16:01:28 crc kubenswrapper[4730]: I0320 16:01:28.102513    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4ed9fad7-284f-40b4-9c3b-7a213aff010a-httpd-config\") pod \"4ed9fad7-284f-40b4-9c3b-7a213aff010a\" (UID: \"4ed9fad7-284f-40b4-9c3b-7a213aff010a\") "
Mar 20 16:01:28 crc kubenswrapper[4730]: I0320 16:01:28.102599    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ed9fad7-284f-40b4-9c3b-7a213aff010a-combined-ca-bundle\") pod \"4ed9fad7-284f-40b4-9c3b-7a213aff010a\" (UID: \"4ed9fad7-284f-40b4-9c3b-7a213aff010a\") "
Mar 20 16:01:28 crc kubenswrapper[4730]: I0320 16:01:28.107570    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ed9fad7-284f-40b4-9c3b-7a213aff010a-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "4ed9fad7-284f-40b4-9c3b-7a213aff010a" (UID: "4ed9fad7-284f-40b4-9c3b-7a213aff010a"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:01:28 crc kubenswrapper[4730]: I0320 16:01:28.109987    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ed9fad7-284f-40b4-9c3b-7a213aff010a-kube-api-access-97ld6" (OuterVolumeSpecName: "kube-api-access-97ld6") pod "4ed9fad7-284f-40b4-9c3b-7a213aff010a" (UID: "4ed9fad7-284f-40b4-9c3b-7a213aff010a"). InnerVolumeSpecName "kube-api-access-97ld6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:01:28 crc kubenswrapper[4730]: I0320 16:01:28.152557    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ed9fad7-284f-40b4-9c3b-7a213aff010a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ed9fad7-284f-40b4-9c3b-7a213aff010a" (UID: "4ed9fad7-284f-40b4-9c3b-7a213aff010a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:01:28 crc kubenswrapper[4730]: I0320 16:01:28.162524    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ed9fad7-284f-40b4-9c3b-7a213aff010a-config" (OuterVolumeSpecName: "config") pod "4ed9fad7-284f-40b4-9c3b-7a213aff010a" (UID: "4ed9fad7-284f-40b4-9c3b-7a213aff010a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:01:28 crc kubenswrapper[4730]: I0320 16:01:28.183239    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ed9fad7-284f-40b4-9c3b-7a213aff010a-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "4ed9fad7-284f-40b4-9c3b-7a213aff010a" (UID: "4ed9fad7-284f-40b4-9c3b-7a213aff010a"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:01:28 crc kubenswrapper[4730]: I0320 16:01:28.206419    4730 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4ed9fad7-284f-40b4-9c3b-7a213aff010a-httpd-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:28 crc kubenswrapper[4730]: I0320 16:01:28.206457    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ed9fad7-284f-40b4-9c3b-7a213aff010a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:28 crc kubenswrapper[4730]: I0320 16:01:28.206469    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4ed9fad7-284f-40b4-9c3b-7a213aff010a-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:28 crc kubenswrapper[4730]: I0320 16:01:28.206477    4730 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ed9fad7-284f-40b4-9c3b-7a213aff010a-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:28 crc kubenswrapper[4730]: I0320 16:01:28.206486    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97ld6\" (UniqueName: \"kubernetes.io/projected/4ed9fad7-284f-40b4-9c3b-7a213aff010a-kube-api-access-97ld6\") on node \"crc\" DevicePath \"\""
Mar 20 16:01:28 crc kubenswrapper[4730]: I0320 16:01:28.310530    4730 generic.go:334] "Generic (PLEG): container finished" podID="4ed9fad7-284f-40b4-9c3b-7a213aff010a" containerID="2eff1617c29a34da6021d776b5bc5c6695025819add4af253986837526af0f15" exitCode=0
Mar 20 16:01:28 crc kubenswrapper[4730]: I0320 16:01:28.310675    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6858c8d8f6-k4smz"
Mar 20 16:01:28 crc kubenswrapper[4730]: I0320 16:01:28.310875    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6858c8d8f6-k4smz" event={"ID":"4ed9fad7-284f-40b4-9c3b-7a213aff010a","Type":"ContainerDied","Data":"2eff1617c29a34da6021d776b5bc5c6695025819add4af253986837526af0f15"}
Mar 20 16:01:28 crc kubenswrapper[4730]: I0320 16:01:28.310921    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6858c8d8f6-k4smz" event={"ID":"4ed9fad7-284f-40b4-9c3b-7a213aff010a","Type":"ContainerDied","Data":"52473fa5ebf9585a5de42b7ba1a1ed907405640eabed30fdba7a035924a392d0"}
Mar 20 16:01:28 crc kubenswrapper[4730]: I0320 16:01:28.310941    4730 scope.go:117] "RemoveContainer" containerID="3635b09696560454a28e6d666babdb61696ccff059aecec39acea6122546c8aa"
Mar 20 16:01:28 crc kubenswrapper[4730]: I0320 16:01:28.313040    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-qt4mz"
Mar 20 16:01:28 crc kubenswrapper[4730]: I0320 16:01:28.314449    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"47ed5bd7-7aa8-4f16-98de-f09e21218ae6","Type":"ContainerStarted","Data":"ff63c9353e2d509add01dd682087a5bf19343746c25a249896e86f0c4fee5817"}
Mar 20 16:01:28 crc kubenswrapper[4730]: I0320 16:01:28.315045    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2850-account-create-update-4lrrq"
Mar 20 16:01:28 crc kubenswrapper[4730]: I0320 16:01:28.340712    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.340690172 podStartE2EDuration="5.340690172s" podCreationTimestamp="2026-03-20 16:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:01:28.335338582 +0000 UTC m=+1347.548709951" watchObservedRunningTime="2026-03-20 16:01:28.340690172 +0000 UTC m=+1347.554061541"
Mar 20 16:01:28 crc kubenswrapper[4730]: I0320 16:01:28.350221    4730 scope.go:117] "RemoveContainer" containerID="2eff1617c29a34da6021d776b5bc5c6695025819add4af253986837526af0f15"
Mar 20 16:01:28 crc kubenswrapper[4730]: I0320 16:01:28.372895    4730 scope.go:117] "RemoveContainer" containerID="3635b09696560454a28e6d666babdb61696ccff059aecec39acea6122546c8aa"
Mar 20 16:01:28 crc kubenswrapper[4730]: E0320 16:01:28.373347    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3635b09696560454a28e6d666babdb61696ccff059aecec39acea6122546c8aa\": container with ID starting with 3635b09696560454a28e6d666babdb61696ccff059aecec39acea6122546c8aa not found: ID does not exist" containerID="3635b09696560454a28e6d666babdb61696ccff059aecec39acea6122546c8aa"
Mar 20 16:01:28 crc kubenswrapper[4730]: I0320 16:01:28.373393    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3635b09696560454a28e6d666babdb61696ccff059aecec39acea6122546c8aa"} err="failed to get container status \"3635b09696560454a28e6d666babdb61696ccff059aecec39acea6122546c8aa\": rpc error: code = NotFound desc = could not find container \"3635b09696560454a28e6d666babdb61696ccff059aecec39acea6122546c8aa\": container with ID starting with 3635b09696560454a28e6d666babdb61696ccff059aecec39acea6122546c8aa not found: ID does not exist"
Mar 20 16:01:28 crc kubenswrapper[4730]: I0320 16:01:28.373422    4730 scope.go:117] "RemoveContainer" containerID="2eff1617c29a34da6021d776b5bc5c6695025819add4af253986837526af0f15"
Mar 20 16:01:28 crc kubenswrapper[4730]: E0320 16:01:28.373693    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2eff1617c29a34da6021d776b5bc5c6695025819add4af253986837526af0f15\": container with ID starting with 2eff1617c29a34da6021d776b5bc5c6695025819add4af253986837526af0f15 not found: ID does not exist" containerID="2eff1617c29a34da6021d776b5bc5c6695025819add4af253986837526af0f15"
Mar 20 16:01:28 crc kubenswrapper[4730]: I0320 16:01:28.373729    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eff1617c29a34da6021d776b5bc5c6695025819add4af253986837526af0f15"} err="failed to get container status \"2eff1617c29a34da6021d776b5bc5c6695025819add4af253986837526af0f15\": rpc error: code = NotFound desc = could not find container \"2eff1617c29a34da6021d776b5bc5c6695025819add4af253986837526af0f15\": container with ID starting with 2eff1617c29a34da6021d776b5bc5c6695025819add4af253986837526af0f15 not found: ID does not exist"
Mar 20 16:01:28 crc kubenswrapper[4730]: I0320 16:01:28.376912    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6858c8d8f6-k4smz"]
Mar 20 16:01:28 crc kubenswrapper[4730]: I0320 16:01:28.396339    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6858c8d8f6-k4smz"]
Mar 20 16:01:29 crc kubenswrapper[4730]: I0320 16:01:29.544154    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ed9fad7-284f-40b4-9c3b-7a213aff010a" path="/var/lib/kubelet/pods/4ed9fad7-284f-40b4-9c3b-7a213aff010a/volumes"
Mar 20 16:01:29 crc kubenswrapper[4730]: I0320 16:01:29.974442    4730 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Mar 20 16:01:29 crc kubenswrapper[4730]: I0320 16:01:29.974517    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0"
Mar 20 16:01:29 crc kubenswrapper[4730]: I0320 16:01:29.975324    4730 scope.go:117] "RemoveContainer" containerID="ca3ac5b513d25322badcc2bf19b245d687c9ccf8bff6c35cf5794c95ec2ab964"
Mar 20 16:01:29 crc kubenswrapper[4730]: E0320 16:01:29.975711    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(3f6c808e-d523-48bd-8ec2-28b625834317)\"" pod="openstack/watcher-decision-engine-0" podUID="3f6c808e-d523-48bd-8ec2-28b625834317"
Mar 20 16:01:31 crc kubenswrapper[4730]: I0320 16:01:31.985575    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qsrbd"]
Mar 20 16:01:31 crc kubenswrapper[4730]: E0320 16:01:31.987485    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7de61c5d-53ba-4d26-9a79-b82c2bc3b779" containerName="mariadb-account-create-update"
Mar 20 16:01:31 crc kubenswrapper[4730]: I0320 16:01:31.987599    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="7de61c5d-53ba-4d26-9a79-b82c2bc3b779" containerName="mariadb-account-create-update"
Mar 20 16:01:31 crc kubenswrapper[4730]: E0320 16:01:31.987676    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="383cf79a-0636-4175-bcf8-7e369f101901" containerName="mariadb-account-create-update"
Mar 20 16:01:31 crc kubenswrapper[4730]: I0320 16:01:31.987742    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="383cf79a-0636-4175-bcf8-7e369f101901" containerName="mariadb-account-create-update"
Mar 20 16:01:31 crc kubenswrapper[4730]: E0320 16:01:31.987804    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1" containerName="mariadb-database-create"
Mar 20 16:01:31 crc kubenswrapper[4730]: I0320 16:01:31.987863    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1" containerName="mariadb-database-create"
Mar 20 16:01:31 crc kubenswrapper[4730]: E0320 16:01:31.987931    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f625a9e-a940-476b-85b2-ff54c5e87785" containerName="mariadb-account-create-update"
Mar 20 16:01:31 crc kubenswrapper[4730]: I0320 16:01:31.988002    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f625a9e-a940-476b-85b2-ff54c5e87785" containerName="mariadb-account-create-update"
Mar 20 16:01:31 crc kubenswrapper[4730]: E0320 16:01:31.988092    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ed9fad7-284f-40b4-9c3b-7a213aff010a" containerName="neutron-api"
Mar 20 16:01:31 crc kubenswrapper[4730]: I0320 16:01:31.988162    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ed9fad7-284f-40b4-9c3b-7a213aff010a" containerName="neutron-api"
Mar 20 16:01:31 crc kubenswrapper[4730]: E0320 16:01:31.988275    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dac41622-7c80-4fce-a5ac-8a04d301669d" containerName="mariadb-database-create"
Mar 20 16:01:31 crc kubenswrapper[4730]: I0320 16:01:31.988367    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="dac41622-7c80-4fce-a5ac-8a04d301669d" containerName="mariadb-database-create"
Mar 20 16:01:31 crc kubenswrapper[4730]: E0320 16:01:31.988457    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ed9fad7-284f-40b4-9c3b-7a213aff010a" containerName="neutron-httpd"
Mar 20 16:01:31 crc kubenswrapper[4730]: I0320 16:01:31.988540    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ed9fad7-284f-40b4-9c3b-7a213aff010a" containerName="neutron-httpd"
Mar 20 16:01:31 crc kubenswrapper[4730]: E0320 16:01:31.988728    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="475a52ba-bc8d-4c7b-ae99-330d6ec2b358" containerName="mariadb-database-create"
Mar 20 16:01:31 crc kubenswrapper[4730]: I0320 16:01:31.988810    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="475a52ba-bc8d-4c7b-ae99-330d6ec2b358" containerName="mariadb-database-create"
Mar 20 16:01:31 crc kubenswrapper[4730]: I0320 16:01:31.989085    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1" containerName="mariadb-database-create"
Mar 20 16:01:31 crc kubenswrapper[4730]: I0320 16:01:31.989179    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="7de61c5d-53ba-4d26-9a79-b82c2bc3b779" containerName="mariadb-account-create-update"
Mar 20 16:01:31 crc kubenswrapper[4730]: I0320 16:01:31.989279    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="383cf79a-0636-4175-bcf8-7e369f101901" containerName="mariadb-account-create-update"
Mar 20 16:01:31 crc kubenswrapper[4730]: I0320 16:01:31.989358    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ed9fad7-284f-40b4-9c3b-7a213aff010a" containerName="neutron-httpd"
Mar 20 16:01:31 crc kubenswrapper[4730]: I0320 16:01:31.989448    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ed9fad7-284f-40b4-9c3b-7a213aff010a" containerName="neutron-api"
Mar 20 16:01:31 crc kubenswrapper[4730]: I0320 16:01:31.989516    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="475a52ba-bc8d-4c7b-ae99-330d6ec2b358" containerName="mariadb-database-create"
Mar 20 16:01:31 crc kubenswrapper[4730]: I0320 16:01:31.989587    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="dac41622-7c80-4fce-a5ac-8a04d301669d" containerName="mariadb-database-create"
Mar 20 16:01:31 crc kubenswrapper[4730]: I0320 16:01:31.989658    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f625a9e-a940-476b-85b2-ff54c5e87785" containerName="mariadb-account-create-update"
Mar 20 16:01:31 crc kubenswrapper[4730]: I0320 16:01:31.990574    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qsrbd"
Mar 20 16:01:31 crc kubenswrapper[4730]: I0320 16:01:31.993631    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Mar 20 16:01:31 crc kubenswrapper[4730]: I0320 16:01:31.993952    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-fmsk9"
Mar 20 16:01:31 crc kubenswrapper[4730]: I0320 16:01:31.994280    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 20 16:01:32 crc kubenswrapper[4730]: I0320 16:01:32.004124    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qsrbd"]
Mar 20 16:01:32 crc kubenswrapper[4730]: I0320 16:01:32.072807    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/279d2368-abe1-465a-9007-68542e5dbfc4-scripts\") pod \"nova-cell0-conductor-db-sync-qsrbd\" (UID: \"279d2368-abe1-465a-9007-68542e5dbfc4\") " pod="openstack/nova-cell0-conductor-db-sync-qsrbd"
Mar 20 16:01:32 crc kubenswrapper[4730]: I0320 16:01:32.072959    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/279d2368-abe1-465a-9007-68542e5dbfc4-config-data\") pod \"nova-cell0-conductor-db-sync-qsrbd\" (UID: \"279d2368-abe1-465a-9007-68542e5dbfc4\") " pod="openstack/nova-cell0-conductor-db-sync-qsrbd"
Mar 20 16:01:32 crc kubenswrapper[4730]: I0320 16:01:32.073000    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8swbj\" (UniqueName: \"kubernetes.io/projected/279d2368-abe1-465a-9007-68542e5dbfc4-kube-api-access-8swbj\") pod \"nova-cell0-conductor-db-sync-qsrbd\" (UID: \"279d2368-abe1-465a-9007-68542e5dbfc4\") " pod="openstack/nova-cell0-conductor-db-sync-qsrbd"
Mar 20 16:01:32 crc kubenswrapper[4730]: I0320 16:01:32.073045    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/279d2368-abe1-465a-9007-68542e5dbfc4-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qsrbd\" (UID: \"279d2368-abe1-465a-9007-68542e5dbfc4\") " pod="openstack/nova-cell0-conductor-db-sync-qsrbd"
Mar 20 16:01:32 crc kubenswrapper[4730]: I0320 16:01:32.174628    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/279d2368-abe1-465a-9007-68542e5dbfc4-scripts\") pod \"nova-cell0-conductor-db-sync-qsrbd\" (UID: \"279d2368-abe1-465a-9007-68542e5dbfc4\") " pod="openstack/nova-cell0-conductor-db-sync-qsrbd"
Mar 20 16:01:32 crc kubenswrapper[4730]: I0320 16:01:32.174882    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/279d2368-abe1-465a-9007-68542e5dbfc4-config-data\") pod \"nova-cell0-conductor-db-sync-qsrbd\" (UID: \"279d2368-abe1-465a-9007-68542e5dbfc4\") " pod="openstack/nova-cell0-conductor-db-sync-qsrbd"
Mar 20 16:01:32 crc kubenswrapper[4730]: I0320 16:01:32.174971    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8swbj\" (UniqueName: \"kubernetes.io/projected/279d2368-abe1-465a-9007-68542e5dbfc4-kube-api-access-8swbj\") pod \"nova-cell0-conductor-db-sync-qsrbd\" (UID: \"279d2368-abe1-465a-9007-68542e5dbfc4\") " pod="openstack/nova-cell0-conductor-db-sync-qsrbd"
Mar 20 16:01:32 crc kubenswrapper[4730]: I0320 16:01:32.175036    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/279d2368-abe1-465a-9007-68542e5dbfc4-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qsrbd\" (UID: \"279d2368-abe1-465a-9007-68542e5dbfc4\") " pod="openstack/nova-cell0-conductor-db-sync-qsrbd"
Mar 20 16:01:32 crc kubenswrapper[4730]: I0320 16:01:32.181406    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/279d2368-abe1-465a-9007-68542e5dbfc4-scripts\") pod \"nova-cell0-conductor-db-sync-qsrbd\" (UID: \"279d2368-abe1-465a-9007-68542e5dbfc4\") " pod="openstack/nova-cell0-conductor-db-sync-qsrbd"
Mar 20 16:01:32 crc kubenswrapper[4730]: I0320 16:01:32.182604    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/279d2368-abe1-465a-9007-68542e5dbfc4-config-data\") pod \"nova-cell0-conductor-db-sync-qsrbd\" (UID: \"279d2368-abe1-465a-9007-68542e5dbfc4\") " pod="openstack/nova-cell0-conductor-db-sync-qsrbd"
Mar 20 16:01:32 crc kubenswrapper[4730]: I0320 16:01:32.185753    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/279d2368-abe1-465a-9007-68542e5dbfc4-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qsrbd\" (UID: \"279d2368-abe1-465a-9007-68542e5dbfc4\") " pod="openstack/nova-cell0-conductor-db-sync-qsrbd"
Mar 20 16:01:32 crc kubenswrapper[4730]: I0320 16:01:32.194620    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8swbj\" (UniqueName: \"kubernetes.io/projected/279d2368-abe1-465a-9007-68542e5dbfc4-kube-api-access-8swbj\") pod \"nova-cell0-conductor-db-sync-qsrbd\" (UID: \"279d2368-abe1-465a-9007-68542e5dbfc4\") " pod="openstack/nova-cell0-conductor-db-sync-qsrbd"
Mar 20 16:01:32 crc kubenswrapper[4730]: I0320 16:01:32.276776    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf"
Mar 20 16:01:32 crc kubenswrapper[4730]: I0320 16:01:32.277849    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf"
Mar 20 16:01:32 crc kubenswrapper[4730]: I0320 16:01:32.307328    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qsrbd"
Mar 20 16:01:32 crc kubenswrapper[4730]: I0320 16:01:32.780290    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qsrbd"]
Mar 20 16:01:33 crc kubenswrapper[4730]: I0320 16:01:33.377085    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qsrbd" event={"ID":"279d2368-abe1-465a-9007-68542e5dbfc4","Type":"ContainerStarted","Data":"da2c214fcdb33fae608237a0fb1d01559481f0b67a4afa1e6c930298a64b75a2"}
Mar 20 16:01:33 crc kubenswrapper[4730]: I0320 16:01:33.517750    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:33 crc kubenswrapper[4730]: I0320 16:01:33.518147    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:33 crc kubenswrapper[4730]: I0320 16:01:33.557560    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:33 crc kubenswrapper[4730]: I0320 16:01:33.588656    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:34 crc kubenswrapper[4730]: I0320 16:01:34.390025    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:34 crc kubenswrapper[4730]: I0320 16:01:34.390069    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:34 crc kubenswrapper[4730]: I0320 16:01:34.510558    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 20 16:01:34 crc kubenswrapper[4730]: I0320 16:01:34.510623    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 20 16:01:34 crc kubenswrapper[4730]: I0320 16:01:34.545883    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 20 16:01:34 crc kubenswrapper[4730]: I0320 16:01:34.567567    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 20 16:01:35 crc kubenswrapper[4730]: I0320 16:01:35.399072    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 20 16:01:35 crc kubenswrapper[4730]: I0320 16:01:35.399108    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 20 16:01:36 crc kubenswrapper[4730]: I0320 16:01:36.544574    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:36 crc kubenswrapper[4730]: I0320 16:01:36.544675    4730 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 16:01:36 crc kubenswrapper[4730]: I0320 16:01:36.904282    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 20 16:01:37 crc kubenswrapper[4730]: I0320 16:01:37.825386    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 20 16:01:37 crc kubenswrapper[4730]: I0320 16:01:37.825719    4730 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 16:01:38 crc kubenswrapper[4730]: I0320 16:01:38.078909    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 20 16:01:42 crc kubenswrapper[4730]: I0320 16:01:42.879746    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:01:42 crc kubenswrapper[4730]: I0320 16:01:42.880378    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:01:42 crc kubenswrapper[4730]: I0320 16:01:42.880440    4730 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf"
Mar 20 16:01:42 crc kubenswrapper[4730]: I0320 16:01:42.881467    4730 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2aab75ddb2e10e731a7d582f69fae06a40e7e5a6270ff47496bdac5fb9c6ebfd"} pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 16:01:42 crc kubenswrapper[4730]: I0320 16:01:42.881648    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" containerID="cri-o://2aab75ddb2e10e731a7d582f69fae06a40e7e5a6270ff47496bdac5fb9c6ebfd" gracePeriod=600
Mar 20 16:01:43 crc kubenswrapper[4730]: I0320 16:01:43.495208    4730 generic.go:334] "Generic (PLEG): container finished" podID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerID="2aab75ddb2e10e731a7d582f69fae06a40e7e5a6270ff47496bdac5fb9c6ebfd" exitCode=0
Mar 20 16:01:43 crc kubenswrapper[4730]: I0320 16:01:43.495288    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerDied","Data":"2aab75ddb2e10e731a7d582f69fae06a40e7e5a6270ff47496bdac5fb9c6ebfd"}
Mar 20 16:01:43 crc kubenswrapper[4730]: I0320 16:01:43.495705    4730 scope.go:117] "RemoveContainer" containerID="5a28eadd1ac2eb334876364a020c16296d471cf45645c126a154825ac93c80d5"
Mar 20 16:01:44 crc kubenswrapper[4730]: I0320 16:01:44.507565    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qsrbd" event={"ID":"279d2368-abe1-465a-9007-68542e5dbfc4","Type":"ContainerStarted","Data":"79f3caf37f32c7308d415a20940a3f1cbb774116c3657a269e41ea28bde4ad32"}
Mar 20 16:01:44 crc kubenswrapper[4730]: I0320 16:01:44.510362    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerStarted","Data":"fb7cef3383bd559653e29a00e754a8c3366946a9ed7a655b7b70a7214aec8143"}
Mar 20 16:01:44 crc kubenswrapper[4730]: I0320 16:01:44.527065    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-qsrbd" podStartSLOduration=2.809549272 podStartE2EDuration="13.527042622s" podCreationTimestamp="2026-03-20 16:01:31 +0000 UTC" firstStartedPulling="2026-03-20 16:01:32.787695838 +0000 UTC m=+1352.001067207" lastFinishedPulling="2026-03-20 16:01:43.505189188 +0000 UTC m=+1362.718560557" observedRunningTime="2026-03-20 16:01:44.526099635 +0000 UTC m=+1363.739471004" watchObservedRunningTime="2026-03-20 16:01:44.527042622 +0000 UTC m=+1363.740414031"
Mar 20 16:01:44 crc kubenswrapper[4730]: I0320 16:01:44.533310    4730 scope.go:117] "RemoveContainer" containerID="ca3ac5b513d25322badcc2bf19b245d687c9ccf8bff6c35cf5794c95ec2ab964"
Mar 20 16:01:44 crc kubenswrapper[4730]: E0320 16:01:44.533886    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(3f6c808e-d523-48bd-8ec2-28b625834317)\"" pod="openstack/watcher-decision-engine-0" podUID="3f6c808e-d523-48bd-8ec2-28b625834317"
Mar 20 16:01:52 crc kubenswrapper[4730]: I0320 16:01:52.346962    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="fe109bf0-70d2-41d2-855c-6eb862e568b6" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.183:3000/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 20 16:01:53 crc kubenswrapper[4730]: I0320 16:01:53.629003    4730 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podfe109bf0-70d2-41d2-855c-6eb862e568b6"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podfe109bf0-70d2-41d2-855c-6eb862e568b6] : Timed out while waiting for systemd to remove kubepods-besteffort-podfe109bf0_70d2_41d2_855c_6eb862e568b6.slice"
Mar 20 16:01:53 crc kubenswrapper[4730]: E0320 16:01:53.629562    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podfe109bf0-70d2-41d2-855c-6eb862e568b6] : unable to destroy cgroup paths for cgroup [kubepods besteffort podfe109bf0-70d2-41d2-855c-6eb862e568b6] : Timed out while waiting for systemd to remove kubepods-besteffort-podfe109bf0_70d2_41d2_855c_6eb862e568b6.slice" pod="openstack/ceilometer-0" podUID="fe109bf0-70d2-41d2-855c-6eb862e568b6"
Mar 20 16:01:54 crc kubenswrapper[4730]: I0320 16:01:54.627738    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:01:54 crc kubenswrapper[4730]: I0320 16:01:54.690697    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:01:54 crc kubenswrapper[4730]: I0320 16:01:54.703393    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:01:54 crc kubenswrapper[4730]: I0320 16:01:54.709473    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:01:54 crc kubenswrapper[4730]: I0320 16:01:54.712188    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:01:54 crc kubenswrapper[4730]: I0320 16:01:54.720484    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 20 16:01:54 crc kubenswrapper[4730]: I0320 16:01:54.720743    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 20 16:01:54 crc kubenswrapper[4730]: I0320 16:01:54.724852    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:01:54 crc kubenswrapper[4730]: I0320 16:01:54.789883    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f03f19a6-9d2e-421e-9388-68ae49ae68ef-log-httpd\") pod \"ceilometer-0\" (UID: \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\") " pod="openstack/ceilometer-0"
Mar 20 16:01:54 crc kubenswrapper[4730]: I0320 16:01:54.790025    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f03f19a6-9d2e-421e-9388-68ae49ae68ef-config-data\") pod \"ceilometer-0\" (UID: \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\") " pod="openstack/ceilometer-0"
Mar 20 16:01:54 crc kubenswrapper[4730]: I0320 16:01:54.790063    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f03f19a6-9d2e-421e-9388-68ae49ae68ef-scripts\") pod \"ceilometer-0\" (UID: \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\") " pod="openstack/ceilometer-0"
Mar 20 16:01:54 crc kubenswrapper[4730]: I0320 16:01:54.790165    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2lkw\" (UniqueName: \"kubernetes.io/projected/f03f19a6-9d2e-421e-9388-68ae49ae68ef-kube-api-access-w2lkw\") pod \"ceilometer-0\" (UID: \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\") " pod="openstack/ceilometer-0"
Mar 20 16:01:54 crc kubenswrapper[4730]: I0320 16:01:54.790323    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f03f19a6-9d2e-421e-9388-68ae49ae68ef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\") " pod="openstack/ceilometer-0"
Mar 20 16:01:54 crc kubenswrapper[4730]: I0320 16:01:54.790373    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f03f19a6-9d2e-421e-9388-68ae49ae68ef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\") " pod="openstack/ceilometer-0"
Mar 20 16:01:54 crc kubenswrapper[4730]: I0320 16:01:54.790426    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f03f19a6-9d2e-421e-9388-68ae49ae68ef-run-httpd\") pod \"ceilometer-0\" (UID: \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\") " pod="openstack/ceilometer-0"
Mar 20 16:01:54 crc kubenswrapper[4730]: I0320 16:01:54.899949    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f03f19a6-9d2e-421e-9388-68ae49ae68ef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\") " pod="openstack/ceilometer-0"
Mar 20 16:01:54 crc kubenswrapper[4730]: I0320 16:01:54.900024    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f03f19a6-9d2e-421e-9388-68ae49ae68ef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\") " pod="openstack/ceilometer-0"
Mar 20 16:01:54 crc kubenswrapper[4730]: I0320 16:01:54.900080    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f03f19a6-9d2e-421e-9388-68ae49ae68ef-run-httpd\") pod \"ceilometer-0\" (UID: \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\") " pod="openstack/ceilometer-0"
Mar 20 16:01:54 crc kubenswrapper[4730]: I0320 16:01:54.900214    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f03f19a6-9d2e-421e-9388-68ae49ae68ef-log-httpd\") pod \"ceilometer-0\" (UID: \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\") " pod="openstack/ceilometer-0"
Mar 20 16:01:54 crc kubenswrapper[4730]: I0320 16:01:54.900330    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f03f19a6-9d2e-421e-9388-68ae49ae68ef-config-data\") pod \"ceilometer-0\" (UID: \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\") " pod="openstack/ceilometer-0"
Mar 20 16:01:54 crc kubenswrapper[4730]: I0320 16:01:54.900366    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f03f19a6-9d2e-421e-9388-68ae49ae68ef-scripts\") pod \"ceilometer-0\" (UID: \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\") " pod="openstack/ceilometer-0"
Mar 20 16:01:54 crc kubenswrapper[4730]: I0320 16:01:54.900452    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2lkw\" (UniqueName: \"kubernetes.io/projected/f03f19a6-9d2e-421e-9388-68ae49ae68ef-kube-api-access-w2lkw\") pod \"ceilometer-0\" (UID: \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\") " pod="openstack/ceilometer-0"
Mar 20 16:01:54 crc kubenswrapper[4730]: I0320 16:01:54.901473    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f03f19a6-9d2e-421e-9388-68ae49ae68ef-run-httpd\") pod \"ceilometer-0\" (UID: \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\") " pod="openstack/ceilometer-0"
Mar 20 16:01:54 crc kubenswrapper[4730]: I0320 16:01:54.901535    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f03f19a6-9d2e-421e-9388-68ae49ae68ef-log-httpd\") pod \"ceilometer-0\" (UID: \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\") " pod="openstack/ceilometer-0"
Mar 20 16:01:54 crc kubenswrapper[4730]: I0320 16:01:54.909545    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f03f19a6-9d2e-421e-9388-68ae49ae68ef-config-data\") pod \"ceilometer-0\" (UID: \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\") " pod="openstack/ceilometer-0"
Mar 20 16:01:54 crc kubenswrapper[4730]: I0320 16:01:54.913134    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f03f19a6-9d2e-421e-9388-68ae49ae68ef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\") " pod="openstack/ceilometer-0"
Mar 20 16:01:54 crc kubenswrapper[4730]: I0320 16:01:54.914729    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f03f19a6-9d2e-421e-9388-68ae49ae68ef-scripts\") pod \"ceilometer-0\" (UID: \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\") " pod="openstack/ceilometer-0"
Mar 20 16:01:54 crc kubenswrapper[4730]: I0320 16:01:54.916336    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f03f19a6-9d2e-421e-9388-68ae49ae68ef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\") " pod="openstack/ceilometer-0"
Mar 20 16:01:54 crc kubenswrapper[4730]: I0320 16:01:54.929020    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2lkw\" (UniqueName: \"kubernetes.io/projected/f03f19a6-9d2e-421e-9388-68ae49ae68ef-kube-api-access-w2lkw\") pod \"ceilometer-0\" (UID: \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\") " pod="openstack/ceilometer-0"
Mar 20 16:01:55 crc kubenswrapper[4730]: I0320 16:01:55.051683    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:01:55 crc kubenswrapper[4730]: I0320 16:01:55.533306    4730 scope.go:117] "RemoveContainer" containerID="ca3ac5b513d25322badcc2bf19b245d687c9ccf8bff6c35cf5794c95ec2ab964"
Mar 20 16:01:55 crc kubenswrapper[4730]: E0320 16:01:55.533997    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(3f6c808e-d523-48bd-8ec2-28b625834317)\"" pod="openstack/watcher-decision-engine-0" podUID="3f6c808e-d523-48bd-8ec2-28b625834317"
Mar 20 16:01:55 crc kubenswrapper[4730]: I0320 16:01:55.548536    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe109bf0-70d2-41d2-855c-6eb862e568b6" path="/var/lib/kubelet/pods/fe109bf0-70d2-41d2-855c-6eb862e568b6/volumes"
Mar 20 16:01:55 crc kubenswrapper[4730]: W0320 16:01:55.561339    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf03f19a6_9d2e_421e_9388_68ae49ae68ef.slice/crio-d3a4ea5bd7c7ce048ff6487734956147a2553ddcde8a7a07f3bf8e3324ca2fc8 WatchSource:0}: Error finding container d3a4ea5bd7c7ce048ff6487734956147a2553ddcde8a7a07f3bf8e3324ca2fc8: Status 404 returned error can't find the container with id d3a4ea5bd7c7ce048ff6487734956147a2553ddcde8a7a07f3bf8e3324ca2fc8
Mar 20 16:01:55 crc kubenswrapper[4730]: I0320 16:01:55.561350    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:01:55 crc kubenswrapper[4730]: I0320 16:01:55.642924    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f03f19a6-9d2e-421e-9388-68ae49ae68ef","Type":"ContainerStarted","Data":"d3a4ea5bd7c7ce048ff6487734956147a2553ddcde8a7a07f3bf8e3324ca2fc8"}
Mar 20 16:01:56 crc kubenswrapper[4730]: I0320 16:01:56.652452    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f03f19a6-9d2e-421e-9388-68ae49ae68ef","Type":"ContainerStarted","Data":"f0057f5dab402a46d249f303cdcb727d64def9624f515e382c4084ea494f079a"}
Mar 20 16:01:56 crc kubenswrapper[4730]: I0320 16:01:56.652851    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f03f19a6-9d2e-421e-9388-68ae49ae68ef","Type":"ContainerStarted","Data":"9dd9d5de632bf89de261876ec488aa9f35efb63310645d30071a551b02d1b18c"}
Mar 20 16:01:58 crc kubenswrapper[4730]: I0320 16:01:58.680060    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f03f19a6-9d2e-421e-9388-68ae49ae68ef","Type":"ContainerStarted","Data":"fcfdc390791242aab10df223e6f66346cede319e34a43195f06b469deaa0cf18"}
Mar 20 16:01:59 crc kubenswrapper[4730]: I0320 16:01:59.690129    4730 generic.go:334] "Generic (PLEG): container finished" podID="279d2368-abe1-465a-9007-68542e5dbfc4" containerID="79f3caf37f32c7308d415a20940a3f1cbb774116c3657a269e41ea28bde4ad32" exitCode=0
Mar 20 16:01:59 crc kubenswrapper[4730]: I0320 16:01:59.690205    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qsrbd" event={"ID":"279d2368-abe1-465a-9007-68542e5dbfc4","Type":"ContainerDied","Data":"79f3caf37f32c7308d415a20940a3f1cbb774116c3657a269e41ea28bde4ad32"}
Mar 20 16:02:00 crc kubenswrapper[4730]: I0320 16:02:00.140268    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567042-rngpc"]
Mar 20 16:02:00 crc kubenswrapper[4730]: I0320 16:02:00.141992    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567042-rngpc"
Mar 20 16:02:00 crc kubenswrapper[4730]: I0320 16:02:00.144291    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:02:00 crc kubenswrapper[4730]: I0320 16:02:00.144511    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:02:00 crc kubenswrapper[4730]: I0320 16:02:00.144682    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 16:02:00 crc kubenswrapper[4730]: I0320 16:02:00.156563    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567042-rngpc"]
Mar 20 16:02:00 crc kubenswrapper[4730]: I0320 16:02:00.304161    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2grlf\" (UniqueName: \"kubernetes.io/projected/8f1785c3-01b9-48cd-bfc9-c0fdb1c18455-kube-api-access-2grlf\") pod \"auto-csr-approver-29567042-rngpc\" (UID: \"8f1785c3-01b9-48cd-bfc9-c0fdb1c18455\") " pod="openshift-infra/auto-csr-approver-29567042-rngpc"
Mar 20 16:02:00 crc kubenswrapper[4730]: I0320 16:02:00.405708    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2grlf\" (UniqueName: \"kubernetes.io/projected/8f1785c3-01b9-48cd-bfc9-c0fdb1c18455-kube-api-access-2grlf\") pod \"auto-csr-approver-29567042-rngpc\" (UID: \"8f1785c3-01b9-48cd-bfc9-c0fdb1c18455\") " pod="openshift-infra/auto-csr-approver-29567042-rngpc"
Mar 20 16:02:00 crc kubenswrapper[4730]: I0320 16:02:00.441414    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2grlf\" (UniqueName: \"kubernetes.io/projected/8f1785c3-01b9-48cd-bfc9-c0fdb1c18455-kube-api-access-2grlf\") pod \"auto-csr-approver-29567042-rngpc\" (UID: \"8f1785c3-01b9-48cd-bfc9-c0fdb1c18455\") " pod="openshift-infra/auto-csr-approver-29567042-rngpc"
Mar 20 16:02:00 crc kubenswrapper[4730]: I0320 16:02:00.461022    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567042-rngpc"
Mar 20 16:02:00 crc kubenswrapper[4730]: I0320 16:02:00.708351    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f03f19a6-9d2e-421e-9388-68ae49ae68ef","Type":"ContainerStarted","Data":"3cdc313706dd8cba3d3a54bac2bfd0a19fc778679c99983f7df6459d6ad53f25"}
Mar 20 16:02:00 crc kubenswrapper[4730]: I0320 16:02:00.708683    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 20 16:02:00 crc kubenswrapper[4730]: I0320 16:02:00.740306    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.387666184 podStartE2EDuration="6.74027849s" podCreationTimestamp="2026-03-20 16:01:54 +0000 UTC" firstStartedPulling="2026-03-20 16:01:55.564375649 +0000 UTC m=+1374.777747018" lastFinishedPulling="2026-03-20 16:01:59.916987955 +0000 UTC m=+1379.130359324" observedRunningTime="2026-03-20 16:02:00.734078302 +0000 UTC m=+1379.947449671" watchObservedRunningTime="2026-03-20 16:02:00.74027849 +0000 UTC m=+1379.953649859"
Mar 20 16:02:00 crc kubenswrapper[4730]: I0320 16:02:00.981185    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567042-rngpc"]
Mar 20 16:02:00 crc kubenswrapper[4730]: W0320 16:02:00.985163    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f1785c3_01b9_48cd_bfc9_c0fdb1c18455.slice/crio-0195877c2e8f64939a13b8bd295cc860249fcf82aef822acc7e477581c695355 WatchSource:0}: Error finding container 0195877c2e8f64939a13b8bd295cc860249fcf82aef822acc7e477581c695355: Status 404 returned error can't find the container with id 0195877c2e8f64939a13b8bd295cc860249fcf82aef822acc7e477581c695355
Mar 20 16:02:01 crc kubenswrapper[4730]: I0320 16:02:01.074995    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qsrbd"
Mar 20 16:02:01 crc kubenswrapper[4730]: I0320 16:02:01.223784    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8swbj\" (UniqueName: \"kubernetes.io/projected/279d2368-abe1-465a-9007-68542e5dbfc4-kube-api-access-8swbj\") pod \"279d2368-abe1-465a-9007-68542e5dbfc4\" (UID: \"279d2368-abe1-465a-9007-68542e5dbfc4\") "
Mar 20 16:02:01 crc kubenswrapper[4730]: I0320 16:02:01.223864    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/279d2368-abe1-465a-9007-68542e5dbfc4-scripts\") pod \"279d2368-abe1-465a-9007-68542e5dbfc4\" (UID: \"279d2368-abe1-465a-9007-68542e5dbfc4\") "
Mar 20 16:02:01 crc kubenswrapper[4730]: I0320 16:02:01.223921    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/279d2368-abe1-465a-9007-68542e5dbfc4-combined-ca-bundle\") pod \"279d2368-abe1-465a-9007-68542e5dbfc4\" (UID: \"279d2368-abe1-465a-9007-68542e5dbfc4\") "
Mar 20 16:02:01 crc kubenswrapper[4730]: I0320 16:02:01.223962    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/279d2368-abe1-465a-9007-68542e5dbfc4-config-data\") pod \"279d2368-abe1-465a-9007-68542e5dbfc4\" (UID: \"279d2368-abe1-465a-9007-68542e5dbfc4\") "
Mar 20 16:02:01 crc kubenswrapper[4730]: I0320 16:02:01.243590    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/279d2368-abe1-465a-9007-68542e5dbfc4-kube-api-access-8swbj" (OuterVolumeSpecName: "kube-api-access-8swbj") pod "279d2368-abe1-465a-9007-68542e5dbfc4" (UID: "279d2368-abe1-465a-9007-68542e5dbfc4"). InnerVolumeSpecName "kube-api-access-8swbj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:02:01 crc kubenswrapper[4730]: I0320 16:02:01.259563    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/279d2368-abe1-465a-9007-68542e5dbfc4-scripts" (OuterVolumeSpecName: "scripts") pod "279d2368-abe1-465a-9007-68542e5dbfc4" (UID: "279d2368-abe1-465a-9007-68542e5dbfc4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:01 crc kubenswrapper[4730]: I0320 16:02:01.259681    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/279d2368-abe1-465a-9007-68542e5dbfc4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "279d2368-abe1-465a-9007-68542e5dbfc4" (UID: "279d2368-abe1-465a-9007-68542e5dbfc4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:01 crc kubenswrapper[4730]: I0320 16:02:01.263639    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/279d2368-abe1-465a-9007-68542e5dbfc4-config-data" (OuterVolumeSpecName: "config-data") pod "279d2368-abe1-465a-9007-68542e5dbfc4" (UID: "279d2368-abe1-465a-9007-68542e5dbfc4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:01 crc kubenswrapper[4730]: I0320 16:02:01.326544    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8swbj\" (UniqueName: \"kubernetes.io/projected/279d2368-abe1-465a-9007-68542e5dbfc4-kube-api-access-8swbj\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:01 crc kubenswrapper[4730]: I0320 16:02:01.326588    4730 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/279d2368-abe1-465a-9007-68542e5dbfc4-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:01 crc kubenswrapper[4730]: I0320 16:02:01.326602    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/279d2368-abe1-465a-9007-68542e5dbfc4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:01 crc kubenswrapper[4730]: I0320 16:02:01.326613    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/279d2368-abe1-465a-9007-68542e5dbfc4-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:01 crc kubenswrapper[4730]: I0320 16:02:01.716029    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567042-rngpc" event={"ID":"8f1785c3-01b9-48cd-bfc9-c0fdb1c18455","Type":"ContainerStarted","Data":"0195877c2e8f64939a13b8bd295cc860249fcf82aef822acc7e477581c695355"}
Mar 20 16:02:01 crc kubenswrapper[4730]: I0320 16:02:01.718048    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qsrbd" event={"ID":"279d2368-abe1-465a-9007-68542e5dbfc4","Type":"ContainerDied","Data":"da2c214fcdb33fae608237a0fb1d01559481f0b67a4afa1e6c930298a64b75a2"}
Mar 20 16:02:01 crc kubenswrapper[4730]: I0320 16:02:01.718076    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da2c214fcdb33fae608237a0fb1d01559481f0b67a4afa1e6c930298a64b75a2"
Mar 20 16:02:01 crc kubenswrapper[4730]: I0320 16:02:01.718094    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qsrbd"
Mar 20 16:02:01 crc kubenswrapper[4730]: I0320 16:02:01.799361    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 20 16:02:01 crc kubenswrapper[4730]: E0320 16:02:01.799750    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="279d2368-abe1-465a-9007-68542e5dbfc4" containerName="nova-cell0-conductor-db-sync"
Mar 20 16:02:01 crc kubenswrapper[4730]: I0320 16:02:01.799769    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="279d2368-abe1-465a-9007-68542e5dbfc4" containerName="nova-cell0-conductor-db-sync"
Mar 20 16:02:01 crc kubenswrapper[4730]: I0320 16:02:01.799992    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="279d2368-abe1-465a-9007-68542e5dbfc4" containerName="nova-cell0-conductor-db-sync"
Mar 20 16:02:01 crc kubenswrapper[4730]: I0320 16:02:01.800804    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 20 16:02:01 crc kubenswrapper[4730]: I0320 16:02:01.805018    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-fmsk9"
Mar 20 16:02:01 crc kubenswrapper[4730]: I0320 16:02:01.805189    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 20 16:02:01 crc kubenswrapper[4730]: I0320 16:02:01.809201    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 20 16:02:01 crc kubenswrapper[4730]: I0320 16:02:01.939403    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95rw8\" (UniqueName: \"kubernetes.io/projected/0940bcf4-b3ca-4f1d-92df-5fa9f477c800-kube-api-access-95rw8\") pod \"nova-cell0-conductor-0\" (UID: \"0940bcf4-b3ca-4f1d-92df-5fa9f477c800\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 16:02:01 crc kubenswrapper[4730]: I0320 16:02:01.939743    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0940bcf4-b3ca-4f1d-92df-5fa9f477c800-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"0940bcf4-b3ca-4f1d-92df-5fa9f477c800\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 16:02:01 crc kubenswrapper[4730]: I0320 16:02:01.939904    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0940bcf4-b3ca-4f1d-92df-5fa9f477c800-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"0940bcf4-b3ca-4f1d-92df-5fa9f477c800\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 16:02:02 crc kubenswrapper[4730]: I0320 16:02:02.041767    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95rw8\" (UniqueName: \"kubernetes.io/projected/0940bcf4-b3ca-4f1d-92df-5fa9f477c800-kube-api-access-95rw8\") pod \"nova-cell0-conductor-0\" (UID: \"0940bcf4-b3ca-4f1d-92df-5fa9f477c800\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 16:02:02 crc kubenswrapper[4730]: I0320 16:02:02.041899    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0940bcf4-b3ca-4f1d-92df-5fa9f477c800-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"0940bcf4-b3ca-4f1d-92df-5fa9f477c800\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 16:02:02 crc kubenswrapper[4730]: I0320 16:02:02.041935    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0940bcf4-b3ca-4f1d-92df-5fa9f477c800-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"0940bcf4-b3ca-4f1d-92df-5fa9f477c800\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 16:02:02 crc kubenswrapper[4730]: I0320 16:02:02.046212    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0940bcf4-b3ca-4f1d-92df-5fa9f477c800-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"0940bcf4-b3ca-4f1d-92df-5fa9f477c800\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 16:02:02 crc kubenswrapper[4730]: I0320 16:02:02.046855    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0940bcf4-b3ca-4f1d-92df-5fa9f477c800-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"0940bcf4-b3ca-4f1d-92df-5fa9f477c800\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 16:02:02 crc kubenswrapper[4730]: I0320 16:02:02.066223    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95rw8\" (UniqueName: \"kubernetes.io/projected/0940bcf4-b3ca-4f1d-92df-5fa9f477c800-kube-api-access-95rw8\") pod \"nova-cell0-conductor-0\" (UID: \"0940bcf4-b3ca-4f1d-92df-5fa9f477c800\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 16:02:02 crc kubenswrapper[4730]: I0320 16:02:02.123907    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 20 16:02:02 crc kubenswrapper[4730]: I0320 16:02:02.585767    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 20 16:02:02 crc kubenswrapper[4730]: I0320 16:02:02.729685    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"0940bcf4-b3ca-4f1d-92df-5fa9f477c800","Type":"ContainerStarted","Data":"b01ec4d0612406942ef5a166c52fecb4325a42f7709f92a49bd937099ba5ff8c"}
Mar 20 16:02:02 crc kubenswrapper[4730]: I0320 16:02:02.732885    4730 generic.go:334] "Generic (PLEG): container finished" podID="8f1785c3-01b9-48cd-bfc9-c0fdb1c18455" containerID="807d658a9e1c791073ad6dce59cf86eec477c7d4420c8a363f99c8986963ad00" exitCode=0
Mar 20 16:02:02 crc kubenswrapper[4730]: I0320 16:02:02.733648    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567042-rngpc" event={"ID":"8f1785c3-01b9-48cd-bfc9-c0fdb1c18455","Type":"ContainerDied","Data":"807d658a9e1c791073ad6dce59cf86eec477c7d4420c8a363f99c8986963ad00"}
Mar 20 16:02:03 crc kubenswrapper[4730]: I0320 16:02:03.656326    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:02:03 crc kubenswrapper[4730]: I0320 16:02:03.742970    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"0940bcf4-b3ca-4f1d-92df-5fa9f477c800","Type":"ContainerStarted","Data":"339ea55d2abfdc249e3164e1df0588dc76b8df8a4a83a8021b6600b6a5471598"}
Mar 20 16:02:03 crc kubenswrapper[4730]: I0320 16:02:03.743171    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f03f19a6-9d2e-421e-9388-68ae49ae68ef" containerName="ceilometer-central-agent" containerID="cri-o://9dd9d5de632bf89de261876ec488aa9f35efb63310645d30071a551b02d1b18c" gracePeriod=30
Mar 20 16:02:03 crc kubenswrapper[4730]: I0320 16:02:03.743261    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f03f19a6-9d2e-421e-9388-68ae49ae68ef" containerName="sg-core" containerID="cri-o://fcfdc390791242aab10df223e6f66346cede319e34a43195f06b469deaa0cf18" gracePeriod=30
Mar 20 16:02:03 crc kubenswrapper[4730]: I0320 16:02:03.743222    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f03f19a6-9d2e-421e-9388-68ae49ae68ef" containerName="proxy-httpd" containerID="cri-o://3cdc313706dd8cba3d3a54bac2bfd0a19fc778679c99983f7df6459d6ad53f25" gracePeriod=30
Mar 20 16:02:03 crc kubenswrapper[4730]: I0320 16:02:03.743301    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f03f19a6-9d2e-421e-9388-68ae49ae68ef" containerName="ceilometer-notification-agent" containerID="cri-o://f0057f5dab402a46d249f303cdcb727d64def9624f515e382c4084ea494f079a" gracePeriod=30
Mar 20 16:02:03 crc kubenswrapper[4730]: I0320 16:02:03.769002    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.76898248 podStartE2EDuration="2.76898248s" podCreationTimestamp="2026-03-20 16:02:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:02:03.764580184 +0000 UTC m=+1382.977951563" watchObservedRunningTime="2026-03-20 16:02:03.76898248 +0000 UTC m=+1382.982353849"
Mar 20 16:02:04 crc kubenswrapper[4730]: I0320 16:02:04.754678    4730 generic.go:334] "Generic (PLEG): container finished" podID="f03f19a6-9d2e-421e-9388-68ae49ae68ef" containerID="3cdc313706dd8cba3d3a54bac2bfd0a19fc778679c99983f7df6459d6ad53f25" exitCode=0
Mar 20 16:02:04 crc kubenswrapper[4730]: I0320 16:02:04.755150    4730 generic.go:334] "Generic (PLEG): container finished" podID="f03f19a6-9d2e-421e-9388-68ae49ae68ef" containerID="fcfdc390791242aab10df223e6f66346cede319e34a43195f06b469deaa0cf18" exitCode=2
Mar 20 16:02:04 crc kubenswrapper[4730]: I0320 16:02:04.754741    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f03f19a6-9d2e-421e-9388-68ae49ae68ef","Type":"ContainerDied","Data":"3cdc313706dd8cba3d3a54bac2bfd0a19fc778679c99983f7df6459d6ad53f25"}
Mar 20 16:02:04 crc kubenswrapper[4730]: I0320 16:02:04.755232    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f03f19a6-9d2e-421e-9388-68ae49ae68ef","Type":"ContainerDied","Data":"fcfdc390791242aab10df223e6f66346cede319e34a43195f06b469deaa0cf18"}
Mar 20 16:02:04 crc kubenswrapper[4730]: I0320 16:02:04.755334    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Mar 20 16:02:05 crc kubenswrapper[4730]: I0320 16:02:05.084637    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567042-rngpc"
Mar 20 16:02:05 crc kubenswrapper[4730]: I0320 16:02:05.213678    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2grlf\" (UniqueName: \"kubernetes.io/projected/8f1785c3-01b9-48cd-bfc9-c0fdb1c18455-kube-api-access-2grlf\") pod \"8f1785c3-01b9-48cd-bfc9-c0fdb1c18455\" (UID: \"8f1785c3-01b9-48cd-bfc9-c0fdb1c18455\") "
Mar 20 16:02:05 crc kubenswrapper[4730]: I0320 16:02:05.220099    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f1785c3-01b9-48cd-bfc9-c0fdb1c18455-kube-api-access-2grlf" (OuterVolumeSpecName: "kube-api-access-2grlf") pod "8f1785c3-01b9-48cd-bfc9-c0fdb1c18455" (UID: "8f1785c3-01b9-48cd-bfc9-c0fdb1c18455"). InnerVolumeSpecName "kube-api-access-2grlf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:02:05 crc kubenswrapper[4730]: I0320 16:02:05.316277    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2grlf\" (UniqueName: \"kubernetes.io/projected/8f1785c3-01b9-48cd-bfc9-c0fdb1c18455-kube-api-access-2grlf\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:05 crc kubenswrapper[4730]: I0320 16:02:05.764510    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567042-rngpc"
Mar 20 16:02:05 crc kubenswrapper[4730]: I0320 16:02:05.766074    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567042-rngpc" event={"ID":"8f1785c3-01b9-48cd-bfc9-c0fdb1c18455","Type":"ContainerDied","Data":"0195877c2e8f64939a13b8bd295cc860249fcf82aef822acc7e477581c695355"}
Mar 20 16:02:05 crc kubenswrapper[4730]: I0320 16:02:05.767309    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0195877c2e8f64939a13b8bd295cc860249fcf82aef822acc7e477581c695355"
Mar 20 16:02:05 crc kubenswrapper[4730]: I0320 16:02:05.771735    4730 generic.go:334] "Generic (PLEG): container finished" podID="f03f19a6-9d2e-421e-9388-68ae49ae68ef" containerID="f0057f5dab402a46d249f303cdcb727d64def9624f515e382c4084ea494f079a" exitCode=0
Mar 20 16:02:05 crc kubenswrapper[4730]: I0320 16:02:05.772382    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f03f19a6-9d2e-421e-9388-68ae49ae68ef","Type":"ContainerDied","Data":"f0057f5dab402a46d249f303cdcb727d64def9624f515e382c4084ea494f079a"}
Mar 20 16:02:06 crc kubenswrapper[4730]: I0320 16:02:06.146051    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567036-mnkd8"]
Mar 20 16:02:06 crc kubenswrapper[4730]: I0320 16:02:06.157098    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567036-mnkd8"]
Mar 20 16:02:07 crc kubenswrapper[4730]: I0320 16:02:07.154997    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Mar 20 16:02:07 crc kubenswrapper[4730]: I0320 16:02:07.545849    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="889da12d-843a-4c71-8d48-cbb0360b024a" path="/var/lib/kubelet/pods/889da12d-843a-4c71-8d48-cbb0360b024a/volumes"
Mar 20 16:02:07 crc kubenswrapper[4730]: I0320 16:02:07.860292    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-f7rjc"]
Mar 20 16:02:07 crc kubenswrapper[4730]: E0320 16:02:07.860987    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f1785c3-01b9-48cd-bfc9-c0fdb1c18455" containerName="oc"
Mar 20 16:02:07 crc kubenswrapper[4730]: I0320 16:02:07.861014    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f1785c3-01b9-48cd-bfc9-c0fdb1c18455" containerName="oc"
Mar 20 16:02:07 crc kubenswrapper[4730]: I0320 16:02:07.861317    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f1785c3-01b9-48cd-bfc9-c0fdb1c18455" containerName="oc"
Mar 20 16:02:07 crc kubenswrapper[4730]: I0320 16:02:07.862486    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-f7rjc"
Mar 20 16:02:07 crc kubenswrapper[4730]: I0320 16:02:07.864901    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Mar 20 16:02:07 crc kubenswrapper[4730]: I0320 16:02:07.875608    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-f7rjc"]
Mar 20 16:02:07 crc kubenswrapper[4730]: I0320 16:02:07.906724    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Mar 20 16:02:07 crc kubenswrapper[4730]: I0320 16:02:07.972571    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f144e50-8d18-49a5-a3ef-84b72e6e119f-scripts\") pod \"nova-cell0-cell-mapping-f7rjc\" (UID: \"8f144e50-8d18-49a5-a3ef-84b72e6e119f\") " pod="openstack/nova-cell0-cell-mapping-f7rjc"
Mar 20 16:02:07 crc kubenswrapper[4730]: I0320 16:02:07.972683    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f144e50-8d18-49a5-a3ef-84b72e6e119f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-f7rjc\" (UID: \"8f144e50-8d18-49a5-a3ef-84b72e6e119f\") " pod="openstack/nova-cell0-cell-mapping-f7rjc"
Mar 20 16:02:07 crc kubenswrapper[4730]: I0320 16:02:07.972768    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f144e50-8d18-49a5-a3ef-84b72e6e119f-config-data\") pod \"nova-cell0-cell-mapping-f7rjc\" (UID: \"8f144e50-8d18-49a5-a3ef-84b72e6e119f\") " pod="openstack/nova-cell0-cell-mapping-f7rjc"
Mar 20 16:02:07 crc kubenswrapper[4730]: I0320 16:02:07.972803    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dn84\" (UniqueName: \"kubernetes.io/projected/8f144e50-8d18-49a5-a3ef-84b72e6e119f-kube-api-access-4dn84\") pod \"nova-cell0-cell-mapping-f7rjc\" (UID: \"8f144e50-8d18-49a5-a3ef-84b72e6e119f\") " pod="openstack/nova-cell0-cell-mapping-f7rjc"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.075391    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f144e50-8d18-49a5-a3ef-84b72e6e119f-scripts\") pod \"nova-cell0-cell-mapping-f7rjc\" (UID: \"8f144e50-8d18-49a5-a3ef-84b72e6e119f\") " pod="openstack/nova-cell0-cell-mapping-f7rjc"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.075491    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f144e50-8d18-49a5-a3ef-84b72e6e119f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-f7rjc\" (UID: \"8f144e50-8d18-49a5-a3ef-84b72e6e119f\") " pod="openstack/nova-cell0-cell-mapping-f7rjc"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.075552    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f144e50-8d18-49a5-a3ef-84b72e6e119f-config-data\") pod \"nova-cell0-cell-mapping-f7rjc\" (UID: \"8f144e50-8d18-49a5-a3ef-84b72e6e119f\") " pod="openstack/nova-cell0-cell-mapping-f7rjc"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.075600    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dn84\" (UniqueName: \"kubernetes.io/projected/8f144e50-8d18-49a5-a3ef-84b72e6e119f-kube-api-access-4dn84\") pod \"nova-cell0-cell-mapping-f7rjc\" (UID: \"8f144e50-8d18-49a5-a3ef-84b72e6e119f\") " pod="openstack/nova-cell0-cell-mapping-f7rjc"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.082450    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f144e50-8d18-49a5-a3ef-84b72e6e119f-scripts\") pod \"nova-cell0-cell-mapping-f7rjc\" (UID: \"8f144e50-8d18-49a5-a3ef-84b72e6e119f\") " pod="openstack/nova-cell0-cell-mapping-f7rjc"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.083742    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f144e50-8d18-49a5-a3ef-84b72e6e119f-config-data\") pod \"nova-cell0-cell-mapping-f7rjc\" (UID: \"8f144e50-8d18-49a5-a3ef-84b72e6e119f\") " pod="openstack/nova-cell0-cell-mapping-f7rjc"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.084882    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f144e50-8d18-49a5-a3ef-84b72e6e119f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-f7rjc\" (UID: \"8f144e50-8d18-49a5-a3ef-84b72e6e119f\") " pod="openstack/nova-cell0-cell-mapping-f7rjc"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.102320    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.104136    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.108943    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dn84\" (UniqueName: \"kubernetes.io/projected/8f144e50-8d18-49a5-a3ef-84b72e6e119f-kube-api-access-4dn84\") pod \"nova-cell0-cell-mapping-f7rjc\" (UID: \"8f144e50-8d18-49a5-a3ef-84b72e6e119f\") " pod="openstack/nova-cell0-cell-mapping-f7rjc"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.109193    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.120878    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.122305    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.124772    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.135350    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.151404    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.228205    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-f7rjc"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.280232    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.283148    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.285525    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.289746    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlxxq\" (UniqueName: \"kubernetes.io/projected/01c93007-4d31-47a2-810c-819caf917e43-kube-api-access-nlxxq\") pod \"nova-scheduler-0\" (UID: \"01c93007-4d31-47a2-810c-819caf917e43\") " pod="openstack/nova-scheduler-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.289797    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/585eb246-c6cd-4641-a7fb-86d2ef87e31e-logs\") pod \"nova-api-0\" (UID: \"585eb246-c6cd-4641-a7fb-86d2ef87e31e\") " pod="openstack/nova-api-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.289879    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01c93007-4d31-47a2-810c-819caf917e43-config-data\") pod \"nova-scheduler-0\" (UID: \"01c93007-4d31-47a2-810c-819caf917e43\") " pod="openstack/nova-scheduler-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.289942    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/585eb246-c6cd-4641-a7fb-86d2ef87e31e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"585eb246-c6cd-4641-a7fb-86d2ef87e31e\") " pod="openstack/nova-api-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.290032    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/585eb246-c6cd-4641-a7fb-86d2ef87e31e-config-data\") pod \"nova-api-0\" (UID: \"585eb246-c6cd-4641-a7fb-86d2ef87e31e\") " pod="openstack/nova-api-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.290124    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01c93007-4d31-47a2-810c-819caf917e43-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"01c93007-4d31-47a2-810c-819caf917e43\") " pod="openstack/nova-scheduler-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.294716    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9r8b\" (UniqueName: \"kubernetes.io/projected/585eb246-c6cd-4641-a7fb-86d2ef87e31e-kube-api-access-l9r8b\") pod \"nova-api-0\" (UID: \"585eb246-c6cd-4641-a7fb-86d2ef87e31e\") " pod="openstack/nova-api-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.347343    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.397406    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlxxq\" (UniqueName: \"kubernetes.io/projected/01c93007-4d31-47a2-810c-819caf917e43-kube-api-access-nlxxq\") pod \"nova-scheduler-0\" (UID: \"01c93007-4d31-47a2-810c-819caf917e43\") " pod="openstack/nova-scheduler-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.397647    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/585eb246-c6cd-4641-a7fb-86d2ef87e31e-logs\") pod \"nova-api-0\" (UID: \"585eb246-c6cd-4641-a7fb-86d2ef87e31e\") " pod="openstack/nova-api-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.397686    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f1bcc8c-7598-4c25-aaa7-0a9636c0729c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f1bcc8c-7598-4c25-aaa7-0a9636c0729c\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.397712    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f1bcc8c-7598-4c25-aaa7-0a9636c0729c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f1bcc8c-7598-4c25-aaa7-0a9636c0729c\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.397740    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01c93007-4d31-47a2-810c-819caf917e43-config-data\") pod \"nova-scheduler-0\" (UID: \"01c93007-4d31-47a2-810c-819caf917e43\") " pod="openstack/nova-scheduler-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.397774    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/585eb246-c6cd-4641-a7fb-86d2ef87e31e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"585eb246-c6cd-4641-a7fb-86d2ef87e31e\") " pod="openstack/nova-api-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.397810    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/585eb246-c6cd-4641-a7fb-86d2ef87e31e-config-data\") pod \"nova-api-0\" (UID: \"585eb246-c6cd-4641-a7fb-86d2ef87e31e\") " pod="openstack/nova-api-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.397850    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01c93007-4d31-47a2-810c-819caf917e43-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"01c93007-4d31-47a2-810c-819caf917e43\") " pod="openstack/nova-scheduler-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.397874    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmvzz\" (UniqueName: \"kubernetes.io/projected/1f1bcc8c-7598-4c25-aaa7-0a9636c0729c-kube-api-access-zmvzz\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f1bcc8c-7598-4c25-aaa7-0a9636c0729c\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.397915    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9r8b\" (UniqueName: \"kubernetes.io/projected/585eb246-c6cd-4641-a7fb-86d2ef87e31e-kube-api-access-l9r8b\") pod \"nova-api-0\" (UID: \"585eb246-c6cd-4641-a7fb-86d2ef87e31e\") " pod="openstack/nova-api-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.403966    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/585eb246-c6cd-4641-a7fb-86d2ef87e31e-logs\") pod \"nova-api-0\" (UID: \"585eb246-c6cd-4641-a7fb-86d2ef87e31e\") " pod="openstack/nova-api-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.415432    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01c93007-4d31-47a2-810c-819caf917e43-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"01c93007-4d31-47a2-810c-819caf917e43\") " pod="openstack/nova-scheduler-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.416883    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/585eb246-c6cd-4641-a7fb-86d2ef87e31e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"585eb246-c6cd-4641-a7fb-86d2ef87e31e\") " pod="openstack/nova-api-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.424927    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/585eb246-c6cd-4641-a7fb-86d2ef87e31e-config-data\") pod \"nova-api-0\" (UID: \"585eb246-c6cd-4641-a7fb-86d2ef87e31e\") " pod="openstack/nova-api-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.426192    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01c93007-4d31-47a2-810c-819caf917e43-config-data\") pod \"nova-scheduler-0\" (UID: \"01c93007-4d31-47a2-810c-819caf917e43\") " pod="openstack/nova-scheduler-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.436965    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9r8b\" (UniqueName: \"kubernetes.io/projected/585eb246-c6cd-4641-a7fb-86d2ef87e31e-kube-api-access-l9r8b\") pod \"nova-api-0\" (UID: \"585eb246-c6cd-4641-a7fb-86d2ef87e31e\") " pod="openstack/nova-api-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.445302    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlxxq\" (UniqueName: \"kubernetes.io/projected/01c93007-4d31-47a2-810c-819caf917e43-kube-api-access-nlxxq\") pod \"nova-scheduler-0\" (UID: \"01c93007-4d31-47a2-810c-819caf917e43\") " pod="openstack/nova-scheduler-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.445312    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.447375    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.451197    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.462797    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.505615    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-654455944c-qph9q"]
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.508627    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-654455944c-qph9q"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.511123    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f1bcc8c-7598-4c25-aaa7-0a9636c0729c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f1bcc8c-7598-4c25-aaa7-0a9636c0729c\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.511175    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f1bcc8c-7598-4c25-aaa7-0a9636c0729c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f1bcc8c-7598-4c25-aaa7-0a9636c0729c\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.511369    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmvzz\" (UniqueName: \"kubernetes.io/projected/1f1bcc8c-7598-4c25-aaa7-0a9636c0729c-kube-api-access-zmvzz\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f1bcc8c-7598-4c25-aaa7-0a9636c0729c\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.520323    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f1bcc8c-7598-4c25-aaa7-0a9636c0729c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f1bcc8c-7598-4c25-aaa7-0a9636c0729c\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.534162    4730 scope.go:117] "RemoveContainer" containerID="ca3ac5b513d25322badcc2bf19b245d687c9ccf8bff6c35cf5794c95ec2ab964"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.534422    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmvzz\" (UniqueName: \"kubernetes.io/projected/1f1bcc8c-7598-4c25-aaa7-0a9636c0729c-kube-api-access-zmvzz\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f1bcc8c-7598-4c25-aaa7-0a9636c0729c\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.535895    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f1bcc8c-7598-4c25-aaa7-0a9636c0729c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1f1bcc8c-7598-4c25-aaa7-0a9636c0729c\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.540488    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.559367    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-654455944c-qph9q"]
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.563423    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.614287    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84937d37-8276-4014-b1ae-bb84547384af-ovsdbserver-sb\") pod \"dnsmasq-dns-654455944c-qph9q\" (UID: \"84937d37-8276-4014-b1ae-bb84547384af\") " pod="openstack/dnsmasq-dns-654455944c-qph9q"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.614462    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/84937d37-8276-4014-b1ae-bb84547384af-dns-swift-storage-0\") pod \"dnsmasq-dns-654455944c-qph9q\" (UID: \"84937d37-8276-4014-b1ae-bb84547384af\") " pod="openstack/dnsmasq-dns-654455944c-qph9q"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.614499    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwnnn\" (UniqueName: \"kubernetes.io/projected/84937d37-8276-4014-b1ae-bb84547384af-kube-api-access-fwnnn\") pod \"dnsmasq-dns-654455944c-qph9q\" (UID: \"84937d37-8276-4014-b1ae-bb84547384af\") " pod="openstack/dnsmasq-dns-654455944c-qph9q"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.614566    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32ac7762-8088-4313-a883-898b569b9154-logs\") pod \"nova-metadata-0\" (UID: \"32ac7762-8088-4313-a883-898b569b9154\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.614792    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84937d37-8276-4014-b1ae-bb84547384af-config\") pod \"dnsmasq-dns-654455944c-qph9q\" (UID: \"84937d37-8276-4014-b1ae-bb84547384af\") " pod="openstack/dnsmasq-dns-654455944c-qph9q"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.616331    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32ac7762-8088-4313-a883-898b569b9154-config-data\") pod \"nova-metadata-0\" (UID: \"32ac7762-8088-4313-a883-898b569b9154\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.616401    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84937d37-8276-4014-b1ae-bb84547384af-dns-svc\") pod \"dnsmasq-dns-654455944c-qph9q\" (UID: \"84937d37-8276-4014-b1ae-bb84547384af\") " pod="openstack/dnsmasq-dns-654455944c-qph9q"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.616525    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ac7762-8088-4313-a883-898b569b9154-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"32ac7762-8088-4313-a883-898b569b9154\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.616554    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88r6q\" (UniqueName: \"kubernetes.io/projected/32ac7762-8088-4313-a883-898b569b9154-kube-api-access-88r6q\") pod \"nova-metadata-0\" (UID: \"32ac7762-8088-4313-a883-898b569b9154\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.616753    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84937d37-8276-4014-b1ae-bb84547384af-ovsdbserver-nb\") pod \"dnsmasq-dns-654455944c-qph9q\" (UID: \"84937d37-8276-4014-b1ae-bb84547384af\") " pod="openstack/dnsmasq-dns-654455944c-qph9q"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.724202    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/84937d37-8276-4014-b1ae-bb84547384af-dns-swift-storage-0\") pod \"dnsmasq-dns-654455944c-qph9q\" (UID: \"84937d37-8276-4014-b1ae-bb84547384af\") " pod="openstack/dnsmasq-dns-654455944c-qph9q"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.724309    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwnnn\" (UniqueName: \"kubernetes.io/projected/84937d37-8276-4014-b1ae-bb84547384af-kube-api-access-fwnnn\") pod \"dnsmasq-dns-654455944c-qph9q\" (UID: \"84937d37-8276-4014-b1ae-bb84547384af\") " pod="openstack/dnsmasq-dns-654455944c-qph9q"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.724343    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32ac7762-8088-4313-a883-898b569b9154-logs\") pod \"nova-metadata-0\" (UID: \"32ac7762-8088-4313-a883-898b569b9154\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.724439    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84937d37-8276-4014-b1ae-bb84547384af-config\") pod \"dnsmasq-dns-654455944c-qph9q\" (UID: \"84937d37-8276-4014-b1ae-bb84547384af\") " pod="openstack/dnsmasq-dns-654455944c-qph9q"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.724474    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32ac7762-8088-4313-a883-898b569b9154-config-data\") pod \"nova-metadata-0\" (UID: \"32ac7762-8088-4313-a883-898b569b9154\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.724499    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84937d37-8276-4014-b1ae-bb84547384af-dns-svc\") pod \"dnsmasq-dns-654455944c-qph9q\" (UID: \"84937d37-8276-4014-b1ae-bb84547384af\") " pod="openstack/dnsmasq-dns-654455944c-qph9q"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.724572    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ac7762-8088-4313-a883-898b569b9154-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"32ac7762-8088-4313-a883-898b569b9154\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.724593    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88r6q\" (UniqueName: \"kubernetes.io/projected/32ac7762-8088-4313-a883-898b569b9154-kube-api-access-88r6q\") pod \"nova-metadata-0\" (UID: \"32ac7762-8088-4313-a883-898b569b9154\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.727426    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84937d37-8276-4014-b1ae-bb84547384af-ovsdbserver-nb\") pod \"dnsmasq-dns-654455944c-qph9q\" (UID: \"84937d37-8276-4014-b1ae-bb84547384af\") " pod="openstack/dnsmasq-dns-654455944c-qph9q"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.727525    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84937d37-8276-4014-b1ae-bb84547384af-ovsdbserver-sb\") pod \"dnsmasq-dns-654455944c-qph9q\" (UID: \"84937d37-8276-4014-b1ae-bb84547384af\") " pod="openstack/dnsmasq-dns-654455944c-qph9q"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.737855    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/84937d37-8276-4014-b1ae-bb84547384af-dns-swift-storage-0\") pod \"dnsmasq-dns-654455944c-qph9q\" (UID: \"84937d37-8276-4014-b1ae-bb84547384af\") " pod="openstack/dnsmasq-dns-654455944c-qph9q"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.738013    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ac7762-8088-4313-a883-898b569b9154-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"32ac7762-8088-4313-a883-898b569b9154\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.738771    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84937d37-8276-4014-b1ae-bb84547384af-dns-svc\") pod \"dnsmasq-dns-654455944c-qph9q\" (UID: \"84937d37-8276-4014-b1ae-bb84547384af\") " pod="openstack/dnsmasq-dns-654455944c-qph9q"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.739944    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84937d37-8276-4014-b1ae-bb84547384af-config\") pod \"dnsmasq-dns-654455944c-qph9q\" (UID: \"84937d37-8276-4014-b1ae-bb84547384af\") " pod="openstack/dnsmasq-dns-654455944c-qph9q"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.740047    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32ac7762-8088-4313-a883-898b569b9154-logs\") pod \"nova-metadata-0\" (UID: \"32ac7762-8088-4313-a883-898b569b9154\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.740792    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84937d37-8276-4014-b1ae-bb84547384af-ovsdbserver-nb\") pod \"dnsmasq-dns-654455944c-qph9q\" (UID: \"84937d37-8276-4014-b1ae-bb84547384af\") " pod="openstack/dnsmasq-dns-654455944c-qph9q"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.741989    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84937d37-8276-4014-b1ae-bb84547384af-ovsdbserver-sb\") pod \"dnsmasq-dns-654455944c-qph9q\" (UID: \"84937d37-8276-4014-b1ae-bb84547384af\") " pod="openstack/dnsmasq-dns-654455944c-qph9q"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.745166    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32ac7762-8088-4313-a883-898b569b9154-config-data\") pod \"nova-metadata-0\" (UID: \"32ac7762-8088-4313-a883-898b569b9154\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.746421    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88r6q\" (UniqueName: \"kubernetes.io/projected/32ac7762-8088-4313-a883-898b569b9154-kube-api-access-88r6q\") pod \"nova-metadata-0\" (UID: \"32ac7762-8088-4313-a883-898b569b9154\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.768882    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.772641    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwnnn\" (UniqueName: \"kubernetes.io/projected/84937d37-8276-4014-b1ae-bb84547384af-kube-api-access-fwnnn\") pod \"dnsmasq-dns-654455944c-qph9q\" (UID: \"84937d37-8276-4014-b1ae-bb84547384af\") " pod="openstack/dnsmasq-dns-654455944c-qph9q"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.801851    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 16:02:08 crc kubenswrapper[4730]: I0320 16:02:08.842573    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-654455944c-qph9q"
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.029507    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-f7rjc"]
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.179984    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.345056    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.443016    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cdhdz"]
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.445005    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-cdhdz"
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.447208    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.447791    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.454717    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cdhdz"]
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.512273    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-cdhdz\" (UID: \"ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647\") " pod="openstack/nova-cell1-conductor-db-sync-cdhdz"
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.512313    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnxln\" (UniqueName: \"kubernetes.io/projected/ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647-kube-api-access-jnxln\") pod \"nova-cell1-conductor-db-sync-cdhdz\" (UID: \"ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647\") " pod="openstack/nova-cell1-conductor-db-sync-cdhdz"
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.512343    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647-scripts\") pod \"nova-cell1-conductor-db-sync-cdhdz\" (UID: \"ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647\") " pod="openstack/nova-cell1-conductor-db-sync-cdhdz"
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.512452    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647-config-data\") pod \"nova-cell1-conductor-db-sync-cdhdz\" (UID: \"ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647\") " pod="openstack/nova-cell1-conductor-db-sync-cdhdz"
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.613961    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-cdhdz\" (UID: \"ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647\") " pod="openstack/nova-cell1-conductor-db-sync-cdhdz"
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.614230    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnxln\" (UniqueName: \"kubernetes.io/projected/ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647-kube-api-access-jnxln\") pod \"nova-cell1-conductor-db-sync-cdhdz\" (UID: \"ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647\") " pod="openstack/nova-cell1-conductor-db-sync-cdhdz"
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.614270    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647-scripts\") pod \"nova-cell1-conductor-db-sync-cdhdz\" (UID: \"ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647\") " pod="openstack/nova-cell1-conductor-db-sync-cdhdz"
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.614390    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647-config-data\") pod \"nova-cell1-conductor-db-sync-cdhdz\" (UID: \"ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647\") " pod="openstack/nova-cell1-conductor-db-sync-cdhdz"
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.620053    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.644215    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647-scripts\") pod \"nova-cell1-conductor-db-sync-cdhdz\" (UID: \"ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647\") " pod="openstack/nova-cell1-conductor-db-sync-cdhdz"
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.644751    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-cdhdz\" (UID: \"ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647\") " pod="openstack/nova-cell1-conductor-db-sync-cdhdz"
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.651721    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnxln\" (UniqueName: \"kubernetes.io/projected/ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647-kube-api-access-jnxln\") pod \"nova-cell1-conductor-db-sync-cdhdz\" (UID: \"ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647\") " pod="openstack/nova-cell1-conductor-db-sync-cdhdz"
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.651868    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647-config-data\") pod \"nova-cell1-conductor-db-sync-cdhdz\" (UID: \"ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647\") " pod="openstack/nova-cell1-conductor-db-sync-cdhdz"
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.692965    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.788786    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-cdhdz"
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.845704    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-654455944c-qph9q"]
Mar 20 16:02:09 crc kubenswrapper[4730]: W0320 16:02:09.856504    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84937d37_8276_4014_b1ae_bb84547384af.slice/crio-0a2eb8594326a606d45ae0159cba2be47a452aea3e7bbc81624368815408532b WatchSource:0}: Error finding container 0a2eb8594326a606d45ae0159cba2be47a452aea3e7bbc81624368815408532b: Status 404 returned error can't find the container with id 0a2eb8594326a606d45ae0159cba2be47a452aea3e7bbc81624368815408532b
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.856724    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1f1bcc8c-7598-4c25-aaa7-0a9636c0729c","Type":"ContainerStarted","Data":"bb467f0f9108cd2d4075fdbdf95b9f10f944692ab74865e527713ebadb5491f5"}
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.863370    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-f7rjc" event={"ID":"8f144e50-8d18-49a5-a3ef-84b72e6e119f","Type":"ContainerStarted","Data":"712419554ee7980049c60af4c6c43298daddafb72a74d5efc70ba46df50bba0e"}
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.863440    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-f7rjc" event={"ID":"8f144e50-8d18-49a5-a3ef-84b72e6e119f","Type":"ContainerStarted","Data":"e3b08dc7c4abd2dcaea34ec7c791ef1e1b1a8fa905aae37ea25e912f633e57a6"}
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.879954    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3f6c808e-d523-48bd-8ec2-28b625834317","Type":"ContainerStarted","Data":"ed4f7daeff4a5766b3f240bd0c76702ab4bff0db1848809b07121ff30e3a5bfc"}
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.902303    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"01c93007-4d31-47a2-810c-819caf917e43","Type":"ContainerStarted","Data":"3250edd54dccb12be75e701aadcaf4d148acb76b38a455ec3944d6f9f7d0f0e8"}
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.904794    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"585eb246-c6cd-4641-a7fb-86d2ef87e31e","Type":"ContainerStarted","Data":"b2e0e39bb6caaa7438ca18c26b9795ce5981a3e0bae88697a3af45f446cad74c"}
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.905554    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"32ac7762-8088-4313-a883-898b569b9154","Type":"ContainerStarted","Data":"e7a5d0df9d685e0f92bcdac057ecc8056efed293df571627fdc9a7cd3a4ee7ff"}
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.906908    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-f7rjc" podStartSLOduration=2.906880573 podStartE2EDuration="2.906880573s" podCreationTimestamp="2026-03-20 16:02:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:02:09.893650362 +0000 UTC m=+1389.107021741" watchObservedRunningTime="2026-03-20 16:02:09.906880573 +0000 UTC m=+1389.120251942"
Mar 20 16:02:09 crc kubenswrapper[4730]: I0320 16:02:09.985322    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Mar 20 16:02:10 crc kubenswrapper[4730]: I0320 16:02:10.166497    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0"
Mar 20 16:02:10 crc kubenswrapper[4730]: I0320 16:02:10.568274    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cdhdz"]
Mar 20 16:02:10 crc kubenswrapper[4730]: E0320 16:02:10.857599    4730 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod279d2368_abe1_465a_9007_68542e5dbfc4.slice/crio-da2c214fcdb33fae608237a0fb1d01559481f0b67a4afa1e6c930298a64b75a2\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod279d2368_abe1_465a_9007_68542e5dbfc4.slice\": RecentStats: unable to find data in memory cache]"
Mar 20 16:02:10 crc kubenswrapper[4730]: I0320 16:02:10.941542    4730 generic.go:334] "Generic (PLEG): container finished" podID="84937d37-8276-4014-b1ae-bb84547384af" containerID="a42a5cae6611c2fe3d170915f157931f092b64200d58d7b15c079992b8c3bd9a" exitCode=0
Mar 20 16:02:10 crc kubenswrapper[4730]: I0320 16:02:10.942131    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-654455944c-qph9q" event={"ID":"84937d37-8276-4014-b1ae-bb84547384af","Type":"ContainerDied","Data":"a42a5cae6611c2fe3d170915f157931f092b64200d58d7b15c079992b8c3bd9a"}
Mar 20 16:02:10 crc kubenswrapper[4730]: I0320 16:02:10.942195    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-654455944c-qph9q" event={"ID":"84937d37-8276-4014-b1ae-bb84547384af","Type":"ContainerStarted","Data":"0a2eb8594326a606d45ae0159cba2be47a452aea3e7bbc81624368815408532b"}
Mar 20 16:02:10 crc kubenswrapper[4730]: I0320 16:02:10.942523    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0"
Mar 20 16:02:10 crc kubenswrapper[4730]: I0320 16:02:10.986201    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0"
Mar 20 16:02:11 crc kubenswrapper[4730]: I0320 16:02:11.023068    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"]
Mar 20 16:02:11 crc kubenswrapper[4730]: I0320 16:02:11.957843    4730 generic.go:334] "Generic (PLEG): container finished" podID="f03f19a6-9d2e-421e-9388-68ae49ae68ef" containerID="9dd9d5de632bf89de261876ec488aa9f35efb63310645d30071a551b02d1b18c" exitCode=0
Mar 20 16:02:11 crc kubenswrapper[4730]: I0320 16:02:11.957917    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f03f19a6-9d2e-421e-9388-68ae49ae68ef","Type":"ContainerDied","Data":"9dd9d5de632bf89de261876ec488aa9f35efb63310645d30071a551b02d1b18c"}
Mar 20 16:02:12 crc kubenswrapper[4730]: I0320 16:02:12.966573    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="3f6c808e-d523-48bd-8ec2-28b625834317" containerName="watcher-decision-engine" containerID="cri-o://ed4f7daeff4a5766b3f240bd0c76702ab4bff0db1848809b07121ff30e3a5bfc" gracePeriod=30
Mar 20 16:02:13 crc kubenswrapper[4730]: W0320 16:02:13.226287    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddb7b5ed_16a0_4fb8_96d3_df3c9c2cd647.slice/crio-d9f3d982f9ee881581e688def58986a8c7c62dacb4325c11bdceadbe4b6ffa1a WatchSource:0}: Error finding container d9f3d982f9ee881581e688def58986a8c7c62dacb4325c11bdceadbe4b6ffa1a: Status 404 returned error can't find the container with id d9f3d982f9ee881581e688def58986a8c7c62dacb4325c11bdceadbe4b6ffa1a
Mar 20 16:02:13 crc kubenswrapper[4730]: I0320 16:02:13.416131    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 16:02:13 crc kubenswrapper[4730]: I0320 16:02:13.432409    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 16:02:13 crc kubenswrapper[4730]: I0320 16:02:13.728086    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:02:13 crc kubenswrapper[4730]: I0320 16:02:13.849639    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f03f19a6-9d2e-421e-9388-68ae49ae68ef-sg-core-conf-yaml\") pod \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\" (UID: \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\") "
Mar 20 16:02:13 crc kubenswrapper[4730]: I0320 16:02:13.849681    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f03f19a6-9d2e-421e-9388-68ae49ae68ef-run-httpd\") pod \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\" (UID: \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\") "
Mar 20 16:02:13 crc kubenswrapper[4730]: I0320 16:02:13.849728    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f03f19a6-9d2e-421e-9388-68ae49ae68ef-scripts\") pod \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\" (UID: \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\") "
Mar 20 16:02:13 crc kubenswrapper[4730]: I0320 16:02:13.849757    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f03f19a6-9d2e-421e-9388-68ae49ae68ef-log-httpd\") pod \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\" (UID: \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\") "
Mar 20 16:02:13 crc kubenswrapper[4730]: I0320 16:02:13.849779    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f03f19a6-9d2e-421e-9388-68ae49ae68ef-config-data\") pod \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\" (UID: \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\") "
Mar 20 16:02:13 crc kubenswrapper[4730]: I0320 16:02:13.849919    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f03f19a6-9d2e-421e-9388-68ae49ae68ef-combined-ca-bundle\") pod \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\" (UID: \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\") "
Mar 20 16:02:13 crc kubenswrapper[4730]: I0320 16:02:13.849950    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2lkw\" (UniqueName: \"kubernetes.io/projected/f03f19a6-9d2e-421e-9388-68ae49ae68ef-kube-api-access-w2lkw\") pod \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\" (UID: \"f03f19a6-9d2e-421e-9388-68ae49ae68ef\") "
Mar 20 16:02:13 crc kubenswrapper[4730]: I0320 16:02:13.851186    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f03f19a6-9d2e-421e-9388-68ae49ae68ef-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f03f19a6-9d2e-421e-9388-68ae49ae68ef" (UID: "f03f19a6-9d2e-421e-9388-68ae49ae68ef"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:02:13 crc kubenswrapper[4730]: I0320 16:02:13.851437    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f03f19a6-9d2e-421e-9388-68ae49ae68ef-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f03f19a6-9d2e-421e-9388-68ae49ae68ef" (UID: "f03f19a6-9d2e-421e-9388-68ae49ae68ef"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:02:13 crc kubenswrapper[4730]: I0320 16:02:13.861976    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f03f19a6-9d2e-421e-9388-68ae49ae68ef-scripts" (OuterVolumeSpecName: "scripts") pod "f03f19a6-9d2e-421e-9388-68ae49ae68ef" (UID: "f03f19a6-9d2e-421e-9388-68ae49ae68ef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:13 crc kubenswrapper[4730]: I0320 16:02:13.875570    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f03f19a6-9d2e-421e-9388-68ae49ae68ef-kube-api-access-w2lkw" (OuterVolumeSpecName: "kube-api-access-w2lkw") pod "f03f19a6-9d2e-421e-9388-68ae49ae68ef" (UID: "f03f19a6-9d2e-421e-9388-68ae49ae68ef"). InnerVolumeSpecName "kube-api-access-w2lkw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:02:13 crc kubenswrapper[4730]: I0320 16:02:13.953869    4730 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f03f19a6-9d2e-421e-9388-68ae49ae68ef-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:13 crc kubenswrapper[4730]: I0320 16:02:13.953899    4730 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f03f19a6-9d2e-421e-9388-68ae49ae68ef-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:13 crc kubenswrapper[4730]: I0320 16:02:13.953909    4730 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f03f19a6-9d2e-421e-9388-68ae49ae68ef-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:13 crc kubenswrapper[4730]: I0320 16:02:13.953917    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2lkw\" (UniqueName: \"kubernetes.io/projected/f03f19a6-9d2e-421e-9388-68ae49ae68ef-kube-api-access-w2lkw\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:13 crc kubenswrapper[4730]: I0320 16:02:13.984163    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f03f19a6-9d2e-421e-9388-68ae49ae68ef-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f03f19a6-9d2e-421e-9388-68ae49ae68ef" (UID: "f03f19a6-9d2e-421e-9388-68ae49ae68ef"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.000337    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-654455944c-qph9q" event={"ID":"84937d37-8276-4014-b1ae-bb84547384af","Type":"ContainerStarted","Data":"61e03f987a4e7a22fe8153ac2d6c60c19cb72977de582edcb281b2f18aeef951"}
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.000482    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-654455944c-qph9q"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.004024    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"01c93007-4d31-47a2-810c-819caf917e43","Type":"ContainerStarted","Data":"68063e072de62260e0731b15d4d132f06b3322eede18b910d69b064464dffc8d"}
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.008017    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-cdhdz" event={"ID":"ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647","Type":"ContainerStarted","Data":"78840c5174380df1ec37853cf5867820b6c6e093e27294e16eb3795a80e9c2a8"}
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.008065    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-cdhdz" event={"ID":"ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647","Type":"ContainerStarted","Data":"d9f3d982f9ee881581e688def58986a8c7c62dacb4325c11bdceadbe4b6ffa1a"}
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.010193    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"585eb246-c6cd-4641-a7fb-86d2ef87e31e","Type":"ContainerStarted","Data":"605dab74894b941f8dc47934657056d0f993430ffbd34d8c7cb723a73fb546d4"}
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.029434    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-654455944c-qph9q" podStartSLOduration=6.029321542 podStartE2EDuration="6.029321542s" podCreationTimestamp="2026-03-20 16:02:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:02:14.028455877 +0000 UTC m=+1393.241827246" watchObservedRunningTime="2026-03-20 16:02:14.029321542 +0000 UTC m=+1393.242692911"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.042678    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.043341    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f03f19a6-9d2e-421e-9388-68ae49ae68ef","Type":"ContainerDied","Data":"d3a4ea5bd7c7ce048ff6487734956147a2553ddcde8a7a07f3bf8e3324ca2fc8"}
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.043385    4730 scope.go:117] "RemoveContainer" containerID="3cdc313706dd8cba3d3a54bac2bfd0a19fc778679c99983f7df6459d6ad53f25"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.055734    4730 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f03f19a6-9d2e-421e-9388-68ae49ae68ef-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.058026    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.953331898 podStartE2EDuration="6.058002876s" podCreationTimestamp="2026-03-20 16:02:08 +0000 UTC" firstStartedPulling="2026-03-20 16:02:09.344915449 +0000 UTC m=+1388.558286818" lastFinishedPulling="2026-03-20 16:02:13.449586427 +0000 UTC m=+1392.662957796" observedRunningTime="2026-03-20 16:02:14.052720454 +0000 UTC m=+1393.266091823" watchObservedRunningTime="2026-03-20 16:02:14.058002876 +0000 UTC m=+1393.271374245"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.061914    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"32ac7762-8088-4313-a883-898b569b9154","Type":"ContainerStarted","Data":"9979114478a3a955b42e7b3859791b120a8c87e81eba7cafe092f17caecff80d"}
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.083797    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-cdhdz" podStartSLOduration=5.083778077 podStartE2EDuration="5.083778077s" podCreationTimestamp="2026-03-20 16:02:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:02:14.071738161 +0000 UTC m=+1393.285109530" watchObservedRunningTime="2026-03-20 16:02:14.083778077 +0000 UTC m=+1393.297149446"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.087034    4730 scope.go:117] "RemoveContainer" containerID="fcfdc390791242aab10df223e6f66346cede319e34a43195f06b469deaa0cf18"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.105263    4730 scope.go:117] "RemoveContainer" containerID="f0057f5dab402a46d249f303cdcb727d64def9624f515e382c4084ea494f079a"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.163969    4730 scope.go:117] "RemoveContainer" containerID="9dd9d5de632bf89de261876ec488aa9f35efb63310645d30071a551b02d1b18c"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.200661    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f03f19a6-9d2e-421e-9388-68ae49ae68ef-config-data" (OuterVolumeSpecName: "config-data") pod "f03f19a6-9d2e-421e-9388-68ae49ae68ef" (UID: "f03f19a6-9d2e-421e-9388-68ae49ae68ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.259785    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f03f19a6-9d2e-421e-9388-68ae49ae68ef-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.285343    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f03f19a6-9d2e-421e-9388-68ae49ae68ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f03f19a6-9d2e-421e-9388-68ae49ae68ef" (UID: "f03f19a6-9d2e-421e-9388-68ae49ae68ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.361130    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f03f19a6-9d2e-421e-9388-68ae49ae68ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.428011    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.439983    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.470777    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:02:14 crc kubenswrapper[4730]: E0320 16:02:14.471363    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f03f19a6-9d2e-421e-9388-68ae49ae68ef" containerName="proxy-httpd"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.471389    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="f03f19a6-9d2e-421e-9388-68ae49ae68ef" containerName="proxy-httpd"
Mar 20 16:02:14 crc kubenswrapper[4730]: E0320 16:02:14.471409    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f03f19a6-9d2e-421e-9388-68ae49ae68ef" containerName="sg-core"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.471418    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="f03f19a6-9d2e-421e-9388-68ae49ae68ef" containerName="sg-core"
Mar 20 16:02:14 crc kubenswrapper[4730]: E0320 16:02:14.471450    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f03f19a6-9d2e-421e-9388-68ae49ae68ef" containerName="ceilometer-central-agent"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.471460    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="f03f19a6-9d2e-421e-9388-68ae49ae68ef" containerName="ceilometer-central-agent"
Mar 20 16:02:14 crc kubenswrapper[4730]: E0320 16:02:14.471476    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f03f19a6-9d2e-421e-9388-68ae49ae68ef" containerName="ceilometer-notification-agent"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.471485    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="f03f19a6-9d2e-421e-9388-68ae49ae68ef" containerName="ceilometer-notification-agent"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.471882    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="f03f19a6-9d2e-421e-9388-68ae49ae68ef" containerName="ceilometer-notification-agent"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.471945    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="f03f19a6-9d2e-421e-9388-68ae49ae68ef" containerName="proxy-httpd"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.471962    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="f03f19a6-9d2e-421e-9388-68ae49ae68ef" containerName="ceilometer-central-agent"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.471978    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="f03f19a6-9d2e-421e-9388-68ae49ae68ef" containerName="sg-core"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.474082    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.483729    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.488421    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.495521    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.585155    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c192a384-369a-4011-bbd0-af10cf958010-log-httpd\") pod \"ceilometer-0\" (UID: \"c192a384-369a-4011-bbd0-af10cf958010\") " pod="openstack/ceilometer-0"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.585226    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c192a384-369a-4011-bbd0-af10cf958010-scripts\") pod \"ceilometer-0\" (UID: \"c192a384-369a-4011-bbd0-af10cf958010\") " pod="openstack/ceilometer-0"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.585353    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c192a384-369a-4011-bbd0-af10cf958010-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c192a384-369a-4011-bbd0-af10cf958010\") " pod="openstack/ceilometer-0"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.585402    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c192a384-369a-4011-bbd0-af10cf958010-config-data\") pod \"ceilometer-0\" (UID: \"c192a384-369a-4011-bbd0-af10cf958010\") " pod="openstack/ceilometer-0"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.585478    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c192a384-369a-4011-bbd0-af10cf958010-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c192a384-369a-4011-bbd0-af10cf958010\") " pod="openstack/ceilometer-0"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.585512    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c192a384-369a-4011-bbd0-af10cf958010-run-httpd\") pod \"ceilometer-0\" (UID: \"c192a384-369a-4011-bbd0-af10cf958010\") " pod="openstack/ceilometer-0"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.585553    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tkk7\" (UniqueName: \"kubernetes.io/projected/c192a384-369a-4011-bbd0-af10cf958010-kube-api-access-2tkk7\") pod \"ceilometer-0\" (UID: \"c192a384-369a-4011-bbd0-af10cf958010\") " pod="openstack/ceilometer-0"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.688998    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c192a384-369a-4011-bbd0-af10cf958010-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c192a384-369a-4011-bbd0-af10cf958010\") " pod="openstack/ceilometer-0"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.689072    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c192a384-369a-4011-bbd0-af10cf958010-run-httpd\") pod \"ceilometer-0\" (UID: \"c192a384-369a-4011-bbd0-af10cf958010\") " pod="openstack/ceilometer-0"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.689131    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tkk7\" (UniqueName: \"kubernetes.io/projected/c192a384-369a-4011-bbd0-af10cf958010-kube-api-access-2tkk7\") pod \"ceilometer-0\" (UID: \"c192a384-369a-4011-bbd0-af10cf958010\") " pod="openstack/ceilometer-0"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.689189    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c192a384-369a-4011-bbd0-af10cf958010-log-httpd\") pod \"ceilometer-0\" (UID: \"c192a384-369a-4011-bbd0-af10cf958010\") " pod="openstack/ceilometer-0"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.689226    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c192a384-369a-4011-bbd0-af10cf958010-scripts\") pod \"ceilometer-0\" (UID: \"c192a384-369a-4011-bbd0-af10cf958010\") " pod="openstack/ceilometer-0"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.689410    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c192a384-369a-4011-bbd0-af10cf958010-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c192a384-369a-4011-bbd0-af10cf958010\") " pod="openstack/ceilometer-0"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.689471    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c192a384-369a-4011-bbd0-af10cf958010-config-data\") pod \"ceilometer-0\" (UID: \"c192a384-369a-4011-bbd0-af10cf958010\") " pod="openstack/ceilometer-0"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.690081    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c192a384-369a-4011-bbd0-af10cf958010-run-httpd\") pod \"ceilometer-0\" (UID: \"c192a384-369a-4011-bbd0-af10cf958010\") " pod="openstack/ceilometer-0"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.691196    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c192a384-369a-4011-bbd0-af10cf958010-log-httpd\") pod \"ceilometer-0\" (UID: \"c192a384-369a-4011-bbd0-af10cf958010\") " pod="openstack/ceilometer-0"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.694957    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c192a384-369a-4011-bbd0-af10cf958010-scripts\") pod \"ceilometer-0\" (UID: \"c192a384-369a-4011-bbd0-af10cf958010\") " pod="openstack/ceilometer-0"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.695832    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c192a384-369a-4011-bbd0-af10cf958010-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c192a384-369a-4011-bbd0-af10cf958010\") " pod="openstack/ceilometer-0"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.696062    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c192a384-369a-4011-bbd0-af10cf958010-config-data\") pod \"ceilometer-0\" (UID: \"c192a384-369a-4011-bbd0-af10cf958010\") " pod="openstack/ceilometer-0"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.698714    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c192a384-369a-4011-bbd0-af10cf958010-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c192a384-369a-4011-bbd0-af10cf958010\") " pod="openstack/ceilometer-0"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.740349    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tkk7\" (UniqueName: \"kubernetes.io/projected/c192a384-369a-4011-bbd0-af10cf958010-kube-api-access-2tkk7\") pod \"ceilometer-0\" (UID: \"c192a384-369a-4011-bbd0-af10cf958010\") " pod="openstack/ceilometer-0"
Mar 20 16:02:14 crc kubenswrapper[4730]: I0320 16:02:14.793446    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:02:15 crc kubenswrapper[4730]: I0320 16:02:15.072771    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"585eb246-c6cd-4641-a7fb-86d2ef87e31e","Type":"ContainerStarted","Data":"077bbf75eaae181c0e1146dfa3d57949104bdbba5a29e133956ce977816562e9"}
Mar 20 16:02:15 crc kubenswrapper[4730]: I0320 16:02:15.077609    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"32ac7762-8088-4313-a883-898b569b9154","Type":"ContainerStarted","Data":"c1159b8c566cff8b809d385b30d15b0c6eaab3536998d984b8f4ce6cd7a26c46"}
Mar 20 16:02:15 crc kubenswrapper[4730]: I0320 16:02:15.077751    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="32ac7762-8088-4313-a883-898b569b9154" containerName="nova-metadata-log" containerID="cri-o://9979114478a3a955b42e7b3859791b120a8c87e81eba7cafe092f17caecff80d" gracePeriod=30
Mar 20 16:02:15 crc kubenswrapper[4730]: I0320 16:02:15.078014    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="32ac7762-8088-4313-a883-898b569b9154" containerName="nova-metadata-metadata" containerID="cri-o://c1159b8c566cff8b809d385b30d15b0c6eaab3536998d984b8f4ce6cd7a26c46" gracePeriod=30
Mar 20 16:02:15 crc kubenswrapper[4730]: I0320 16:02:15.087397    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1f1bcc8c-7598-4c25-aaa7-0a9636c0729c","Type":"ContainerStarted","Data":"c51b7b65896e9d86f7a00a27c41da014a980db7c3cc6f6c5953fd8afdde0f897"}
Mar 20 16:02:15 crc kubenswrapper[4730]: I0320 16:02:15.090263    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="1f1bcc8c-7598-4c25-aaa7-0a9636c0729c" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://c51b7b65896e9d86f7a00a27c41da014a980db7c3cc6f6c5953fd8afdde0f897" gracePeriod=30
Mar 20 16:02:15 crc kubenswrapper[4730]: I0320 16:02:15.105363    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.934548693 podStartE2EDuration="7.105339282s" podCreationTimestamp="2026-03-20 16:02:08 +0000 UTC" firstStartedPulling="2026-03-20 16:02:09.282276578 +0000 UTC m=+1388.495647947" lastFinishedPulling="2026-03-20 16:02:13.453067167 +0000 UTC m=+1392.666438536" observedRunningTime="2026-03-20 16:02:15.098930778 +0000 UTC m=+1394.312302147" watchObservedRunningTime="2026-03-20 16:02:15.105339282 +0000 UTC m=+1394.318710651"
Mar 20 16:02:15 crc kubenswrapper[4730]: I0320 16:02:15.154225    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.376051404 podStartE2EDuration="7.154202607s" podCreationTimestamp="2026-03-20 16:02:08 +0000 UTC" firstStartedPulling="2026-03-20 16:02:09.707599184 +0000 UTC m=+1388.920970553" lastFinishedPulling="2026-03-20 16:02:13.485750387 +0000 UTC m=+1392.699121756" observedRunningTime="2026-03-20 16:02:15.122975319 +0000 UTC m=+1394.336346688" watchObservedRunningTime="2026-03-20 16:02:15.154202607 +0000 UTC m=+1394.367573976"
Mar 20 16:02:15 crc kubenswrapper[4730]: I0320 16:02:15.172663    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.338036492 podStartE2EDuration="7.172644427s" podCreationTimestamp="2026-03-20 16:02:08 +0000 UTC" firstStartedPulling="2026-03-20 16:02:09.64655863 +0000 UTC m=+1388.859929999" lastFinishedPulling="2026-03-20 16:02:13.481166565 +0000 UTC m=+1392.694537934" observedRunningTime="2026-03-20 16:02:15.141949394 +0000 UTC m=+1394.355320763" watchObservedRunningTime="2026-03-20 16:02:15.172644427 +0000 UTC m=+1394.386015796"
Mar 20 16:02:15 crc kubenswrapper[4730]: I0320 16:02:15.375699    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:02:15 crc kubenswrapper[4730]: I0320 16:02:15.544533    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f03f19a6-9d2e-421e-9388-68ae49ae68ef" path="/var/lib/kubelet/pods/f03f19a6-9d2e-421e-9388-68ae49ae68ef/volumes"
Mar 20 16:02:15 crc kubenswrapper[4730]: I0320 16:02:15.840690    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"]
Mar 20 16:02:15 crc kubenswrapper[4730]: I0320 16:02:15.841155    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-applier-0" podUID="5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0" containerName="watcher-applier" containerID="cri-o://06e4300ad2aa0cb045945f92073482ca78016e812eb685af49fb195ce150c681" gracePeriod=30
Mar 20 16:02:15 crc kubenswrapper[4730]: I0320 16:02:15.933050    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"]
Mar 20 16:02:15 crc kubenswrapper[4730]: I0320 16:02:15.933326    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a" containerName="watcher-api-log" containerID="cri-o://c743230ba78e3e74fccce83315455e43e5c471c2e21f3d4ac91af747ed0b8301" gracePeriod=30
Mar 20 16:02:15 crc kubenswrapper[4730]: I0320 16:02:15.933814    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a" containerName="watcher-api" containerID="cri-o://cdc802e9a5716d718ebb13c2b13971164edec56068203d16169206897ba03b6a" gracePeriod=30
Mar 20 16:02:16 crc kubenswrapper[4730]: I0320 16:02:16.136819    4730 generic.go:334] "Generic (PLEG): container finished" podID="32ac7762-8088-4313-a883-898b569b9154" containerID="c1159b8c566cff8b809d385b30d15b0c6eaab3536998d984b8f4ce6cd7a26c46" exitCode=0
Mar 20 16:02:16 crc kubenswrapper[4730]: I0320 16:02:16.136863    4730 generic.go:334] "Generic (PLEG): container finished" podID="32ac7762-8088-4313-a883-898b569b9154" containerID="9979114478a3a955b42e7b3859791b120a8c87e81eba7cafe092f17caecff80d" exitCode=143
Mar 20 16:02:16 crc kubenswrapper[4730]: I0320 16:02:16.136897    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"32ac7762-8088-4313-a883-898b569b9154","Type":"ContainerDied","Data":"c1159b8c566cff8b809d385b30d15b0c6eaab3536998d984b8f4ce6cd7a26c46"}
Mar 20 16:02:16 crc kubenswrapper[4730]: I0320 16:02:16.136931    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"32ac7762-8088-4313-a883-898b569b9154","Type":"ContainerDied","Data":"9979114478a3a955b42e7b3859791b120a8c87e81eba7cafe092f17caecff80d"}
Mar 20 16:02:16 crc kubenswrapper[4730]: I0320 16:02:16.140917    4730 generic.go:334] "Generic (PLEG): container finished" podID="3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a" containerID="c743230ba78e3e74fccce83315455e43e5c471c2e21f3d4ac91af747ed0b8301" exitCode=143
Mar 20 16:02:16 crc kubenswrapper[4730]: I0320 16:02:16.140971    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a","Type":"ContainerDied","Data":"c743230ba78e3e74fccce83315455e43e5c471c2e21f3d4ac91af747ed0b8301"}
Mar 20 16:02:16 crc kubenswrapper[4730]: I0320 16:02:16.160794    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c192a384-369a-4011-bbd0-af10cf958010","Type":"ContainerStarted","Data":"92ca70d767554dd7f5a3c2261603595e47200f286bd84d37815a7187cc55a125"}
Mar 20 16:02:16 crc kubenswrapper[4730]: I0320 16:02:16.160881    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c192a384-369a-4011-bbd0-af10cf958010","Type":"ContainerStarted","Data":"fc7185d335545d2bc3730a5a21059470bf6bc2bcae0a41d7ca4c2027aa011079"}
Mar 20 16:02:16 crc kubenswrapper[4730]: I0320 16:02:16.188584    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 16:02:16 crc kubenswrapper[4730]: I0320 16:02:16.225797    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ac7762-8088-4313-a883-898b569b9154-combined-ca-bundle\") pod \"32ac7762-8088-4313-a883-898b569b9154\" (UID: \"32ac7762-8088-4313-a883-898b569b9154\") "
Mar 20 16:02:16 crc kubenswrapper[4730]: I0320 16:02:16.225863    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88r6q\" (UniqueName: \"kubernetes.io/projected/32ac7762-8088-4313-a883-898b569b9154-kube-api-access-88r6q\") pod \"32ac7762-8088-4313-a883-898b569b9154\" (UID: \"32ac7762-8088-4313-a883-898b569b9154\") "
Mar 20 16:02:16 crc kubenswrapper[4730]: I0320 16:02:16.225943    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32ac7762-8088-4313-a883-898b569b9154-config-data\") pod \"32ac7762-8088-4313-a883-898b569b9154\" (UID: \"32ac7762-8088-4313-a883-898b569b9154\") "
Mar 20 16:02:16 crc kubenswrapper[4730]: I0320 16:02:16.226134    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32ac7762-8088-4313-a883-898b569b9154-logs\") pod \"32ac7762-8088-4313-a883-898b569b9154\" (UID: \"32ac7762-8088-4313-a883-898b569b9154\") "
Mar 20 16:02:16 crc kubenswrapper[4730]: I0320 16:02:16.226920    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32ac7762-8088-4313-a883-898b569b9154-logs" (OuterVolumeSpecName: "logs") pod "32ac7762-8088-4313-a883-898b569b9154" (UID: "32ac7762-8088-4313-a883-898b569b9154"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:02:16 crc kubenswrapper[4730]: I0320 16:02:16.234858    4730 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32ac7762-8088-4313-a883-898b569b9154-logs\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:16 crc kubenswrapper[4730]: I0320 16:02:16.238609    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32ac7762-8088-4313-a883-898b569b9154-kube-api-access-88r6q" (OuterVolumeSpecName: "kube-api-access-88r6q") pod "32ac7762-8088-4313-a883-898b569b9154" (UID: "32ac7762-8088-4313-a883-898b569b9154"). InnerVolumeSpecName "kube-api-access-88r6q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:02:16 crc kubenswrapper[4730]: I0320 16:02:16.266502    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32ac7762-8088-4313-a883-898b569b9154-config-data" (OuterVolumeSpecName: "config-data") pod "32ac7762-8088-4313-a883-898b569b9154" (UID: "32ac7762-8088-4313-a883-898b569b9154"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:16 crc kubenswrapper[4730]: I0320 16:02:16.269624    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32ac7762-8088-4313-a883-898b569b9154-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32ac7762-8088-4313-a883-898b569b9154" (UID: "32ac7762-8088-4313-a883-898b569b9154"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:16 crc kubenswrapper[4730]: I0320 16:02:16.344187    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32ac7762-8088-4313-a883-898b569b9154-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:16 crc kubenswrapper[4730]: I0320 16:02:16.344228    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ac7762-8088-4313-a883-898b569b9154-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:16 crc kubenswrapper[4730]: I0320 16:02:16.344261    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88r6q\" (UniqueName: \"kubernetes.io/projected/32ac7762-8088-4313-a883-898b569b9154-kube-api-access-88r6q\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.173849    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"32ac7762-8088-4313-a883-898b569b9154","Type":"ContainerDied","Data":"e7a5d0df9d685e0f92bcdac057ecc8056efed293df571627fdc9a7cd3a4ee7ff"}
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.174276    4730 scope.go:117] "RemoveContainer" containerID="c1159b8c566cff8b809d385b30d15b0c6eaab3536998d984b8f4ce6cd7a26c46"
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.173870    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.176178    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c192a384-369a-4011-bbd0-af10cf958010","Type":"ContainerStarted","Data":"3ed2c729cfba6a845986581f09c8deca053ba44823152224340ca15c6f3b1018"}
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.285572    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.308288    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.318851    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 16:02:17 crc kubenswrapper[4730]: E0320 16:02:17.319455    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32ac7762-8088-4313-a883-898b569b9154" containerName="nova-metadata-metadata"
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.319474    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="32ac7762-8088-4313-a883-898b569b9154" containerName="nova-metadata-metadata"
Mar 20 16:02:17 crc kubenswrapper[4730]: E0320 16:02:17.319493    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32ac7762-8088-4313-a883-898b569b9154" containerName="nova-metadata-log"
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.319501    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="32ac7762-8088-4313-a883-898b569b9154" containerName="nova-metadata-log"
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.319700    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="32ac7762-8088-4313-a883-898b569b9154" containerName="nova-metadata-log"
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.319722    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="32ac7762-8088-4313-a883-898b569b9154" containerName="nova-metadata-metadata"
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.321274    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.323770    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.323778    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.332316    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.340062    4730 scope.go:117] "RemoveContainer" containerID="9979114478a3a955b42e7b3859791b120a8c87e81eba7cafe092f17caecff80d"
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.373989    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ctt8\" (UniqueName: \"kubernetes.io/projected/23319f08-4294-49cc-bb24-7a01520e37c6-kube-api-access-7ctt8\") pod \"nova-metadata-0\" (UID: \"23319f08-4294-49cc-bb24-7a01520e37c6\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.374063    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23319f08-4294-49cc-bb24-7a01520e37c6-logs\") pod \"nova-metadata-0\" (UID: \"23319f08-4294-49cc-bb24-7a01520e37c6\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.374444    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/23319f08-4294-49cc-bb24-7a01520e37c6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"23319f08-4294-49cc-bb24-7a01520e37c6\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.374528    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23319f08-4294-49cc-bb24-7a01520e37c6-config-data\") pod \"nova-metadata-0\" (UID: \"23319f08-4294-49cc-bb24-7a01520e37c6\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.374611    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23319f08-4294-49cc-bb24-7a01520e37c6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"23319f08-4294-49cc-bb24-7a01520e37c6\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.476236    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/23319f08-4294-49cc-bb24-7a01520e37c6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"23319f08-4294-49cc-bb24-7a01520e37c6\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.477151    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23319f08-4294-49cc-bb24-7a01520e37c6-config-data\") pod \"nova-metadata-0\" (UID: \"23319f08-4294-49cc-bb24-7a01520e37c6\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.477195    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23319f08-4294-49cc-bb24-7a01520e37c6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"23319f08-4294-49cc-bb24-7a01520e37c6\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.477233    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ctt8\" (UniqueName: \"kubernetes.io/projected/23319f08-4294-49cc-bb24-7a01520e37c6-kube-api-access-7ctt8\") pod \"nova-metadata-0\" (UID: \"23319f08-4294-49cc-bb24-7a01520e37c6\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.477273    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23319f08-4294-49cc-bb24-7a01520e37c6-logs\") pod \"nova-metadata-0\" (UID: \"23319f08-4294-49cc-bb24-7a01520e37c6\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.477761    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23319f08-4294-49cc-bb24-7a01520e37c6-logs\") pod \"nova-metadata-0\" (UID: \"23319f08-4294-49cc-bb24-7a01520e37c6\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.482437    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/23319f08-4294-49cc-bb24-7a01520e37c6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"23319f08-4294-49cc-bb24-7a01520e37c6\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.487155    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23319f08-4294-49cc-bb24-7a01520e37c6-config-data\") pod \"nova-metadata-0\" (UID: \"23319f08-4294-49cc-bb24-7a01520e37c6\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.505847    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23319f08-4294-49cc-bb24-7a01520e37c6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"23319f08-4294-49cc-bb24-7a01520e37c6\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.525092    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ctt8\" (UniqueName: \"kubernetes.io/projected/23319f08-4294-49cc-bb24-7a01520e37c6-kube-api-access-7ctt8\") pod \"nova-metadata-0\" (UID: \"23319f08-4294-49cc-bb24-7a01520e37c6\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.548058    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32ac7762-8088-4313-a883-898b569b9154" path="/var/lib/kubelet/pods/32ac7762-8088-4313-a883-898b569b9154/volumes"
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.648879    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.760889    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.886088    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-public-tls-certs\") pod \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\" (UID: \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\") "
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.886399    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fk5wd\" (UniqueName: \"kubernetes.io/projected/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-kube-api-access-fk5wd\") pod \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\" (UID: \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\") "
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.886435    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-config-data\") pod \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\" (UID: \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\") "
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.886509    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-combined-ca-bundle\") pod \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\" (UID: \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\") "
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.886553    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-internal-tls-certs\") pod \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\" (UID: \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\") "
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.886679    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-logs\") pod \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\" (UID: \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\") "
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.886771    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-custom-prometheus-ca\") pod \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\" (UID: \"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a\") "
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.892560    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-logs" (OuterVolumeSpecName: "logs") pod "3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a" (UID: "3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.904847    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-kube-api-access-fk5wd" (OuterVolumeSpecName: "kube-api-access-fk5wd") pod "3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a" (UID: "3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a"). InnerVolumeSpecName "kube-api-access-fk5wd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.920105    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a" (UID: "3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.951372    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a" (UID: "3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.957429    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a" (UID: "3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.972024    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a" (UID: "3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.982695    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-config-data" (OuterVolumeSpecName: "config-data") pod "3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a" (UID: "3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.989447    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.989479    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.989492    4730 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.989503    4730 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-logs\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.989513    4730 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-custom-prometheus-ca\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.989524    4730 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:17 crc kubenswrapper[4730]: I0320 16:02:17.989534    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fk5wd\" (UniqueName: \"kubernetes.io/projected/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a-kube-api-access-fk5wd\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.187415    4730 generic.go:334] "Generic (PLEG): container finished" podID="3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a" containerID="cdc802e9a5716d718ebb13c2b13971164edec56068203d16169206897ba03b6a" exitCode=0
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.187464    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a","Type":"ContainerDied","Data":"cdc802e9a5716d718ebb13c2b13971164edec56068203d16169206897ba03b6a"}
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.187494    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.187530    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a","Type":"ContainerDied","Data":"4c2c6a630c73d61868e537787c0ffc30dd54b3a6ece7e73f02c494b3afd3c924"}
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.187553    4730 scope.go:117] "RemoveContainer" containerID="cdc802e9a5716d718ebb13c2b13971164edec56068203d16169206897ba03b6a"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.191487    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c192a384-369a-4011-bbd0-af10cf958010","Type":"ContainerStarted","Data":"67e70a210d20ab3fb6eb013ac35769c758e3dbe555ae4869a1f308e1ce50d12f"}
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.213660    4730 scope.go:117] "RemoveContainer" containerID="c743230ba78e3e74fccce83315455e43e5c471c2e21f3d4ac91af747ed0b8301"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.233401    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.262481    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"]
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.274735    4730 scope.go:117] "RemoveContainer" containerID="cdc802e9a5716d718ebb13c2b13971164edec56068203d16169206897ba03b6a"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.276279    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"]
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.291795    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"]
Mar 20 16:02:18 crc kubenswrapper[4730]: E0320 16:02:18.292695    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a" containerName="watcher-api-log"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.292720    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a" containerName="watcher-api-log"
Mar 20 16:02:18 crc kubenswrapper[4730]: E0320 16:02:18.292769    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a" containerName="watcher-api"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.292781    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a" containerName="watcher-api"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.293003    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a" containerName="watcher-api"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.293028    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a" containerName="watcher-api-log"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.294564    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.298183    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.298418    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.298545    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc"
Mar 20 16:02:18 crc kubenswrapper[4730]: E0320 16:02:18.305176    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdc802e9a5716d718ebb13c2b13971164edec56068203d16169206897ba03b6a\": container with ID starting with cdc802e9a5716d718ebb13c2b13971164edec56068203d16169206897ba03b6a not found: ID does not exist" containerID="cdc802e9a5716d718ebb13c2b13971164edec56068203d16169206897ba03b6a"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.305403    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdc802e9a5716d718ebb13c2b13971164edec56068203d16169206897ba03b6a"} err="failed to get container status \"cdc802e9a5716d718ebb13c2b13971164edec56068203d16169206897ba03b6a\": rpc error: code = NotFound desc = could not find container \"cdc802e9a5716d718ebb13c2b13971164edec56068203d16169206897ba03b6a\": container with ID starting with cdc802e9a5716d718ebb13c2b13971164edec56068203d16169206897ba03b6a not found: ID does not exist"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.305432    4730 scope.go:117] "RemoveContainer" containerID="c743230ba78e3e74fccce83315455e43e5c471c2e21f3d4ac91af747ed0b8301"
Mar 20 16:02:18 crc kubenswrapper[4730]: E0320 16:02:18.309062    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c743230ba78e3e74fccce83315455e43e5c471c2e21f3d4ac91af747ed0b8301\": container with ID starting with c743230ba78e3e74fccce83315455e43e5c471c2e21f3d4ac91af747ed0b8301 not found: ID does not exist" containerID="c743230ba78e3e74fccce83315455e43e5c471c2e21f3d4ac91af747ed0b8301"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.309097    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c743230ba78e3e74fccce83315455e43e5c471c2e21f3d4ac91af747ed0b8301"} err="failed to get container status \"c743230ba78e3e74fccce83315455e43e5c471c2e21f3d4ac91af747ed0b8301\": rpc error: code = NotFound desc = could not find container \"c743230ba78e3e74fccce83315455e43e5c471c2e21f3d4ac91af747ed0b8301\": container with ID starting with c743230ba78e3e74fccce83315455e43e5c471c2e21f3d4ac91af747ed0b8301 not found: ID does not exist"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.310320    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.399488    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba310e23-3097-4114-8628-4e7ada94eac6-config-data\") pod \"watcher-api-0\" (UID: \"ba310e23-3097-4114-8628-4e7ada94eac6\") " pod="openstack/watcher-api-0"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.399561    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba310e23-3097-4114-8628-4e7ada94eac6-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"ba310e23-3097-4114-8628-4e7ada94eac6\") " pod="openstack/watcher-api-0"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.399911    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba310e23-3097-4114-8628-4e7ada94eac6-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"ba310e23-3097-4114-8628-4e7ada94eac6\") " pod="openstack/watcher-api-0"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.399977    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ba310e23-3097-4114-8628-4e7ada94eac6-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"ba310e23-3097-4114-8628-4e7ada94eac6\") " pod="openstack/watcher-api-0"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.400235    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba310e23-3097-4114-8628-4e7ada94eac6-logs\") pod \"watcher-api-0\" (UID: \"ba310e23-3097-4114-8628-4e7ada94eac6\") " pod="openstack/watcher-api-0"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.400319    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-252wn\" (UniqueName: \"kubernetes.io/projected/ba310e23-3097-4114-8628-4e7ada94eac6-kube-api-access-252wn\") pod \"watcher-api-0\" (UID: \"ba310e23-3097-4114-8628-4e7ada94eac6\") " pod="openstack/watcher-api-0"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.400354    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba310e23-3097-4114-8628-4e7ada94eac6-public-tls-certs\") pod \"watcher-api-0\" (UID: \"ba310e23-3097-4114-8628-4e7ada94eac6\") " pod="openstack/watcher-api-0"
Mar 20 16:02:18 crc kubenswrapper[4730]: E0320 16:02:18.405975    4730 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="06e4300ad2aa0cb045945f92073482ca78016e812eb685af49fb195ce150c681" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Mar 20 16:02:18 crc kubenswrapper[4730]: E0320 16:02:18.407803    4730 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="06e4300ad2aa0cb045945f92073482ca78016e812eb685af49fb195ce150c681" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Mar 20 16:02:18 crc kubenswrapper[4730]: E0320 16:02:18.409214    4730 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="06e4300ad2aa0cb045945f92073482ca78016e812eb685af49fb195ce150c681" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Mar 20 16:02:18 crc kubenswrapper[4730]: E0320 16:02:18.409317    4730 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0" containerName="watcher-applier"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.502602    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba310e23-3097-4114-8628-4e7ada94eac6-logs\") pod \"watcher-api-0\" (UID: \"ba310e23-3097-4114-8628-4e7ada94eac6\") " pod="openstack/watcher-api-0"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.502675    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-252wn\" (UniqueName: \"kubernetes.io/projected/ba310e23-3097-4114-8628-4e7ada94eac6-kube-api-access-252wn\") pod \"watcher-api-0\" (UID: \"ba310e23-3097-4114-8628-4e7ada94eac6\") " pod="openstack/watcher-api-0"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.502703    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba310e23-3097-4114-8628-4e7ada94eac6-public-tls-certs\") pod \"watcher-api-0\" (UID: \"ba310e23-3097-4114-8628-4e7ada94eac6\") " pod="openstack/watcher-api-0"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.502748    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba310e23-3097-4114-8628-4e7ada94eac6-config-data\") pod \"watcher-api-0\" (UID: \"ba310e23-3097-4114-8628-4e7ada94eac6\") " pod="openstack/watcher-api-0"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.502815    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba310e23-3097-4114-8628-4e7ada94eac6-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"ba310e23-3097-4114-8628-4e7ada94eac6\") " pod="openstack/watcher-api-0"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.502901    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba310e23-3097-4114-8628-4e7ada94eac6-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"ba310e23-3097-4114-8628-4e7ada94eac6\") " pod="openstack/watcher-api-0"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.502931    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ba310e23-3097-4114-8628-4e7ada94eac6-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"ba310e23-3097-4114-8628-4e7ada94eac6\") " pod="openstack/watcher-api-0"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.506767    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba310e23-3097-4114-8628-4e7ada94eac6-logs\") pod \"watcher-api-0\" (UID: \"ba310e23-3097-4114-8628-4e7ada94eac6\") " pod="openstack/watcher-api-0"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.507126    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba310e23-3097-4114-8628-4e7ada94eac6-public-tls-certs\") pod \"watcher-api-0\" (UID: \"ba310e23-3097-4114-8628-4e7ada94eac6\") " pod="openstack/watcher-api-0"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.507599    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba310e23-3097-4114-8628-4e7ada94eac6-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"ba310e23-3097-4114-8628-4e7ada94eac6\") " pod="openstack/watcher-api-0"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.509792    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ba310e23-3097-4114-8628-4e7ada94eac6-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"ba310e23-3097-4114-8628-4e7ada94eac6\") " pod="openstack/watcher-api-0"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.510236    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba310e23-3097-4114-8628-4e7ada94eac6-config-data\") pod \"watcher-api-0\" (UID: \"ba310e23-3097-4114-8628-4e7ada94eac6\") " pod="openstack/watcher-api-0"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.513746    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba310e23-3097-4114-8628-4e7ada94eac6-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"ba310e23-3097-4114-8628-4e7ada94eac6\") " pod="openstack/watcher-api-0"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.524111    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-252wn\" (UniqueName: \"kubernetes.io/projected/ba310e23-3097-4114-8628-4e7ada94eac6-kube-api-access-252wn\") pod \"watcher-api-0\" (UID: \"ba310e23-3097-4114-8628-4e7ada94eac6\") " pod="openstack/watcher-api-0"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.541655    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.541914    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.564933    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.566226    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.614902    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.622629    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.771232    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.853382    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-654455944c-qph9q"
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.938332    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76cb94d47c-txmh6"]
Mar 20 16:02:18 crc kubenswrapper[4730]: I0320 16:02:18.938570    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76cb94d47c-txmh6" podUID="ff335b2a-909a-4c39-a045-2267c73ac8b2" containerName="dnsmasq-dns" containerID="cri-o://1a5f9090983f451fb355cec1fdd913ee82666d2f07f611bcec31819f098f8126" gracePeriod=10
Mar 20 16:02:19 crc kubenswrapper[4730]: I0320 16:02:19.221981    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"23319f08-4294-49cc-bb24-7a01520e37c6","Type":"ContainerStarted","Data":"a527ffbd2a54fec297ae5f5968aa11134c7ecf139f953464b50d8aaf8a48e15d"}
Mar 20 16:02:19 crc kubenswrapper[4730]: I0320 16:02:19.222262    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"23319f08-4294-49cc-bb24-7a01520e37c6","Type":"ContainerStarted","Data":"a0c082b53bef3328cda841db37115f58cfc064af02b9d184944305bba914d52b"}
Mar 20 16:02:19 crc kubenswrapper[4730]: I0320 16:02:19.222272    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"23319f08-4294-49cc-bb24-7a01520e37c6","Type":"ContainerStarted","Data":"1a4e509ba032eff859c3dcc5260a9f2631d938aec766c95550415d37207eae1a"}
Mar 20 16:02:19 crc kubenswrapper[4730]: I0320 16:02:19.253943    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.253916722 podStartE2EDuration="2.253916722s" podCreationTimestamp="2026-03-20 16:02:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:02:19.244484111 +0000 UTC m=+1398.457855480" watchObservedRunningTime="2026-03-20 16:02:19.253916722 +0000 UTC m=+1398.467288091"
Mar 20 16:02:19 crc kubenswrapper[4730]: I0320 16:02:19.262932    4730 generic.go:334] "Generic (PLEG): container finished" podID="ff335b2a-909a-4c39-a045-2267c73ac8b2" containerID="1a5f9090983f451fb355cec1fdd913ee82666d2f07f611bcec31819f098f8126" exitCode=0
Mar 20 16:02:19 crc kubenswrapper[4730]: I0320 16:02:19.263937    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76cb94d47c-txmh6" event={"ID":"ff335b2a-909a-4c39-a045-2267c73ac8b2","Type":"ContainerDied","Data":"1a5f9090983f451fb355cec1fdd913ee82666d2f07f611bcec31819f098f8126"}
Mar 20 16:02:19 crc kubenswrapper[4730]: I0320 16:02:19.306845    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 20 16:02:19 crc kubenswrapper[4730]: I0320 16:02:19.392155    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Mar 20 16:02:19 crc kubenswrapper[4730]: I0320 16:02:19.562682    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a" path="/var/lib/kubelet/pods/3a179d19-c83c-4bc1-b74a-d77d6a6b7d2a/volumes"
Mar 20 16:02:19 crc kubenswrapper[4730]: I0320 16:02:19.627856    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="585eb246-c6cd-4641-a7fb-86d2ef87e31e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.204:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 20 16:02:19 crc kubenswrapper[4730]: I0320 16:02:19.628395    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="585eb246-c6cd-4641-a7fb-86d2ef87e31e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.204:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 20 16:02:19 crc kubenswrapper[4730]: I0320 16:02:19.734653    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76cb94d47c-txmh6"
Mar 20 16:02:19 crc kubenswrapper[4730]: I0320 16:02:19.770049    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff335b2a-909a-4c39-a045-2267c73ac8b2-config\") pod \"ff335b2a-909a-4c39-a045-2267c73ac8b2\" (UID: \"ff335b2a-909a-4c39-a045-2267c73ac8b2\") "
Mar 20 16:02:19 crc kubenswrapper[4730]: I0320 16:02:19.770166    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff335b2a-909a-4c39-a045-2267c73ac8b2-ovsdbserver-sb\") pod \"ff335b2a-909a-4c39-a045-2267c73ac8b2\" (UID: \"ff335b2a-909a-4c39-a045-2267c73ac8b2\") "
Mar 20 16:02:19 crc kubenswrapper[4730]: I0320 16:02:19.770199    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff335b2a-909a-4c39-a045-2267c73ac8b2-dns-svc\") pod \"ff335b2a-909a-4c39-a045-2267c73ac8b2\" (UID: \"ff335b2a-909a-4c39-a045-2267c73ac8b2\") "
Mar 20 16:02:19 crc kubenswrapper[4730]: I0320 16:02:19.770226    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7njgl\" (UniqueName: \"kubernetes.io/projected/ff335b2a-909a-4c39-a045-2267c73ac8b2-kube-api-access-7njgl\") pod \"ff335b2a-909a-4c39-a045-2267c73ac8b2\" (UID: \"ff335b2a-909a-4c39-a045-2267c73ac8b2\") "
Mar 20 16:02:19 crc kubenswrapper[4730]: I0320 16:02:19.770344    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff335b2a-909a-4c39-a045-2267c73ac8b2-ovsdbserver-nb\") pod \"ff335b2a-909a-4c39-a045-2267c73ac8b2\" (UID: \"ff335b2a-909a-4c39-a045-2267c73ac8b2\") "
Mar 20 16:02:19 crc kubenswrapper[4730]: I0320 16:02:19.770376    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff335b2a-909a-4c39-a045-2267c73ac8b2-dns-swift-storage-0\") pod \"ff335b2a-909a-4c39-a045-2267c73ac8b2\" (UID: \"ff335b2a-909a-4c39-a045-2267c73ac8b2\") "
Mar 20 16:02:19 crc kubenswrapper[4730]: I0320 16:02:19.775333    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff335b2a-909a-4c39-a045-2267c73ac8b2-kube-api-access-7njgl" (OuterVolumeSpecName: "kube-api-access-7njgl") pod "ff335b2a-909a-4c39-a045-2267c73ac8b2" (UID: "ff335b2a-909a-4c39-a045-2267c73ac8b2"). InnerVolumeSpecName "kube-api-access-7njgl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:02:19 crc kubenswrapper[4730]: I0320 16:02:19.839073    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff335b2a-909a-4c39-a045-2267c73ac8b2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ff335b2a-909a-4c39-a045-2267c73ac8b2" (UID: "ff335b2a-909a-4c39-a045-2267c73ac8b2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:02:19 crc kubenswrapper[4730]: I0320 16:02:19.873783    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7njgl\" (UniqueName: \"kubernetes.io/projected/ff335b2a-909a-4c39-a045-2267c73ac8b2-kube-api-access-7njgl\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:19 crc kubenswrapper[4730]: I0320 16:02:19.873818    4730 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff335b2a-909a-4c39-a045-2267c73ac8b2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.046104    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.047322    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff335b2a-909a-4c39-a045-2267c73ac8b2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ff335b2a-909a-4c39-a045-2267c73ac8b2" (UID: "ff335b2a-909a-4c39-a045-2267c73ac8b2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.066490    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff335b2a-909a-4c39-a045-2267c73ac8b2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ff335b2a-909a-4c39-a045-2267c73ac8b2" (UID: "ff335b2a-909a-4c39-a045-2267c73ac8b2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.066593    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff335b2a-909a-4c39-a045-2267c73ac8b2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ff335b2a-909a-4c39-a045-2267c73ac8b2" (UID: "ff335b2a-909a-4c39-a045-2267c73ac8b2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.078459    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0-config-data\") pod \"5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0\" (UID: \"5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0\") "
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.078659    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0-logs\") pod \"5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0\" (UID: \"5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0\") "
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.078732    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0-combined-ca-bundle\") pod \"5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0\" (UID: \"5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0\") "
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.078817    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djjz2\" (UniqueName: \"kubernetes.io/projected/5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0-kube-api-access-djjz2\") pod \"5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0\" (UID: \"5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0\") "
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.079402    4730 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff335b2a-909a-4c39-a045-2267c73ac8b2-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.079422    4730 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff335b2a-909a-4c39-a045-2267c73ac8b2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.079433    4730 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff335b2a-909a-4c39-a045-2267c73ac8b2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.079443    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0-logs" (OuterVolumeSpecName: "logs") pod "5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0" (UID: "5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.080715    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff335b2a-909a-4c39-a045-2267c73ac8b2-config" (OuterVolumeSpecName: "config") pod "ff335b2a-909a-4c39-a045-2267c73ac8b2" (UID: "ff335b2a-909a-4c39-a045-2267c73ac8b2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.095019    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0-kube-api-access-djjz2" (OuterVolumeSpecName: "kube-api-access-djjz2") pod "5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0" (UID: "5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0"). InnerVolumeSpecName "kube-api-access-djjz2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.171692    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0" (UID: "5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.173833    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0-config-data" (OuterVolumeSpecName: "config-data") pod "5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0" (UID: "5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.181170    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.181205    4730 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0-logs\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.181215    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.181226    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djjz2\" (UniqueName: \"kubernetes.io/projected/5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0-kube-api-access-djjz2\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.181237    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff335b2a-909a-4c39-a045-2267c73ac8b2-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.276821    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c192a384-369a-4011-bbd0-af10cf958010","Type":"ContainerStarted","Data":"2e40439ca8128c74c4bde82b12fa3c90a57c5ca9aa2ec703f4bb34d1783c4a41"}
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.278453    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.286963    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76cb94d47c-txmh6" event={"ID":"ff335b2a-909a-4c39-a045-2267c73ac8b2","Type":"ContainerDied","Data":"bb5085b7efdc43fccc1330db4f8eeb5dbaa633cfdd11172926159e432c750243"}
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.287022    4730 scope.go:117] "RemoveContainer" containerID="1a5f9090983f451fb355cec1fdd913ee82666d2f07f611bcec31819f098f8126"
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.287143    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76cb94d47c-txmh6"
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.293988    4730 generic.go:334] "Generic (PLEG): container finished" podID="5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0" containerID="06e4300ad2aa0cb045945f92073482ca78016e812eb685af49fb195ce150c681" exitCode=0
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.294107    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.294789    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0","Type":"ContainerDied","Data":"06e4300ad2aa0cb045945f92073482ca78016e812eb685af49fb195ce150c681"}
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.294833    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0","Type":"ContainerDied","Data":"505e09709aeaef8a87d638031cf49a2f3e2fbdc9ec33e31ca6d7f0d3b9378532"}
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.307370    4730 generic.go:334] "Generic (PLEG): container finished" podID="8f144e50-8d18-49a5-a3ef-84b72e6e119f" containerID="712419554ee7980049c60af4c6c43298daddafb72a74d5efc70ba46df50bba0e" exitCode=0
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.307450    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-f7rjc" event={"ID":"8f144e50-8d18-49a5-a3ef-84b72e6e119f","Type":"ContainerDied","Data":"712419554ee7980049c60af4c6c43298daddafb72a74d5efc70ba46df50bba0e"}
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.320074    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.111080671 podStartE2EDuration="6.320043247s" podCreationTimestamp="2026-03-20 16:02:14 +0000 UTC" firstStartedPulling="2026-03-20 16:02:15.379929095 +0000 UTC m=+1394.593300464" lastFinishedPulling="2026-03-20 16:02:19.588891671 +0000 UTC m=+1398.802263040" observedRunningTime="2026-03-20 16:02:20.302649397 +0000 UTC m=+1399.516020766" watchObservedRunningTime="2026-03-20 16:02:20.320043247 +0000 UTC m=+1399.533414616"
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.329692    4730 scope.go:117] "RemoveContainer" containerID="e17867e23320294cb0b9160d421d696c692c189cb8d85e17cafc60d15c897466"
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.337552    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"ba310e23-3097-4114-8628-4e7ada94eac6","Type":"ContainerStarted","Data":"2383b07d3519cde613c64363468a9df1e002f0bdb657587ae10eb916ce433034"}
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.337613    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"ba310e23-3097-4114-8628-4e7ada94eac6","Type":"ContainerStarted","Data":"ae8a9c3901658c0acca61c344520be9c6936365074f0cf75db8856e5b7cb9951"}
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.373145    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76cb94d47c-txmh6"]
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.388330    4730 scope.go:117] "RemoveContainer" containerID="06e4300ad2aa0cb045945f92073482ca78016e812eb685af49fb195ce150c681"
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.394423    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76cb94d47c-txmh6"]
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.407355    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"]
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.440045    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-applier-0"]
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.440120    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"]
Mar 20 16:02:20 crc kubenswrapper[4730]: E0320 16:02:20.441530    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0" containerName="watcher-applier"
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.441554    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0" containerName="watcher-applier"
Mar 20 16:02:20 crc kubenswrapper[4730]: E0320 16:02:20.441586    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff335b2a-909a-4c39-a045-2267c73ac8b2" containerName="init"
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.441594    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff335b2a-909a-4c39-a045-2267c73ac8b2" containerName="init"
Mar 20 16:02:20 crc kubenswrapper[4730]: E0320 16:02:20.441617    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff335b2a-909a-4c39-a045-2267c73ac8b2" containerName="dnsmasq-dns"
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.441622    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff335b2a-909a-4c39-a045-2267c73ac8b2" containerName="dnsmasq-dns"
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.441817    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0" containerName="watcher-applier"
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.441846    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff335b2a-909a-4c39-a045-2267c73ac8b2" containerName="dnsmasq-dns"
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.442539    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.445342    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data"
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.451002    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"]
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.477442    4730 scope.go:117] "RemoveContainer" containerID="06e4300ad2aa0cb045945f92073482ca78016e812eb685af49fb195ce150c681"
Mar 20 16:02:20 crc kubenswrapper[4730]: E0320 16:02:20.479192    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06e4300ad2aa0cb045945f92073482ca78016e812eb685af49fb195ce150c681\": container with ID starting with 06e4300ad2aa0cb045945f92073482ca78016e812eb685af49fb195ce150c681 not found: ID does not exist" containerID="06e4300ad2aa0cb045945f92073482ca78016e812eb685af49fb195ce150c681"
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.479234    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06e4300ad2aa0cb045945f92073482ca78016e812eb685af49fb195ce150c681"} err="failed to get container status \"06e4300ad2aa0cb045945f92073482ca78016e812eb685af49fb195ce150c681\": rpc error: code = NotFound desc = could not find container \"06e4300ad2aa0cb045945f92073482ca78016e812eb685af49fb195ce150c681\": container with ID starting with 06e4300ad2aa0cb045945f92073482ca78016e812eb685af49fb195ce150c681 not found: ID does not exist"
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.488097    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d05b7e1-a651-404e-89e9-8276427610fc-logs\") pod \"watcher-applier-0\" (UID: \"0d05b7e1-a651-404e-89e9-8276427610fc\") " pod="openstack/watcher-applier-0"
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.488148    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d05b7e1-a651-404e-89e9-8276427610fc-config-data\") pod \"watcher-applier-0\" (UID: \"0d05b7e1-a651-404e-89e9-8276427610fc\") " pod="openstack/watcher-applier-0"
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.488239    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d05b7e1-a651-404e-89e9-8276427610fc-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"0d05b7e1-a651-404e-89e9-8276427610fc\") " pod="openstack/watcher-applier-0"
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.488317    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79zbq\" (UniqueName: \"kubernetes.io/projected/0d05b7e1-a651-404e-89e9-8276427610fc-kube-api-access-79zbq\") pod \"watcher-applier-0\" (UID: \"0d05b7e1-a651-404e-89e9-8276427610fc\") " pod="openstack/watcher-applier-0"
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.591058    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d05b7e1-a651-404e-89e9-8276427610fc-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"0d05b7e1-a651-404e-89e9-8276427610fc\") " pod="openstack/watcher-applier-0"
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.591181    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79zbq\" (UniqueName: \"kubernetes.io/projected/0d05b7e1-a651-404e-89e9-8276427610fc-kube-api-access-79zbq\") pod \"watcher-applier-0\" (UID: \"0d05b7e1-a651-404e-89e9-8276427610fc\") " pod="openstack/watcher-applier-0"
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.591342    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d05b7e1-a651-404e-89e9-8276427610fc-logs\") pod \"watcher-applier-0\" (UID: \"0d05b7e1-a651-404e-89e9-8276427610fc\") " pod="openstack/watcher-applier-0"
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.591363    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d05b7e1-a651-404e-89e9-8276427610fc-config-data\") pod \"watcher-applier-0\" (UID: \"0d05b7e1-a651-404e-89e9-8276427610fc\") " pod="openstack/watcher-applier-0"
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.592644    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d05b7e1-a651-404e-89e9-8276427610fc-logs\") pod \"watcher-applier-0\" (UID: \"0d05b7e1-a651-404e-89e9-8276427610fc\") " pod="openstack/watcher-applier-0"
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.598324    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d05b7e1-a651-404e-89e9-8276427610fc-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"0d05b7e1-a651-404e-89e9-8276427610fc\") " pod="openstack/watcher-applier-0"
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.600002    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d05b7e1-a651-404e-89e9-8276427610fc-config-data\") pod \"watcher-applier-0\" (UID: \"0d05b7e1-a651-404e-89e9-8276427610fc\") " pod="openstack/watcher-applier-0"
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.611113    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79zbq\" (UniqueName: \"kubernetes.io/projected/0d05b7e1-a651-404e-89e9-8276427610fc-kube-api-access-79zbq\") pod \"watcher-applier-0\" (UID: \"0d05b7e1-a651-404e-89e9-8276427610fc\") " pod="openstack/watcher-applier-0"
Mar 20 16:02:20 crc kubenswrapper[4730]: I0320 16:02:20.766739    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Mar 20 16:02:21 crc kubenswrapper[4730]: E0320 16:02:21.149528    4730 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod279d2368_abe1_465a_9007_68542e5dbfc4.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod279d2368_abe1_465a_9007_68542e5dbfc4.slice/crio-da2c214fcdb33fae608237a0fb1d01559481f0b67a4afa1e6c930298a64b75a2\": RecentStats: unable to find data in memory cache]"
Mar 20 16:02:21 crc kubenswrapper[4730]: I0320 16:02:21.294110    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"]
Mar 20 16:02:21 crc kubenswrapper[4730]: W0320 16:02:21.297991    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d05b7e1_a651_404e_89e9_8276427610fc.slice/crio-a88aa72f302dc4cb94da7ed72334760b9efcfb8dbdce8e9f4770d7b90a1a9939 WatchSource:0}: Error finding container a88aa72f302dc4cb94da7ed72334760b9efcfb8dbdce8e9f4770d7b90a1a9939: Status 404 returned error can't find the container with id a88aa72f302dc4cb94da7ed72334760b9efcfb8dbdce8e9f4770d7b90a1a9939
Mar 20 16:02:21 crc kubenswrapper[4730]: I0320 16:02:21.355148    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"0d05b7e1-a651-404e-89e9-8276427610fc","Type":"ContainerStarted","Data":"a88aa72f302dc4cb94da7ed72334760b9efcfb8dbdce8e9f4770d7b90a1a9939"}
Mar 20 16:02:21 crc kubenswrapper[4730]: I0320 16:02:21.357378    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"ba310e23-3097-4114-8628-4e7ada94eac6","Type":"ContainerStarted","Data":"c956e77723ace7f162ae394dd01ecd3232795058c3ddc5312f368da025e800eb"}
Mar 20 16:02:21 crc kubenswrapper[4730]: I0320 16:02:21.387547    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=3.387525492 podStartE2EDuration="3.387525492s" podCreationTimestamp="2026-03-20 16:02:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:02:21.3780454 +0000 UTC m=+1400.591416769" watchObservedRunningTime="2026-03-20 16:02:21.387525492 +0000 UTC m=+1400.600896861"
Mar 20 16:02:21 crc kubenswrapper[4730]: I0320 16:02:21.568852    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0" path="/var/lib/kubelet/pods/5bd7fdc7-f9da-4f44-98d3-7b86f541b9f0/volumes"
Mar 20 16:02:21 crc kubenswrapper[4730]: I0320 16:02:21.579965    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff335b2a-909a-4c39-a045-2267c73ac8b2" path="/var/lib/kubelet/pods/ff335b2a-909a-4c39-a045-2267c73ac8b2/volumes"
Mar 20 16:02:21 crc kubenswrapper[4730]: I0320 16:02:21.828262    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-f7rjc"
Mar 20 16:02:22 crc kubenswrapper[4730]: I0320 16:02:22.020435    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f144e50-8d18-49a5-a3ef-84b72e6e119f-config-data\") pod \"8f144e50-8d18-49a5-a3ef-84b72e6e119f\" (UID: \"8f144e50-8d18-49a5-a3ef-84b72e6e119f\") "
Mar 20 16:02:22 crc kubenswrapper[4730]: I0320 16:02:22.020530    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dn84\" (UniqueName: \"kubernetes.io/projected/8f144e50-8d18-49a5-a3ef-84b72e6e119f-kube-api-access-4dn84\") pod \"8f144e50-8d18-49a5-a3ef-84b72e6e119f\" (UID: \"8f144e50-8d18-49a5-a3ef-84b72e6e119f\") "
Mar 20 16:02:22 crc kubenswrapper[4730]: I0320 16:02:22.020582    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f144e50-8d18-49a5-a3ef-84b72e6e119f-scripts\") pod \"8f144e50-8d18-49a5-a3ef-84b72e6e119f\" (UID: \"8f144e50-8d18-49a5-a3ef-84b72e6e119f\") "
Mar 20 16:02:22 crc kubenswrapper[4730]: I0320 16:02:22.020646    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f144e50-8d18-49a5-a3ef-84b72e6e119f-combined-ca-bundle\") pod \"8f144e50-8d18-49a5-a3ef-84b72e6e119f\" (UID: \"8f144e50-8d18-49a5-a3ef-84b72e6e119f\") "
Mar 20 16:02:22 crc kubenswrapper[4730]: I0320 16:02:22.025261    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f144e50-8d18-49a5-a3ef-84b72e6e119f-kube-api-access-4dn84" (OuterVolumeSpecName: "kube-api-access-4dn84") pod "8f144e50-8d18-49a5-a3ef-84b72e6e119f" (UID: "8f144e50-8d18-49a5-a3ef-84b72e6e119f"). InnerVolumeSpecName "kube-api-access-4dn84". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:02:22 crc kubenswrapper[4730]: I0320 16:02:22.032665    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f144e50-8d18-49a5-a3ef-84b72e6e119f-scripts" (OuterVolumeSpecName: "scripts") pod "8f144e50-8d18-49a5-a3ef-84b72e6e119f" (UID: "8f144e50-8d18-49a5-a3ef-84b72e6e119f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:22 crc kubenswrapper[4730]: I0320 16:02:22.065364    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f144e50-8d18-49a5-a3ef-84b72e6e119f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f144e50-8d18-49a5-a3ef-84b72e6e119f" (UID: "8f144e50-8d18-49a5-a3ef-84b72e6e119f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:22 crc kubenswrapper[4730]: I0320 16:02:22.065386    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f144e50-8d18-49a5-a3ef-84b72e6e119f-config-data" (OuterVolumeSpecName: "config-data") pod "8f144e50-8d18-49a5-a3ef-84b72e6e119f" (UID: "8f144e50-8d18-49a5-a3ef-84b72e6e119f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:22 crc kubenswrapper[4730]: I0320 16:02:22.123976    4730 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f144e50-8d18-49a5-a3ef-84b72e6e119f-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:22 crc kubenswrapper[4730]: I0320 16:02:22.124020    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f144e50-8d18-49a5-a3ef-84b72e6e119f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:22 crc kubenswrapper[4730]: I0320 16:02:22.124035    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f144e50-8d18-49a5-a3ef-84b72e6e119f-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:22 crc kubenswrapper[4730]: I0320 16:02:22.124046    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dn84\" (UniqueName: \"kubernetes.io/projected/8f144e50-8d18-49a5-a3ef-84b72e6e119f-kube-api-access-4dn84\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:22 crc kubenswrapper[4730]: I0320 16:02:22.370366    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"0d05b7e1-a651-404e-89e9-8276427610fc","Type":"ContainerStarted","Data":"7b930a07c6f095dab4008f80ae7f0a0319d912199b4e910c344255dd178b75f0"}
Mar 20 16:02:22 crc kubenswrapper[4730]: I0320 16:02:22.372274    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-f7rjc" event={"ID":"8f144e50-8d18-49a5-a3ef-84b72e6e119f","Type":"ContainerDied","Data":"e3b08dc7c4abd2dcaea34ec7c791ef1e1b1a8fa905aae37ea25e912f633e57a6"}
Mar 20 16:02:22 crc kubenswrapper[4730]: I0320 16:02:22.372306    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3b08dc7c4abd2dcaea34ec7c791ef1e1b1a8fa905aae37ea25e912f633e57a6"
Mar 20 16:02:22 crc kubenswrapper[4730]: I0320 16:02:22.372303    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-f7rjc"
Mar 20 16:02:22 crc kubenswrapper[4730]: I0320 16:02:22.372688    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Mar 20 16:02:22 crc kubenswrapper[4730]: I0320 16:02:22.397950    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=2.397929216 podStartE2EDuration="2.397929216s" podCreationTimestamp="2026-03-20 16:02:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:02:22.386191289 +0000 UTC m=+1401.599562658" watchObservedRunningTime="2026-03-20 16:02:22.397929216 +0000 UTC m=+1401.611300585"
Mar 20 16:02:22 crc kubenswrapper[4730]: I0320 16:02:22.523318    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:02:22 crc kubenswrapper[4730]: I0320 16:02:22.523568    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="585eb246-c6cd-4641-a7fb-86d2ef87e31e" containerName="nova-api-log" containerID="cri-o://605dab74894b941f8dc47934657056d0f993430ffbd34d8c7cb723a73fb546d4" gracePeriod=30
Mar 20 16:02:22 crc kubenswrapper[4730]: I0320 16:02:22.523710    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="585eb246-c6cd-4641-a7fb-86d2ef87e31e" containerName="nova-api-api" containerID="cri-o://077bbf75eaae181c0e1146dfa3d57949104bdbba5a29e133956ce977816562e9" gracePeriod=30
Mar 20 16:02:22 crc kubenswrapper[4730]: I0320 16:02:22.570864    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 16:02:22 crc kubenswrapper[4730]: I0320 16:02:22.571087    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="01c93007-4d31-47a2-810c-819caf917e43" containerName="nova-scheduler-scheduler" containerID="cri-o://68063e072de62260e0731b15d4d132f06b3322eede18b910d69b064464dffc8d" gracePeriod=30
Mar 20 16:02:22 crc kubenswrapper[4730]: I0320 16:02:22.585234    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 16:02:22 crc kubenswrapper[4730]: I0320 16:02:22.585514    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="23319f08-4294-49cc-bb24-7a01520e37c6" containerName="nova-metadata-log" containerID="cri-o://a0c082b53bef3328cda841db37115f58cfc064af02b9d184944305bba914d52b" gracePeriod=30
Mar 20 16:02:22 crc kubenswrapper[4730]: I0320 16:02:22.586038    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="23319f08-4294-49cc-bb24-7a01520e37c6" containerName="nova-metadata-metadata" containerID="cri-o://a527ffbd2a54fec297ae5f5968aa11134c7ecf139f953464b50d8aaf8a48e15d" gracePeriod=30
Mar 20 16:02:23 crc kubenswrapper[4730]: I0320 16:02:23.390158    4730 generic.go:334] "Generic (PLEG): container finished" podID="585eb246-c6cd-4641-a7fb-86d2ef87e31e" containerID="605dab74894b941f8dc47934657056d0f993430ffbd34d8c7cb723a73fb546d4" exitCode=143
Mar 20 16:02:23 crc kubenswrapper[4730]: I0320 16:02:23.390435    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"585eb246-c6cd-4641-a7fb-86d2ef87e31e","Type":"ContainerDied","Data":"605dab74894b941f8dc47934657056d0f993430ffbd34d8c7cb723a73fb546d4"}
Mar 20 16:02:23 crc kubenswrapper[4730]: I0320 16:02:23.392931    4730 generic.go:334] "Generic (PLEG): container finished" podID="23319f08-4294-49cc-bb24-7a01520e37c6" containerID="a527ffbd2a54fec297ae5f5968aa11134c7ecf139f953464b50d8aaf8a48e15d" exitCode=0
Mar 20 16:02:23 crc kubenswrapper[4730]: I0320 16:02:23.392948    4730 generic.go:334] "Generic (PLEG): container finished" podID="23319f08-4294-49cc-bb24-7a01520e37c6" containerID="a0c082b53bef3328cda841db37115f58cfc064af02b9d184944305bba914d52b" exitCode=143
Mar 20 16:02:23 crc kubenswrapper[4730]: I0320 16:02:23.393012    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"23319f08-4294-49cc-bb24-7a01520e37c6","Type":"ContainerDied","Data":"a527ffbd2a54fec297ae5f5968aa11134c7ecf139f953464b50d8aaf8a48e15d"}
Mar 20 16:02:23 crc kubenswrapper[4730]: I0320 16:02:23.393038    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"23319f08-4294-49cc-bb24-7a01520e37c6","Type":"ContainerDied","Data":"a0c082b53bef3328cda841db37115f58cfc064af02b9d184944305bba914d52b"}
Mar 20 16:02:23 crc kubenswrapper[4730]: I0320 16:02:23.561138    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 16:02:23 crc kubenswrapper[4730]: E0320 16:02:23.568911    4730 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="68063e072de62260e0731b15d4d132f06b3322eede18b910d69b064464dffc8d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 20 16:02:23 crc kubenswrapper[4730]: E0320 16:02:23.571450    4730 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="68063e072de62260e0731b15d4d132f06b3322eede18b910d69b064464dffc8d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 20 16:02:23 crc kubenswrapper[4730]: E0320 16:02:23.577674    4730 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="68063e072de62260e0731b15d4d132f06b3322eede18b910d69b064464dffc8d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 20 16:02:23 crc kubenswrapper[4730]: E0320 16:02:23.577792    4730 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="01c93007-4d31-47a2-810c-819caf917e43" containerName="nova-scheduler-scheduler"
Mar 20 16:02:23 crc kubenswrapper[4730]: I0320 16:02:23.623989    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Mar 20 16:02:23 crc kubenswrapper[4730]: I0320 16:02:23.658632    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/23319f08-4294-49cc-bb24-7a01520e37c6-nova-metadata-tls-certs\") pod \"23319f08-4294-49cc-bb24-7a01520e37c6\" (UID: \"23319f08-4294-49cc-bb24-7a01520e37c6\") "
Mar 20 16:02:23 crc kubenswrapper[4730]: I0320 16:02:23.658780    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23319f08-4294-49cc-bb24-7a01520e37c6-config-data\") pod \"23319f08-4294-49cc-bb24-7a01520e37c6\" (UID: \"23319f08-4294-49cc-bb24-7a01520e37c6\") "
Mar 20 16:02:23 crc kubenswrapper[4730]: I0320 16:02:23.658950    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23319f08-4294-49cc-bb24-7a01520e37c6-combined-ca-bundle\") pod \"23319f08-4294-49cc-bb24-7a01520e37c6\" (UID: \"23319f08-4294-49cc-bb24-7a01520e37c6\") "
Mar 20 16:02:23 crc kubenswrapper[4730]: I0320 16:02:23.659014    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ctt8\" (UniqueName: \"kubernetes.io/projected/23319f08-4294-49cc-bb24-7a01520e37c6-kube-api-access-7ctt8\") pod \"23319f08-4294-49cc-bb24-7a01520e37c6\" (UID: \"23319f08-4294-49cc-bb24-7a01520e37c6\") "
Mar 20 16:02:23 crc kubenswrapper[4730]: I0320 16:02:23.659047    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23319f08-4294-49cc-bb24-7a01520e37c6-logs\") pod \"23319f08-4294-49cc-bb24-7a01520e37c6\" (UID: \"23319f08-4294-49cc-bb24-7a01520e37c6\") "
Mar 20 16:02:23 crc kubenswrapper[4730]: I0320 16:02:23.659838    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23319f08-4294-49cc-bb24-7a01520e37c6-logs" (OuterVolumeSpecName: "logs") pod "23319f08-4294-49cc-bb24-7a01520e37c6" (UID: "23319f08-4294-49cc-bb24-7a01520e37c6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:02:23 crc kubenswrapper[4730]: I0320 16:02:23.661060    4730 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23319f08-4294-49cc-bb24-7a01520e37c6-logs\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:23 crc kubenswrapper[4730]: I0320 16:02:23.665675    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23319f08-4294-49cc-bb24-7a01520e37c6-kube-api-access-7ctt8" (OuterVolumeSpecName: "kube-api-access-7ctt8") pod "23319f08-4294-49cc-bb24-7a01520e37c6" (UID: "23319f08-4294-49cc-bb24-7a01520e37c6"). InnerVolumeSpecName "kube-api-access-7ctt8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:02:23 crc kubenswrapper[4730]: I0320 16:02:23.696628    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23319f08-4294-49cc-bb24-7a01520e37c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23319f08-4294-49cc-bb24-7a01520e37c6" (UID: "23319f08-4294-49cc-bb24-7a01520e37c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:23 crc kubenswrapper[4730]: I0320 16:02:23.698401    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23319f08-4294-49cc-bb24-7a01520e37c6-config-data" (OuterVolumeSpecName: "config-data") pod "23319f08-4294-49cc-bb24-7a01520e37c6" (UID: "23319f08-4294-49cc-bb24-7a01520e37c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:23 crc kubenswrapper[4730]: I0320 16:02:23.716938    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23319f08-4294-49cc-bb24-7a01520e37c6-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "23319f08-4294-49cc-bb24-7a01520e37c6" (UID: "23319f08-4294-49cc-bb24-7a01520e37c6"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:23 crc kubenswrapper[4730]: I0320 16:02:23.762513    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23319f08-4294-49cc-bb24-7a01520e37c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:23 crc kubenswrapper[4730]: I0320 16:02:23.762544    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ctt8\" (UniqueName: \"kubernetes.io/projected/23319f08-4294-49cc-bb24-7a01520e37c6-kube-api-access-7ctt8\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:23 crc kubenswrapper[4730]: I0320 16:02:23.762558    4730 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/23319f08-4294-49cc-bb24-7a01520e37c6-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:23 crc kubenswrapper[4730]: I0320 16:02:23.762611    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23319f08-4294-49cc-bb24-7a01520e37c6-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.404157    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"23319f08-4294-49cc-bb24-7a01520e37c6","Type":"ContainerDied","Data":"1a4e509ba032eff859c3dcc5260a9f2631d938aec766c95550415d37207eae1a"}
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.404207    4730 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.404183    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.404244    4730 scope.go:117] "RemoveContainer" containerID="a527ffbd2a54fec297ae5f5968aa11134c7ecf139f953464b50d8aaf8a48e15d"
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.458409    4730 scope.go:117] "RemoveContainer" containerID="a0c082b53bef3328cda841db37115f58cfc064af02b9d184944305bba914d52b"
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.555031    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.580891    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.597852    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 16:02:24 crc kubenswrapper[4730]: E0320 16:02:24.598721    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23319f08-4294-49cc-bb24-7a01520e37c6" containerName="nova-metadata-log"
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.598746    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="23319f08-4294-49cc-bb24-7a01520e37c6" containerName="nova-metadata-log"
Mar 20 16:02:24 crc kubenswrapper[4730]: E0320 16:02:24.598765    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23319f08-4294-49cc-bb24-7a01520e37c6" containerName="nova-metadata-metadata"
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.598772    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="23319f08-4294-49cc-bb24-7a01520e37c6" containerName="nova-metadata-metadata"
Mar 20 16:02:24 crc kubenswrapper[4730]: E0320 16:02:24.598793    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f144e50-8d18-49a5-a3ef-84b72e6e119f" containerName="nova-manage"
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.598801    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f144e50-8d18-49a5-a3ef-84b72e6e119f" containerName="nova-manage"
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.599189    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f144e50-8d18-49a5-a3ef-84b72e6e119f" containerName="nova-manage"
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.599218    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="23319f08-4294-49cc-bb24-7a01520e37c6" containerName="nova-metadata-log"
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.599238    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="23319f08-4294-49cc-bb24-7a01520e37c6" containerName="nova-metadata-metadata"
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.622159    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.622298    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.624596    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.624899    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.797349    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9bfb6c0-4971-4a58-aacc-17636a95b8a4-config-data\") pod \"nova-metadata-0\" (UID: \"c9bfb6c0-4971-4a58-aacc-17636a95b8a4\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.797400    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9bfb6c0-4971-4a58-aacc-17636a95b8a4-logs\") pod \"nova-metadata-0\" (UID: \"c9bfb6c0-4971-4a58-aacc-17636a95b8a4\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.797483    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9bfb6c0-4971-4a58-aacc-17636a95b8a4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c9bfb6c0-4971-4a58-aacc-17636a95b8a4\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.797556    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9bfb6c0-4971-4a58-aacc-17636a95b8a4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c9bfb6c0-4971-4a58-aacc-17636a95b8a4\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.797673    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xws9p\" (UniqueName: \"kubernetes.io/projected/c9bfb6c0-4971-4a58-aacc-17636a95b8a4-kube-api-access-xws9p\") pod \"nova-metadata-0\" (UID: \"c9bfb6c0-4971-4a58-aacc-17636a95b8a4\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.899401    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xws9p\" (UniqueName: \"kubernetes.io/projected/c9bfb6c0-4971-4a58-aacc-17636a95b8a4-kube-api-access-xws9p\") pod \"nova-metadata-0\" (UID: \"c9bfb6c0-4971-4a58-aacc-17636a95b8a4\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.899516    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9bfb6c0-4971-4a58-aacc-17636a95b8a4-config-data\") pod \"nova-metadata-0\" (UID: \"c9bfb6c0-4971-4a58-aacc-17636a95b8a4\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.899545    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9bfb6c0-4971-4a58-aacc-17636a95b8a4-logs\") pod \"nova-metadata-0\" (UID: \"c9bfb6c0-4971-4a58-aacc-17636a95b8a4\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.899606    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9bfb6c0-4971-4a58-aacc-17636a95b8a4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c9bfb6c0-4971-4a58-aacc-17636a95b8a4\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.899664    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9bfb6c0-4971-4a58-aacc-17636a95b8a4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c9bfb6c0-4971-4a58-aacc-17636a95b8a4\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.901118    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9bfb6c0-4971-4a58-aacc-17636a95b8a4-logs\") pod \"nova-metadata-0\" (UID: \"c9bfb6c0-4971-4a58-aacc-17636a95b8a4\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.907138    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9bfb6c0-4971-4a58-aacc-17636a95b8a4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c9bfb6c0-4971-4a58-aacc-17636a95b8a4\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.907602    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9bfb6c0-4971-4a58-aacc-17636a95b8a4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c9bfb6c0-4971-4a58-aacc-17636a95b8a4\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.909041    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9bfb6c0-4971-4a58-aacc-17636a95b8a4-config-data\") pod \"nova-metadata-0\" (UID: \"c9bfb6c0-4971-4a58-aacc-17636a95b8a4\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.918278    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xws9p\" (UniqueName: \"kubernetes.io/projected/c9bfb6c0-4971-4a58-aacc-17636a95b8a4-kube-api-access-xws9p\") pod \"nova-metadata-0\" (UID: \"c9bfb6c0-4971-4a58-aacc-17636a95b8a4\") " pod="openstack/nova-metadata-0"
Mar 20 16:02:24 crc kubenswrapper[4730]: I0320 16:02:24.947366    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.086985    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.205480    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/585eb246-c6cd-4641-a7fb-86d2ef87e31e-config-data\") pod \"585eb246-c6cd-4641-a7fb-86d2ef87e31e\" (UID: \"585eb246-c6cd-4641-a7fb-86d2ef87e31e\") "
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.205713    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/585eb246-c6cd-4641-a7fb-86d2ef87e31e-logs\") pod \"585eb246-c6cd-4641-a7fb-86d2ef87e31e\" (UID: \"585eb246-c6cd-4641-a7fb-86d2ef87e31e\") "
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.205757    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9r8b\" (UniqueName: \"kubernetes.io/projected/585eb246-c6cd-4641-a7fb-86d2ef87e31e-kube-api-access-l9r8b\") pod \"585eb246-c6cd-4641-a7fb-86d2ef87e31e\" (UID: \"585eb246-c6cd-4641-a7fb-86d2ef87e31e\") "
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.205831    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/585eb246-c6cd-4641-a7fb-86d2ef87e31e-combined-ca-bundle\") pod \"585eb246-c6cd-4641-a7fb-86d2ef87e31e\" (UID: \"585eb246-c6cd-4641-a7fb-86d2ef87e31e\") "
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.206541    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/585eb246-c6cd-4641-a7fb-86d2ef87e31e-logs" (OuterVolumeSpecName: "logs") pod "585eb246-c6cd-4641-a7fb-86d2ef87e31e" (UID: "585eb246-c6cd-4641-a7fb-86d2ef87e31e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.212689    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/585eb246-c6cd-4641-a7fb-86d2ef87e31e-kube-api-access-l9r8b" (OuterVolumeSpecName: "kube-api-access-l9r8b") pod "585eb246-c6cd-4641-a7fb-86d2ef87e31e" (UID: "585eb246-c6cd-4641-a7fb-86d2ef87e31e"). InnerVolumeSpecName "kube-api-access-l9r8b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.243412    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/585eb246-c6cd-4641-a7fb-86d2ef87e31e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "585eb246-c6cd-4641-a7fb-86d2ef87e31e" (UID: "585eb246-c6cd-4641-a7fb-86d2ef87e31e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.248638    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/585eb246-c6cd-4641-a7fb-86d2ef87e31e-config-data" (OuterVolumeSpecName: "config-data") pod "585eb246-c6cd-4641-a7fb-86d2ef87e31e" (UID: "585eb246-c6cd-4641-a7fb-86d2ef87e31e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.308436    4730 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/585eb246-c6cd-4641-a7fb-86d2ef87e31e-logs\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.308483    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9r8b\" (UniqueName: \"kubernetes.io/projected/585eb246-c6cd-4641-a7fb-86d2ef87e31e-kube-api-access-l9r8b\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.308499    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/585eb246-c6cd-4641-a7fb-86d2ef87e31e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.308511    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/585eb246-c6cd-4641-a7fb-86d2ef87e31e-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.337189    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.415012    4730 generic.go:334] "Generic (PLEG): container finished" podID="585eb246-c6cd-4641-a7fb-86d2ef87e31e" containerID="077bbf75eaae181c0e1146dfa3d57949104bdbba5a29e133956ce977816562e9" exitCode=0
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.415070    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"585eb246-c6cd-4641-a7fb-86d2ef87e31e","Type":"ContainerDied","Data":"077bbf75eaae181c0e1146dfa3d57949104bdbba5a29e133956ce977816562e9"}
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.415096    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"585eb246-c6cd-4641-a7fb-86d2ef87e31e","Type":"ContainerDied","Data":"b2e0e39bb6caaa7438ca18c26b9795ce5981a3e0bae88697a3af45f446cad74c"}
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.415114    4730 scope.go:117] "RemoveContainer" containerID="077bbf75eaae181c0e1146dfa3d57949104bdbba5a29e133956ce977816562e9"
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.415221    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.474439    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 16:02:25 crc kubenswrapper[4730]: W0320 16:02:25.501790    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9bfb6c0_4971_4a58_aacc_17636a95b8a4.slice/crio-26008e0cde0b10d5cf635c86bad21a3a5eba1afc3f588c8674fe357301fcf0ca WatchSource:0}: Error finding container 26008e0cde0b10d5cf635c86bad21a3a5eba1afc3f588c8674fe357301fcf0ca: Status 404 returned error can't find the container with id 26008e0cde0b10d5cf635c86bad21a3a5eba1afc3f588c8674fe357301fcf0ca
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.505467    4730 scope.go:117] "RemoveContainer" containerID="605dab74894b941f8dc47934657056d0f993430ffbd34d8c7cb723a73fb546d4"
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.517328    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.531265    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.578798    4730 scope.go:117] "RemoveContainer" containerID="077bbf75eaae181c0e1146dfa3d57949104bdbba5a29e133956ce977816562e9"
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.581240    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23319f08-4294-49cc-bb24-7a01520e37c6" path="/var/lib/kubelet/pods/23319f08-4294-49cc-bb24-7a01520e37c6/volumes"
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.582160    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="585eb246-c6cd-4641-a7fb-86d2ef87e31e" path="/var/lib/kubelet/pods/585eb246-c6cd-4641-a7fb-86d2ef87e31e/volumes"
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.582776    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:02:25 crc kubenswrapper[4730]: E0320 16:02:25.583105    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="585eb246-c6cd-4641-a7fb-86d2ef87e31e" containerName="nova-api-api"
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.583121    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="585eb246-c6cd-4641-a7fb-86d2ef87e31e" containerName="nova-api-api"
Mar 20 16:02:25 crc kubenswrapper[4730]: E0320 16:02:25.583145    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="585eb246-c6cd-4641-a7fb-86d2ef87e31e" containerName="nova-api-log"
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.583152    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="585eb246-c6cd-4641-a7fb-86d2ef87e31e" containerName="nova-api-log"
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.583396    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="585eb246-c6cd-4641-a7fb-86d2ef87e31e" containerName="nova-api-log"
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.583428    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="585eb246-c6cd-4641-a7fb-86d2ef87e31e" containerName="nova-api-api"
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.584436    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 16:02:25 crc kubenswrapper[4730]: E0320 16:02:25.584942    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"077bbf75eaae181c0e1146dfa3d57949104bdbba5a29e133956ce977816562e9\": container with ID starting with 077bbf75eaae181c0e1146dfa3d57949104bdbba5a29e133956ce977816562e9 not found: ID does not exist" containerID="077bbf75eaae181c0e1146dfa3d57949104bdbba5a29e133956ce977816562e9"
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.584995    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"077bbf75eaae181c0e1146dfa3d57949104bdbba5a29e133956ce977816562e9"} err="failed to get container status \"077bbf75eaae181c0e1146dfa3d57949104bdbba5a29e133956ce977816562e9\": rpc error: code = NotFound desc = could not find container \"077bbf75eaae181c0e1146dfa3d57949104bdbba5a29e133956ce977816562e9\": container with ID starting with 077bbf75eaae181c0e1146dfa3d57949104bdbba5a29e133956ce977816562e9 not found: ID does not exist"
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.585026    4730 scope.go:117] "RemoveContainer" containerID="605dab74894b941f8dc47934657056d0f993430ffbd34d8c7cb723a73fb546d4"
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.586756    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 20 16:02:25 crc kubenswrapper[4730]: E0320 16:02:25.588386    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"605dab74894b941f8dc47934657056d0f993430ffbd34d8c7cb723a73fb546d4\": container with ID starting with 605dab74894b941f8dc47934657056d0f993430ffbd34d8c7cb723a73fb546d4 not found: ID does not exist" containerID="605dab74894b941f8dc47934657056d0f993430ffbd34d8c7cb723a73fb546d4"
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.588470    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"605dab74894b941f8dc47934657056d0f993430ffbd34d8c7cb723a73fb546d4"} err="failed to get container status \"605dab74894b941f8dc47934657056d0f993430ffbd34d8c7cb723a73fb546d4\": rpc error: code = NotFound desc = could not find container \"605dab74894b941f8dc47934657056d0f993430ffbd34d8c7cb723a73fb546d4\": container with ID starting with 605dab74894b941f8dc47934657056d0f993430ffbd34d8c7cb723a73fb546d4 not found: ID does not exist"
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.590341    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.719639    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm4nm\" (UniqueName: \"kubernetes.io/projected/fc78da88-5699-44ed-af14-7627ea6191f9-kube-api-access-gm4nm\") pod \"nova-api-0\" (UID: \"fc78da88-5699-44ed-af14-7627ea6191f9\") " pod="openstack/nova-api-0"
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.719971    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc78da88-5699-44ed-af14-7627ea6191f9-logs\") pod \"nova-api-0\" (UID: \"fc78da88-5699-44ed-af14-7627ea6191f9\") " pod="openstack/nova-api-0"
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.720001    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc78da88-5699-44ed-af14-7627ea6191f9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fc78da88-5699-44ed-af14-7627ea6191f9\") " pod="openstack/nova-api-0"
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.720042    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc78da88-5699-44ed-af14-7627ea6191f9-config-data\") pod \"nova-api-0\" (UID: \"fc78da88-5699-44ed-af14-7627ea6191f9\") " pod="openstack/nova-api-0"
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.767655    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0"
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.821800    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm4nm\" (UniqueName: \"kubernetes.io/projected/fc78da88-5699-44ed-af14-7627ea6191f9-kube-api-access-gm4nm\") pod \"nova-api-0\" (UID: \"fc78da88-5699-44ed-af14-7627ea6191f9\") " pod="openstack/nova-api-0"
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.821855    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc78da88-5699-44ed-af14-7627ea6191f9-logs\") pod \"nova-api-0\" (UID: \"fc78da88-5699-44ed-af14-7627ea6191f9\") " pod="openstack/nova-api-0"
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.821886    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc78da88-5699-44ed-af14-7627ea6191f9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fc78da88-5699-44ed-af14-7627ea6191f9\") " pod="openstack/nova-api-0"
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.821935    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc78da88-5699-44ed-af14-7627ea6191f9-config-data\") pod \"nova-api-0\" (UID: \"fc78da88-5699-44ed-af14-7627ea6191f9\") " pod="openstack/nova-api-0"
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.822432    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc78da88-5699-44ed-af14-7627ea6191f9-logs\") pod \"nova-api-0\" (UID: \"fc78da88-5699-44ed-af14-7627ea6191f9\") " pod="openstack/nova-api-0"
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.827810    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc78da88-5699-44ed-af14-7627ea6191f9-config-data\") pod \"nova-api-0\" (UID: \"fc78da88-5699-44ed-af14-7627ea6191f9\") " pod="openstack/nova-api-0"
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.827889    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc78da88-5699-44ed-af14-7627ea6191f9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fc78da88-5699-44ed-af14-7627ea6191f9\") " pod="openstack/nova-api-0"
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.839446    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm4nm\" (UniqueName: \"kubernetes.io/projected/fc78da88-5699-44ed-af14-7627ea6191f9-kube-api-access-gm4nm\") pod \"nova-api-0\" (UID: \"fc78da88-5699-44ed-af14-7627ea6191f9\") " pod="openstack/nova-api-0"
Mar 20 16:02:25 crc kubenswrapper[4730]: I0320 16:02:25.916811    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 16:02:26 crc kubenswrapper[4730]: I0320 16:02:26.184067    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:02:26 crc kubenswrapper[4730]: W0320 16:02:26.185508    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc78da88_5699_44ed_af14_7627ea6191f9.slice/crio-bcfa21becb5d8093cfdbab459412302a96e0a7cd61a8f2880830161d043b5f7d WatchSource:0}: Error finding container bcfa21becb5d8093cfdbab459412302a96e0a7cd61a8f2880830161d043b5f7d: Status 404 returned error can't find the container with id bcfa21becb5d8093cfdbab459412302a96e0a7cd61a8f2880830161d043b5f7d
Mar 20 16:02:26 crc kubenswrapper[4730]: I0320 16:02:26.436837    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c9bfb6c0-4971-4a58-aacc-17636a95b8a4","Type":"ContainerStarted","Data":"4ea91441433de2c9d6a90ef65cf2c1e62d0cba102e976dcb6c1e329ad4173188"}
Mar 20 16:02:26 crc kubenswrapper[4730]: I0320 16:02:26.436880    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c9bfb6c0-4971-4a58-aacc-17636a95b8a4","Type":"ContainerStarted","Data":"a81bd5c7f3aa448b067be4239c5895b4e08b8ed1cbb446ef78371ac5389a73a9"}
Mar 20 16:02:26 crc kubenswrapper[4730]: I0320 16:02:26.436890    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c9bfb6c0-4971-4a58-aacc-17636a95b8a4","Type":"ContainerStarted","Data":"26008e0cde0b10d5cf635c86bad21a3a5eba1afc3f588c8674fe357301fcf0ca"}
Mar 20 16:02:26 crc kubenswrapper[4730]: I0320 16:02:26.440613    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fc78da88-5699-44ed-af14-7627ea6191f9","Type":"ContainerStarted","Data":"42f9304ba2a3640bc2a608f72633df51d75830859b8b512c72f883ee6ebbc449"}
Mar 20 16:02:26 crc kubenswrapper[4730]: I0320 16:02:26.440653    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fc78da88-5699-44ed-af14-7627ea6191f9","Type":"ContainerStarted","Data":"bcfa21becb5d8093cfdbab459412302a96e0a7cd61a8f2880830161d043b5f7d"}
Mar 20 16:02:26 crc kubenswrapper[4730]: I0320 16:02:26.442039    4730 generic.go:334] "Generic (PLEG): container finished" podID="ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647" containerID="78840c5174380df1ec37853cf5867820b6c6e093e27294e16eb3795a80e9c2a8" exitCode=0
Mar 20 16:02:26 crc kubenswrapper[4730]: I0320 16:02:26.442079    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-cdhdz" event={"ID":"ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647","Type":"ContainerDied","Data":"78840c5174380df1ec37853cf5867820b6c6e093e27294e16eb3795a80e9c2a8"}
Mar 20 16:02:26 crc kubenswrapper[4730]: I0320 16:02:26.467673    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.46765321 podStartE2EDuration="2.46765321s" podCreationTimestamp="2026-03-20 16:02:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:02:26.460257348 +0000 UTC m=+1405.673628727" watchObservedRunningTime="2026-03-20 16:02:26.46765321 +0000 UTC m=+1405.681024569"
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.127173    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.260743    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01c93007-4d31-47a2-810c-819caf917e43-combined-ca-bundle\") pod \"01c93007-4d31-47a2-810c-819caf917e43\" (UID: \"01c93007-4d31-47a2-810c-819caf917e43\") "
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.260972    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlxxq\" (UniqueName: \"kubernetes.io/projected/01c93007-4d31-47a2-810c-819caf917e43-kube-api-access-nlxxq\") pod \"01c93007-4d31-47a2-810c-819caf917e43\" (UID: \"01c93007-4d31-47a2-810c-819caf917e43\") "
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.261073    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01c93007-4d31-47a2-810c-819caf917e43-config-data\") pod \"01c93007-4d31-47a2-810c-819caf917e43\" (UID: \"01c93007-4d31-47a2-810c-819caf917e43\") "
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.273633    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01c93007-4d31-47a2-810c-819caf917e43-kube-api-access-nlxxq" (OuterVolumeSpecName: "kube-api-access-nlxxq") pod "01c93007-4d31-47a2-810c-819caf917e43" (UID: "01c93007-4d31-47a2-810c-819caf917e43"). InnerVolumeSpecName "kube-api-access-nlxxq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.293573    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01c93007-4d31-47a2-810c-819caf917e43-config-data" (OuterVolumeSpecName: "config-data") pod "01c93007-4d31-47a2-810c-819caf917e43" (UID: "01c93007-4d31-47a2-810c-819caf917e43"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.297014    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01c93007-4d31-47a2-810c-819caf917e43-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01c93007-4d31-47a2-810c-819caf917e43" (UID: "01c93007-4d31-47a2-810c-819caf917e43"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.363487    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01c93007-4d31-47a2-810c-819caf917e43-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.363530    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlxxq\" (UniqueName: \"kubernetes.io/projected/01c93007-4d31-47a2-810c-819caf917e43-kube-api-access-nlxxq\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.363547    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01c93007-4d31-47a2-810c-819caf917e43-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.454342    4730 generic.go:334] "Generic (PLEG): container finished" podID="01c93007-4d31-47a2-810c-819caf917e43" containerID="68063e072de62260e0731b15d4d132f06b3322eede18b910d69b064464dffc8d" exitCode=0
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.454418    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"01c93007-4d31-47a2-810c-819caf917e43","Type":"ContainerDied","Data":"68063e072de62260e0731b15d4d132f06b3322eede18b910d69b064464dffc8d"}
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.454451    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"01c93007-4d31-47a2-810c-819caf917e43","Type":"ContainerDied","Data":"3250edd54dccb12be75e701aadcaf4d148acb76b38a455ec3944d6f9f7d0f0e8"}
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.454473    4730 scope.go:117] "RemoveContainer" containerID="68063e072de62260e0731b15d4d132f06b3322eede18b910d69b064464dffc8d"
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.454615    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.461570    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fc78da88-5699-44ed-af14-7627ea6191f9","Type":"ContainerStarted","Data":"b7766e5f23e174441651086dc567c68965531e447de145ebeb0b1414c28ef4e6"}
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.488683    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.488661348 podStartE2EDuration="2.488661348s" podCreationTimestamp="2026-03-20 16:02:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:02:27.486496856 +0000 UTC m=+1406.699868225" watchObservedRunningTime="2026-03-20 16:02:27.488661348 +0000 UTC m=+1406.702032717"
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.491783    4730 scope.go:117] "RemoveContainer" containerID="68063e072de62260e0731b15d4d132f06b3322eede18b910d69b064464dffc8d"
Mar 20 16:02:27 crc kubenswrapper[4730]: E0320 16:02:27.492687    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68063e072de62260e0731b15d4d132f06b3322eede18b910d69b064464dffc8d\": container with ID starting with 68063e072de62260e0731b15d4d132f06b3322eede18b910d69b064464dffc8d not found: ID does not exist" containerID="68063e072de62260e0731b15d4d132f06b3322eede18b910d69b064464dffc8d"
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.492953    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68063e072de62260e0731b15d4d132f06b3322eede18b910d69b064464dffc8d"} err="failed to get container status \"68063e072de62260e0731b15d4d132f06b3322eede18b910d69b064464dffc8d\": rpc error: code = NotFound desc = could not find container \"68063e072de62260e0731b15d4d132f06b3322eede18b910d69b064464dffc8d\": container with ID starting with 68063e072de62260e0731b15d4d132f06b3322eede18b910d69b064464dffc8d not found: ID does not exist"
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.518195    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.542522    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.563502    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 16:02:27 crc kubenswrapper[4730]: E0320 16:02:27.564218    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01c93007-4d31-47a2-810c-819caf917e43" containerName="nova-scheduler-scheduler"
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.564376    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="01c93007-4d31-47a2-810c-819caf917e43" containerName="nova-scheduler-scheduler"
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.564639    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="01c93007-4d31-47a2-810c-819caf917e43" containerName="nova-scheduler-scheduler"
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.565438    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.569896    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.584611    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.671099    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edc02b8-2451-4edf-a79d-fc86a078de83-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3edc02b8-2451-4edf-a79d-fc86a078de83\") " pod="openstack/nova-scheduler-0"
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.671157    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrnlf\" (UniqueName: \"kubernetes.io/projected/3edc02b8-2451-4edf-a79d-fc86a078de83-kube-api-access-qrnlf\") pod \"nova-scheduler-0\" (UID: \"3edc02b8-2451-4edf-a79d-fc86a078de83\") " pod="openstack/nova-scheduler-0"
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.671271    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3edc02b8-2451-4edf-a79d-fc86a078de83-config-data\") pod \"nova-scheduler-0\" (UID: \"3edc02b8-2451-4edf-a79d-fc86a078de83\") " pod="openstack/nova-scheduler-0"
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.771058    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-cdhdz"
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.773064    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edc02b8-2451-4edf-a79d-fc86a078de83-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3edc02b8-2451-4edf-a79d-fc86a078de83\") " pod="openstack/nova-scheduler-0"
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.773115    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrnlf\" (UniqueName: \"kubernetes.io/projected/3edc02b8-2451-4edf-a79d-fc86a078de83-kube-api-access-qrnlf\") pod \"nova-scheduler-0\" (UID: \"3edc02b8-2451-4edf-a79d-fc86a078de83\") " pod="openstack/nova-scheduler-0"
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.773183    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3edc02b8-2451-4edf-a79d-fc86a078de83-config-data\") pod \"nova-scheduler-0\" (UID: \"3edc02b8-2451-4edf-a79d-fc86a078de83\") " pod="openstack/nova-scheduler-0"
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.777133    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edc02b8-2451-4edf-a79d-fc86a078de83-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3edc02b8-2451-4edf-a79d-fc86a078de83\") " pod="openstack/nova-scheduler-0"
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.781699    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3edc02b8-2451-4edf-a79d-fc86a078de83-config-data\") pod \"nova-scheduler-0\" (UID: \"3edc02b8-2451-4edf-a79d-fc86a078de83\") " pod="openstack/nova-scheduler-0"
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.790004    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrnlf\" (UniqueName: \"kubernetes.io/projected/3edc02b8-2451-4edf-a79d-fc86a078de83-kube-api-access-qrnlf\") pod \"nova-scheduler-0\" (UID: \"3edc02b8-2451-4edf-a79d-fc86a078de83\") " pod="openstack/nova-scheduler-0"
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.874287    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647-combined-ca-bundle\") pod \"ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647\" (UID: \"ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647\") "
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.874435    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnxln\" (UniqueName: \"kubernetes.io/projected/ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647-kube-api-access-jnxln\") pod \"ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647\" (UID: \"ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647\") "
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.874490    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647-config-data\") pod \"ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647\" (UID: \"ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647\") "
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.874876    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647-scripts\") pod \"ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647\" (UID: \"ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647\") "
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.878609    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647-kube-api-access-jnxln" (OuterVolumeSpecName: "kube-api-access-jnxln") pod "ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647" (UID: "ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647"). InnerVolumeSpecName "kube-api-access-jnxln". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.879208    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647-scripts" (OuterVolumeSpecName: "scripts") pod "ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647" (UID: "ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.889995    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.902931    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647-config-data" (OuterVolumeSpecName: "config-data") pod "ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647" (UID: "ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.903521    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647" (UID: "ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.980590    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnxln\" (UniqueName: \"kubernetes.io/projected/ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647-kube-api-access-jnxln\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.980617    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.980627    4730 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:27 crc kubenswrapper[4730]: I0320 16:02:27.980635    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:28 crc kubenswrapper[4730]: W0320 16:02:28.360223    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3edc02b8_2451_4edf_a79d_fc86a078de83.slice/crio-b529fd31ef154a115120a0df54b6ded5ddefe9a8e3d6d2b8b099bc3bd1c28517 WatchSource:0}: Error finding container b529fd31ef154a115120a0df54b6ded5ddefe9a8e3d6d2b8b099bc3bd1c28517: Status 404 returned error can't find the container with id b529fd31ef154a115120a0df54b6ded5ddefe9a8e3d6d2b8b099bc3bd1c28517
Mar 20 16:02:28 crc kubenswrapper[4730]: I0320 16:02:28.365431    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 16:02:28 crc kubenswrapper[4730]: I0320 16:02:28.475382    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-cdhdz" event={"ID":"ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647","Type":"ContainerDied","Data":"d9f3d982f9ee881581e688def58986a8c7c62dacb4325c11bdceadbe4b6ffa1a"}
Mar 20 16:02:28 crc kubenswrapper[4730]: I0320 16:02:28.475449    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9f3d982f9ee881581e688def58986a8c7c62dacb4325c11bdceadbe4b6ffa1a"
Mar 20 16:02:28 crc kubenswrapper[4730]: I0320 16:02:28.475416    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-cdhdz"
Mar 20 16:02:28 crc kubenswrapper[4730]: I0320 16:02:28.477427    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3edc02b8-2451-4edf-a79d-fc86a078de83","Type":"ContainerStarted","Data":"b529fd31ef154a115120a0df54b6ded5ddefe9a8e3d6d2b8b099bc3bd1c28517"}
Mar 20 16:02:28 crc kubenswrapper[4730]: I0320 16:02:28.581816    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 20 16:02:28 crc kubenswrapper[4730]: E0320 16:02:28.587821    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647" containerName="nova-cell1-conductor-db-sync"
Mar 20 16:02:28 crc kubenswrapper[4730]: I0320 16:02:28.587867    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647" containerName="nova-cell1-conductor-db-sync"
Mar 20 16:02:28 crc kubenswrapper[4730]: I0320 16:02:28.588199    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647" containerName="nova-cell1-conductor-db-sync"
Mar 20 16:02:28 crc kubenswrapper[4730]: I0320 16:02:28.588931    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 20 16:02:28 crc kubenswrapper[4730]: I0320 16:02:28.594047    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 20 16:02:28 crc kubenswrapper[4730]: I0320 16:02:28.599615    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 20 16:02:28 crc kubenswrapper[4730]: I0320 16:02:28.623342    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0"
Mar 20 16:02:28 crc kubenswrapper[4730]: I0320 16:02:28.639870    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0"
Mar 20 16:02:28 crc kubenswrapper[4730]: I0320 16:02:28.697326    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e788dfbe-bc18-46f9-b2bf-674940e1c392-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e788dfbe-bc18-46f9-b2bf-674940e1c392\") " pod="openstack/nova-cell1-conductor-0"
Mar 20 16:02:28 crc kubenswrapper[4730]: I0320 16:02:28.697933    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e788dfbe-bc18-46f9-b2bf-674940e1c392-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e788dfbe-bc18-46f9-b2bf-674940e1c392\") " pod="openstack/nova-cell1-conductor-0"
Mar 20 16:02:28 crc kubenswrapper[4730]: I0320 16:02:28.698380    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsjnl\" (UniqueName: \"kubernetes.io/projected/e788dfbe-bc18-46f9-b2bf-674940e1c392-kube-api-access-gsjnl\") pod \"nova-cell1-conductor-0\" (UID: \"e788dfbe-bc18-46f9-b2bf-674940e1c392\") " pod="openstack/nova-cell1-conductor-0"
Mar 20 16:02:28 crc kubenswrapper[4730]: I0320 16:02:28.800162    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e788dfbe-bc18-46f9-b2bf-674940e1c392-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e788dfbe-bc18-46f9-b2bf-674940e1c392\") " pod="openstack/nova-cell1-conductor-0"
Mar 20 16:02:28 crc kubenswrapper[4730]: I0320 16:02:28.800384    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsjnl\" (UniqueName: \"kubernetes.io/projected/e788dfbe-bc18-46f9-b2bf-674940e1c392-kube-api-access-gsjnl\") pod \"nova-cell1-conductor-0\" (UID: \"e788dfbe-bc18-46f9-b2bf-674940e1c392\") " pod="openstack/nova-cell1-conductor-0"
Mar 20 16:02:28 crc kubenswrapper[4730]: I0320 16:02:28.800438    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e788dfbe-bc18-46f9-b2bf-674940e1c392-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e788dfbe-bc18-46f9-b2bf-674940e1c392\") " pod="openstack/nova-cell1-conductor-0"
Mar 20 16:02:28 crc kubenswrapper[4730]: I0320 16:02:28.804695    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e788dfbe-bc18-46f9-b2bf-674940e1c392-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e788dfbe-bc18-46f9-b2bf-674940e1c392\") " pod="openstack/nova-cell1-conductor-0"
Mar 20 16:02:28 crc kubenswrapper[4730]: I0320 16:02:28.806043    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e788dfbe-bc18-46f9-b2bf-674940e1c392-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e788dfbe-bc18-46f9-b2bf-674940e1c392\") " pod="openstack/nova-cell1-conductor-0"
Mar 20 16:02:28 crc kubenswrapper[4730]: I0320 16:02:28.820650    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsjnl\" (UniqueName: \"kubernetes.io/projected/e788dfbe-bc18-46f9-b2bf-674940e1c392-kube-api-access-gsjnl\") pod \"nova-cell1-conductor-0\" (UID: \"e788dfbe-bc18-46f9-b2bf-674940e1c392\") " pod="openstack/nova-cell1-conductor-0"
Mar 20 16:02:28 crc kubenswrapper[4730]: I0320 16:02:28.918583    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 20 16:02:29 crc kubenswrapper[4730]: I0320 16:02:29.365401    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 20 16:02:29 crc kubenswrapper[4730]: I0320 16:02:29.509506    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e788dfbe-bc18-46f9-b2bf-674940e1c392","Type":"ContainerStarted","Data":"157a6ca1ded6c375598cd97c6882947a4130ebeb518e793e9a94ebfd5f9212a3"}
Mar 20 16:02:29 crc kubenswrapper[4730]: I0320 16:02:29.513407    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3edc02b8-2451-4edf-a79d-fc86a078de83","Type":"ContainerStarted","Data":"f01bb1ce3fdb271555994504b12501ddebf5c6f44d1ae92e8481bfe809a7796f"}
Mar 20 16:02:29 crc kubenswrapper[4730]: I0320 16:02:29.525886    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Mar 20 16:02:29 crc kubenswrapper[4730]: I0320 16:02:29.541234    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.541215639 podStartE2EDuration="2.541215639s" podCreationTimestamp="2026-03-20 16:02:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:02:29.531643664 +0000 UTC m=+1408.745015043" watchObservedRunningTime="2026-03-20 16:02:29.541215639 +0000 UTC m=+1408.754587008"
Mar 20 16:02:29 crc kubenswrapper[4730]: I0320 16:02:29.548564    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01c93007-4d31-47a2-810c-819caf917e43" path="/var/lib/kubelet/pods/01c93007-4d31-47a2-810c-819caf917e43/volumes"
Mar 20 16:02:29 crc kubenswrapper[4730]: E0320 16:02:29.976065    4730 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ed4f7daeff4a5766b3f240bd0c76702ab4bff0db1848809b07121ff30e3a5bfc" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"]
Mar 20 16:02:29 crc kubenswrapper[4730]: E0320 16:02:29.977491    4730 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ed4f7daeff4a5766b3f240bd0c76702ab4bff0db1848809b07121ff30e3a5bfc" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"]
Mar 20 16:02:29 crc kubenswrapper[4730]: E0320 16:02:29.978479    4730 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ed4f7daeff4a5766b3f240bd0c76702ab4bff0db1848809b07121ff30e3a5bfc" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"]
Mar 20 16:02:29 crc kubenswrapper[4730]: E0320 16:02:29.978516    4730 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-decision-engine-0" podUID="3f6c808e-d523-48bd-8ec2-28b625834317" containerName="watcher-decision-engine"
Mar 20 16:02:30 crc kubenswrapper[4730]: I0320 16:02:30.523637    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e788dfbe-bc18-46f9-b2bf-674940e1c392","Type":"ContainerStarted","Data":"274814869f12b52bd4d369d60b217bb93eba1b6cde19af52b744d15dbc1c07a7"}
Mar 20 16:02:30 crc kubenswrapper[4730]: I0320 16:02:30.548100    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.548079832 podStartE2EDuration="2.548079832s" podCreationTimestamp="2026-03-20 16:02:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:02:30.537861978 +0000 UTC m=+1409.751233357" watchObservedRunningTime="2026-03-20 16:02:30.548079832 +0000 UTC m=+1409.761451211"
Mar 20 16:02:30 crc kubenswrapper[4730]: I0320 16:02:30.768202    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0"
Mar 20 16:02:30 crc kubenswrapper[4730]: I0320 16:02:30.795629    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0"
Mar 20 16:02:31 crc kubenswrapper[4730]: E0320 16:02:31.394492    4730 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod279d2368_abe1_465a_9007_68542e5dbfc4.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod279d2368_abe1_465a_9007_68542e5dbfc4.slice/crio-da2c214fcdb33fae608237a0fb1d01559481f0b67a4afa1e6c930298a64b75a2\": RecentStats: unable to find data in memory cache]"
Mar 20 16:02:31 crc kubenswrapper[4730]: I0320 16:02:31.566886    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Mar 20 16:02:31 crc kubenswrapper[4730]: I0320 16:02:31.568966    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0"
Mar 20 16:02:32 crc kubenswrapper[4730]: I0320 16:02:32.891633    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 20 16:02:34 crc kubenswrapper[4730]: I0320 16:02:34.948216    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 20 16:02:34 crc kubenswrapper[4730]: I0320 16:02:34.948765    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 20 16:02:35 crc kubenswrapper[4730]: I0320 16:02:35.917162    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 20 16:02:35 crc kubenswrapper[4730]: I0320 16:02:35.917207    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 20 16:02:35 crc kubenswrapper[4730]: I0320 16:02:35.957911    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c9bfb6c0-4971-4a58-aacc-17636a95b8a4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.214:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 20 16:02:35 crc kubenswrapper[4730]: I0320 16:02:35.966564    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c9bfb6c0-4971-4a58-aacc-17636a95b8a4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.214:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 20 16:02:37 crc kubenswrapper[4730]: I0320 16:02:37.001437    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="fc78da88-5699-44ed-af14-7627ea6191f9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.215:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 20 16:02:37 crc kubenswrapper[4730]: I0320 16:02:37.001534    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="fc78da88-5699-44ed-af14-7627ea6191f9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.215:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 20 16:02:37 crc kubenswrapper[4730]: I0320 16:02:37.890967    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 20 16:02:37 crc kubenswrapper[4730]: I0320 16:02:37.939599    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 20 16:02:38 crc kubenswrapper[4730]: I0320 16:02:38.651469    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 20 16:02:38 crc kubenswrapper[4730]: I0320 16:02:38.953487    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Mar 20 16:02:41 crc kubenswrapper[4730]: E0320 16:02:41.631162    4730 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod279d2368_abe1_465a_9007_68542e5dbfc4.slice/crio-da2c214fcdb33fae608237a0fb1d01559481f0b67a4afa1e6c930298a64b75a2\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod279d2368_abe1_465a_9007_68542e5dbfc4.slice\": RecentStats: unable to find data in memory cache]"
Mar 20 16:02:42 crc kubenswrapper[4730]: I0320 16:02:42.949395    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 20 16:02:42 crc kubenswrapper[4730]: I0320 16:02:42.950809    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.405771    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.518222    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3f6c808e-d523-48bd-8ec2-28b625834317-custom-prometheus-ca\") pod \"3f6c808e-d523-48bd-8ec2-28b625834317\" (UID: \"3f6c808e-d523-48bd-8ec2-28b625834317\") "
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.518366    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f6c808e-d523-48bd-8ec2-28b625834317-logs\") pod \"3f6c808e-d523-48bd-8ec2-28b625834317\" (UID: \"3f6c808e-d523-48bd-8ec2-28b625834317\") "
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.518404    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f6c808e-d523-48bd-8ec2-28b625834317-config-data\") pod \"3f6c808e-d523-48bd-8ec2-28b625834317\" (UID: \"3f6c808e-d523-48bd-8ec2-28b625834317\") "
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.518477    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f6c808e-d523-48bd-8ec2-28b625834317-combined-ca-bundle\") pod \"3f6c808e-d523-48bd-8ec2-28b625834317\" (UID: \"3f6c808e-d523-48bd-8ec2-28b625834317\") "
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.518674    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z2tj\" (UniqueName: \"kubernetes.io/projected/3f6c808e-d523-48bd-8ec2-28b625834317-kube-api-access-2z2tj\") pod \"3f6c808e-d523-48bd-8ec2-28b625834317\" (UID: \"3f6c808e-d523-48bd-8ec2-28b625834317\") "
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.518776    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f6c808e-d523-48bd-8ec2-28b625834317-logs" (OuterVolumeSpecName: "logs") pod "3f6c808e-d523-48bd-8ec2-28b625834317" (UID: "3f6c808e-d523-48bd-8ec2-28b625834317"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.519207    4730 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f6c808e-d523-48bd-8ec2-28b625834317-logs\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.526221    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f6c808e-d523-48bd-8ec2-28b625834317-kube-api-access-2z2tj" (OuterVolumeSpecName: "kube-api-access-2z2tj") pod "3f6c808e-d523-48bd-8ec2-28b625834317" (UID: "3f6c808e-d523-48bd-8ec2-28b625834317"). InnerVolumeSpecName "kube-api-access-2z2tj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.551298    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f6c808e-d523-48bd-8ec2-28b625834317-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "3f6c808e-d523-48bd-8ec2-28b625834317" (UID: "3f6c808e-d523-48bd-8ec2-28b625834317"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.561967    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f6c808e-d523-48bd-8ec2-28b625834317-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f6c808e-d523-48bd-8ec2-28b625834317" (UID: "3f6c808e-d523-48bd-8ec2-28b625834317"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.583981    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f6c808e-d523-48bd-8ec2-28b625834317-config-data" (OuterVolumeSpecName: "config-data") pod "3f6c808e-d523-48bd-8ec2-28b625834317" (UID: "3f6c808e-d523-48bd-8ec2-28b625834317"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.624137    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z2tj\" (UniqueName: \"kubernetes.io/projected/3f6c808e-d523-48bd-8ec2-28b625834317-kube-api-access-2z2tj\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.624173    4730 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3f6c808e-d523-48bd-8ec2-28b625834317-custom-prometheus-ca\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.624185    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f6c808e-d523-48bd-8ec2-28b625834317-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.624196    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f6c808e-d523-48bd-8ec2-28b625834317-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.680170    4730 generic.go:334] "Generic (PLEG): container finished" podID="3f6c808e-d523-48bd-8ec2-28b625834317" containerID="ed4f7daeff4a5766b3f240bd0c76702ab4bff0db1848809b07121ff30e3a5bfc" exitCode=137
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.680262    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.680278    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3f6c808e-d523-48bd-8ec2-28b625834317","Type":"ContainerDied","Data":"ed4f7daeff4a5766b3f240bd0c76702ab4bff0db1848809b07121ff30e3a5bfc"}
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.680330    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3f6c808e-d523-48bd-8ec2-28b625834317","Type":"ContainerDied","Data":"d03f482cee6be89afdaaa20261b8838840560db337ea7e22e3e773497fe55805"}
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.680350    4730 scope.go:117] "RemoveContainer" containerID="ed4f7daeff4a5766b3f240bd0c76702ab4bff0db1848809b07121ff30e3a5bfc"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.699077    4730 scope.go:117] "RemoveContainer" containerID="ca3ac5b513d25322badcc2bf19b245d687c9ccf8bff6c35cf5794c95ec2ab964"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.724587    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"]
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.742194    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"]
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.747169    4730 scope.go:117] "RemoveContainer" containerID="ed4f7daeff4a5766b3f240bd0c76702ab4bff0db1848809b07121ff30e3a5bfc"
Mar 20 16:02:43 crc kubenswrapper[4730]: E0320 16:02:43.747571    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed4f7daeff4a5766b3f240bd0c76702ab4bff0db1848809b07121ff30e3a5bfc\": container with ID starting with ed4f7daeff4a5766b3f240bd0c76702ab4bff0db1848809b07121ff30e3a5bfc not found: ID does not exist" containerID="ed4f7daeff4a5766b3f240bd0c76702ab4bff0db1848809b07121ff30e3a5bfc"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.747609    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed4f7daeff4a5766b3f240bd0c76702ab4bff0db1848809b07121ff30e3a5bfc"} err="failed to get container status \"ed4f7daeff4a5766b3f240bd0c76702ab4bff0db1848809b07121ff30e3a5bfc\": rpc error: code = NotFound desc = could not find container \"ed4f7daeff4a5766b3f240bd0c76702ab4bff0db1848809b07121ff30e3a5bfc\": container with ID starting with ed4f7daeff4a5766b3f240bd0c76702ab4bff0db1848809b07121ff30e3a5bfc not found: ID does not exist"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.747636    4730 scope.go:117] "RemoveContainer" containerID="ca3ac5b513d25322badcc2bf19b245d687c9ccf8bff6c35cf5794c95ec2ab964"
Mar 20 16:02:43 crc kubenswrapper[4730]: E0320 16:02:43.747860    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca3ac5b513d25322badcc2bf19b245d687c9ccf8bff6c35cf5794c95ec2ab964\": container with ID starting with ca3ac5b513d25322badcc2bf19b245d687c9ccf8bff6c35cf5794c95ec2ab964 not found: ID does not exist" containerID="ca3ac5b513d25322badcc2bf19b245d687c9ccf8bff6c35cf5794c95ec2ab964"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.747888    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca3ac5b513d25322badcc2bf19b245d687c9ccf8bff6c35cf5794c95ec2ab964"} err="failed to get container status \"ca3ac5b513d25322badcc2bf19b245d687c9ccf8bff6c35cf5794c95ec2ab964\": rpc error: code = NotFound desc = could not find container \"ca3ac5b513d25322badcc2bf19b245d687c9ccf8bff6c35cf5794c95ec2ab964\": container with ID starting with ca3ac5b513d25322badcc2bf19b245d687c9ccf8bff6c35cf5794c95ec2ab964 not found: ID does not exist"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.752822    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"]
Mar 20 16:02:43 crc kubenswrapper[4730]: E0320 16:02:43.753344    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f6c808e-d523-48bd-8ec2-28b625834317" containerName="watcher-decision-engine"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.753368    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f6c808e-d523-48bd-8ec2-28b625834317" containerName="watcher-decision-engine"
Mar 20 16:02:43 crc kubenswrapper[4730]: E0320 16:02:43.753380    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f6c808e-d523-48bd-8ec2-28b625834317" containerName="watcher-decision-engine"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.753386    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f6c808e-d523-48bd-8ec2-28b625834317" containerName="watcher-decision-engine"
Mar 20 16:02:43 crc kubenswrapper[4730]: E0320 16:02:43.753400    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f6c808e-d523-48bd-8ec2-28b625834317" containerName="watcher-decision-engine"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.753407    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f6c808e-d523-48bd-8ec2-28b625834317" containerName="watcher-decision-engine"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.753608    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f6c808e-d523-48bd-8ec2-28b625834317" containerName="watcher-decision-engine"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.753619    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f6c808e-d523-48bd-8ec2-28b625834317" containerName="watcher-decision-engine"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.753638    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f6c808e-d523-48bd-8ec2-28b625834317" containerName="watcher-decision-engine"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.753648    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f6c808e-d523-48bd-8ec2-28b625834317" containerName="watcher-decision-engine"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.754582    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.756291    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.762308    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"]
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.827945    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4adb002b-165b-4e7c-9e26-0a98f30dd467-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"4adb002b-165b-4e7c-9e26-0a98f30dd467\") " pod="openstack/watcher-decision-engine-0"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.828079    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4adb002b-165b-4e7c-9e26-0a98f30dd467-config-data\") pod \"watcher-decision-engine-0\" (UID: \"4adb002b-165b-4e7c-9e26-0a98f30dd467\") " pod="openstack/watcher-decision-engine-0"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.828136    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4adb002b-165b-4e7c-9e26-0a98f30dd467-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"4adb002b-165b-4e7c-9e26-0a98f30dd467\") " pod="openstack/watcher-decision-engine-0"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.828180    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4adb002b-165b-4e7c-9e26-0a98f30dd467-logs\") pod \"watcher-decision-engine-0\" (UID: \"4adb002b-165b-4e7c-9e26-0a98f30dd467\") " pod="openstack/watcher-decision-engine-0"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.828321    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jchf\" (UniqueName: \"kubernetes.io/projected/4adb002b-165b-4e7c-9e26-0a98f30dd467-kube-api-access-9jchf\") pod \"watcher-decision-engine-0\" (UID: \"4adb002b-165b-4e7c-9e26-0a98f30dd467\") " pod="openstack/watcher-decision-engine-0"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.916985    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.918377    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.930400    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jchf\" (UniqueName: \"kubernetes.io/projected/4adb002b-165b-4e7c-9e26-0a98f30dd467-kube-api-access-9jchf\") pod \"watcher-decision-engine-0\" (UID: \"4adb002b-165b-4e7c-9e26-0a98f30dd467\") " pod="openstack/watcher-decision-engine-0"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.930480    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4adb002b-165b-4e7c-9e26-0a98f30dd467-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"4adb002b-165b-4e7c-9e26-0a98f30dd467\") " pod="openstack/watcher-decision-engine-0"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.930541    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4adb002b-165b-4e7c-9e26-0a98f30dd467-config-data\") pod \"watcher-decision-engine-0\" (UID: \"4adb002b-165b-4e7c-9e26-0a98f30dd467\") " pod="openstack/watcher-decision-engine-0"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.930580    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4adb002b-165b-4e7c-9e26-0a98f30dd467-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"4adb002b-165b-4e7c-9e26-0a98f30dd467\") " pod="openstack/watcher-decision-engine-0"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.930609    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4adb002b-165b-4e7c-9e26-0a98f30dd467-logs\") pod \"watcher-decision-engine-0\" (UID: \"4adb002b-165b-4e7c-9e26-0a98f30dd467\") " pod="openstack/watcher-decision-engine-0"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.930998    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4adb002b-165b-4e7c-9e26-0a98f30dd467-logs\") pod \"watcher-decision-engine-0\" (UID: \"4adb002b-165b-4e7c-9e26-0a98f30dd467\") " pod="openstack/watcher-decision-engine-0"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.934841    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4adb002b-165b-4e7c-9e26-0a98f30dd467-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"4adb002b-165b-4e7c-9e26-0a98f30dd467\") " pod="openstack/watcher-decision-engine-0"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.935079    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4adb002b-165b-4e7c-9e26-0a98f30dd467-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"4adb002b-165b-4e7c-9e26-0a98f30dd467\") " pod="openstack/watcher-decision-engine-0"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.935397    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4adb002b-165b-4e7c-9e26-0a98f30dd467-config-data\") pod \"watcher-decision-engine-0\" (UID: \"4adb002b-165b-4e7c-9e26-0a98f30dd467\") " pod="openstack/watcher-decision-engine-0"
Mar 20 16:02:43 crc kubenswrapper[4730]: I0320 16:02:43.952519    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jchf\" (UniqueName: \"kubernetes.io/projected/4adb002b-165b-4e7c-9e26-0a98f30dd467-kube-api-access-9jchf\") pod \"watcher-decision-engine-0\" (UID: \"4adb002b-165b-4e7c-9e26-0a98f30dd467\") " pod="openstack/watcher-decision-engine-0"
Mar 20 16:02:44 crc kubenswrapper[4730]: I0320 16:02:44.078127    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Mar 20 16:02:44 crc kubenswrapper[4730]: I0320 16:02:44.671971    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"]
Mar 20 16:02:44 crc kubenswrapper[4730]: I0320 16:02:44.702763    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"4adb002b-165b-4e7c-9e26-0a98f30dd467","Type":"ContainerStarted","Data":"d108cf722bb9b9c48b9ba9b8abd0f35e60b17aab1e0179d47e3ead893c3f4026"}
Mar 20 16:02:44 crc kubenswrapper[4730]: I0320 16:02:44.807874    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 20 16:02:44 crc kubenswrapper[4730]: I0320 16:02:44.953955    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 20 16:02:44 crc kubenswrapper[4730]: I0320 16:02:44.955036    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 20 16:02:44 crc kubenswrapper[4730]: I0320 16:02:44.964726    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.528435    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.544604    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f6c808e-d523-48bd-8ec2-28b625834317" path="/var/lib/kubelet/pods/3f6c808e-d523-48bd-8ec2-28b625834317/volumes"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.560104    4730 scope.go:117] "RemoveContainer" containerID="2ec54c009b326db4c49da642b8ab1232405aacb430ead248fe894a34dfe7c452"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.572220    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f1bcc8c-7598-4c25-aaa7-0a9636c0729c-config-data\") pod \"1f1bcc8c-7598-4c25-aaa7-0a9636c0729c\" (UID: \"1f1bcc8c-7598-4c25-aaa7-0a9636c0729c\") "
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.572318    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f1bcc8c-7598-4c25-aaa7-0a9636c0729c-combined-ca-bundle\") pod \"1f1bcc8c-7598-4c25-aaa7-0a9636c0729c\" (UID: \"1f1bcc8c-7598-4c25-aaa7-0a9636c0729c\") "
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.572525    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmvzz\" (UniqueName: \"kubernetes.io/projected/1f1bcc8c-7598-4c25-aaa7-0a9636c0729c-kube-api-access-zmvzz\") pod \"1f1bcc8c-7598-4c25-aaa7-0a9636c0729c\" (UID: \"1f1bcc8c-7598-4c25-aaa7-0a9636c0729c\") "
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.602724    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f1bcc8c-7598-4c25-aaa7-0a9636c0729c-kube-api-access-zmvzz" (OuterVolumeSpecName: "kube-api-access-zmvzz") pod "1f1bcc8c-7598-4c25-aaa7-0a9636c0729c" (UID: "1f1bcc8c-7598-4c25-aaa7-0a9636c0729c"). InnerVolumeSpecName "kube-api-access-zmvzz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.619978    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f1bcc8c-7598-4c25-aaa7-0a9636c0729c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f1bcc8c-7598-4c25-aaa7-0a9636c0729c" (UID: "1f1bcc8c-7598-4c25-aaa7-0a9636c0729c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.640624    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f1bcc8c-7598-4c25-aaa7-0a9636c0729c-config-data" (OuterVolumeSpecName: "config-data") pod "1f1bcc8c-7598-4c25-aaa7-0a9636c0729c" (UID: "1f1bcc8c-7598-4c25-aaa7-0a9636c0729c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.674939    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmvzz\" (UniqueName: \"kubernetes.io/projected/1f1bcc8c-7598-4c25-aaa7-0a9636c0729c-kube-api-access-zmvzz\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.674990    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f1bcc8c-7598-4c25-aaa7-0a9636c0729c-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.675002    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f1bcc8c-7598-4c25-aaa7-0a9636c0729c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.721810    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"4adb002b-165b-4e7c-9e26-0a98f30dd467","Type":"ContainerStarted","Data":"37384daa758ca62f06b21e724f38767164da4ab8c9fc86eed747236b08d6fc3b"}
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.723420    4730 generic.go:334] "Generic (PLEG): container finished" podID="1f1bcc8c-7598-4c25-aaa7-0a9636c0729c" containerID="c51b7b65896e9d86f7a00a27c41da014a980db7c3cc6f6c5953fd8afdde0f897" exitCode=137
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.724563    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.725197    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1f1bcc8c-7598-4c25-aaa7-0a9636c0729c","Type":"ContainerDied","Data":"c51b7b65896e9d86f7a00a27c41da014a980db7c3cc6f6c5953fd8afdde0f897"}
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.725230    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1f1bcc8c-7598-4c25-aaa7-0a9636c0729c","Type":"ContainerDied","Data":"bb467f0f9108cd2d4075fdbdf95b9f10f944692ab74865e527713ebadb5491f5"}
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.725245    4730 scope.go:117] "RemoveContainer" containerID="c51b7b65896e9d86f7a00a27c41da014a980db7c3cc6f6c5953fd8afdde0f897"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.745294    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.746171    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.7461525780000002 podStartE2EDuration="2.746152578s" podCreationTimestamp="2026-03-20 16:02:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:02:45.740626549 +0000 UTC m=+1424.953997928" watchObservedRunningTime="2026-03-20 16:02:45.746152578 +0000 UTC m=+1424.959523947"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.770479    4730 scope.go:117] "RemoveContainer" containerID="c51b7b65896e9d86f7a00a27c41da014a980db7c3cc6f6c5953fd8afdde0f897"
Mar 20 16:02:46 crc kubenswrapper[4730]: E0320 16:02:45.771389    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c51b7b65896e9d86f7a00a27c41da014a980db7c3cc6f6c5953fd8afdde0f897\": container with ID starting with c51b7b65896e9d86f7a00a27c41da014a980db7c3cc6f6c5953fd8afdde0f897 not found: ID does not exist" containerID="c51b7b65896e9d86f7a00a27c41da014a980db7c3cc6f6c5953fd8afdde0f897"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.771447    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c51b7b65896e9d86f7a00a27c41da014a980db7c3cc6f6c5953fd8afdde0f897"} err="failed to get container status \"c51b7b65896e9d86f7a00a27c41da014a980db7c3cc6f6c5953fd8afdde0f897\": rpc error: code = NotFound desc = could not find container \"c51b7b65896e9d86f7a00a27c41da014a980db7c3cc6f6c5953fd8afdde0f897\": container with ID starting with c51b7b65896e9d86f7a00a27c41da014a980db7c3cc6f6c5953fd8afdde0f897 not found: ID does not exist"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.774313    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.783333    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.793429    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 16:02:46 crc kubenswrapper[4730]: E0320 16:02:45.793971    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f1bcc8c-7598-4c25-aaa7-0a9636c0729c" containerName="nova-cell1-novncproxy-novncproxy"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.793991    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f1bcc8c-7598-4c25-aaa7-0a9636c0729c" containerName="nova-cell1-novncproxy-novncproxy"
Mar 20 16:02:46 crc kubenswrapper[4730]: E0320 16:02:45.794033    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f6c808e-d523-48bd-8ec2-28b625834317" containerName="watcher-decision-engine"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.794041    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f6c808e-d523-48bd-8ec2-28b625834317" containerName="watcher-decision-engine"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.794301    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f6c808e-d523-48bd-8ec2-28b625834317" containerName="watcher-decision-engine"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.794327    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f1bcc8c-7598-4c25-aaa7-0a9636c0729c" containerName="nova-cell1-novncproxy-novncproxy"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.795175    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.800753    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.800979    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.801384    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.820146    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.880776    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dksk\" (UniqueName: \"kubernetes.io/projected/6586493e-e5d0-4504-b516-ebaac5defd79-kube-api-access-9dksk\") pod \"nova-cell1-novncproxy-0\" (UID: \"6586493e-e5d0-4504-b516-ebaac5defd79\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.880953    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6586493e-e5d0-4504-b516-ebaac5defd79-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6586493e-e5d0-4504-b516-ebaac5defd79\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.880983    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6586493e-e5d0-4504-b516-ebaac5defd79-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6586493e-e5d0-4504-b516-ebaac5defd79\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.880999    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6586493e-e5d0-4504-b516-ebaac5defd79-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6586493e-e5d0-4504-b516-ebaac5defd79\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.881017    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6586493e-e5d0-4504-b516-ebaac5defd79-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6586493e-e5d0-4504-b516-ebaac5defd79\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.923127    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.924978    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.935792    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.983225    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6586493e-e5d0-4504-b516-ebaac5defd79-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6586493e-e5d0-4504-b516-ebaac5defd79\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.983302    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6586493e-e5d0-4504-b516-ebaac5defd79-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6586493e-e5d0-4504-b516-ebaac5defd79\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.983323    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6586493e-e5d0-4504-b516-ebaac5defd79-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6586493e-e5d0-4504-b516-ebaac5defd79\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.983351    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6586493e-e5d0-4504-b516-ebaac5defd79-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6586493e-e5d0-4504-b516-ebaac5defd79\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.983397    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dksk\" (UniqueName: \"kubernetes.io/projected/6586493e-e5d0-4504-b516-ebaac5defd79-kube-api-access-9dksk\") pod \"nova-cell1-novncproxy-0\" (UID: \"6586493e-e5d0-4504-b516-ebaac5defd79\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.988591    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6586493e-e5d0-4504-b516-ebaac5defd79-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6586493e-e5d0-4504-b516-ebaac5defd79\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.988745    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6586493e-e5d0-4504-b516-ebaac5defd79-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6586493e-e5d0-4504-b516-ebaac5defd79\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.993210    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6586493e-e5d0-4504-b516-ebaac5defd79-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6586493e-e5d0-4504-b516-ebaac5defd79\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:45.993891    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6586493e-e5d0-4504-b516-ebaac5defd79-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6586493e-e5d0-4504-b516-ebaac5defd79\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:46.002195    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dksk\" (UniqueName: \"kubernetes.io/projected/6586493e-e5d0-4504-b516-ebaac5defd79-kube-api-access-9dksk\") pod \"nova-cell1-novncproxy-0\" (UID: \"6586493e-e5d0-4504-b516-ebaac5defd79\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:46.135952    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:46.604909    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:46.740788    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6586493e-e5d0-4504-b516-ebaac5defd79","Type":"ContainerStarted","Data":"8ba4c5896280523a1ab721d88ca88b134df3056e6f891eadeda5c2cfecb2910c"}
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:46.749515    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:46.936457    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f99bcbd6f-lkgpt"]
Mar 20 16:02:46 crc kubenswrapper[4730]: E0320 16:02:46.937163    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f6c808e-d523-48bd-8ec2-28b625834317" containerName="watcher-decision-engine"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:46.937176    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f6c808e-d523-48bd-8ec2-28b625834317" containerName="watcher-decision-engine"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:46.938547    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f99bcbd6f-lkgpt"
Mar 20 16:02:46 crc kubenswrapper[4730]: I0320 16:02:46.968768    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f99bcbd6f-lkgpt"]
Mar 20 16:02:47 crc kubenswrapper[4730]: I0320 16:02:47.012043    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eee8e670-a743-4284-bde0-5a8a77d8058e-dns-swift-storage-0\") pod \"dnsmasq-dns-7f99bcbd6f-lkgpt\" (UID: \"eee8e670-a743-4284-bde0-5a8a77d8058e\") " pod="openstack/dnsmasq-dns-7f99bcbd6f-lkgpt"
Mar 20 16:02:47 crc kubenswrapper[4730]: I0320 16:02:47.012128    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eee8e670-a743-4284-bde0-5a8a77d8058e-dns-svc\") pod \"dnsmasq-dns-7f99bcbd6f-lkgpt\" (UID: \"eee8e670-a743-4284-bde0-5a8a77d8058e\") " pod="openstack/dnsmasq-dns-7f99bcbd6f-lkgpt"
Mar 20 16:02:47 crc kubenswrapper[4730]: I0320 16:02:47.012170    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eee8e670-a743-4284-bde0-5a8a77d8058e-ovsdbserver-nb\") pod \"dnsmasq-dns-7f99bcbd6f-lkgpt\" (UID: \"eee8e670-a743-4284-bde0-5a8a77d8058e\") " pod="openstack/dnsmasq-dns-7f99bcbd6f-lkgpt"
Mar 20 16:02:47 crc kubenswrapper[4730]: I0320 16:02:47.012210    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eee8e670-a743-4284-bde0-5a8a77d8058e-config\") pod \"dnsmasq-dns-7f99bcbd6f-lkgpt\" (UID: \"eee8e670-a743-4284-bde0-5a8a77d8058e\") " pod="openstack/dnsmasq-dns-7f99bcbd6f-lkgpt"
Mar 20 16:02:47 crc kubenswrapper[4730]: I0320 16:02:47.012362    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmzv5\" (UniqueName: \"kubernetes.io/projected/eee8e670-a743-4284-bde0-5a8a77d8058e-kube-api-access-jmzv5\") pod \"dnsmasq-dns-7f99bcbd6f-lkgpt\" (UID: \"eee8e670-a743-4284-bde0-5a8a77d8058e\") " pod="openstack/dnsmasq-dns-7f99bcbd6f-lkgpt"
Mar 20 16:02:47 crc kubenswrapper[4730]: I0320 16:02:47.012787    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eee8e670-a743-4284-bde0-5a8a77d8058e-ovsdbserver-sb\") pod \"dnsmasq-dns-7f99bcbd6f-lkgpt\" (UID: \"eee8e670-a743-4284-bde0-5a8a77d8058e\") " pod="openstack/dnsmasq-dns-7f99bcbd6f-lkgpt"
Mar 20 16:02:47 crc kubenswrapper[4730]: I0320 16:02:47.114852    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmzv5\" (UniqueName: \"kubernetes.io/projected/eee8e670-a743-4284-bde0-5a8a77d8058e-kube-api-access-jmzv5\") pod \"dnsmasq-dns-7f99bcbd6f-lkgpt\" (UID: \"eee8e670-a743-4284-bde0-5a8a77d8058e\") " pod="openstack/dnsmasq-dns-7f99bcbd6f-lkgpt"
Mar 20 16:02:47 crc kubenswrapper[4730]: I0320 16:02:47.115020    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eee8e670-a743-4284-bde0-5a8a77d8058e-ovsdbserver-sb\") pod \"dnsmasq-dns-7f99bcbd6f-lkgpt\" (UID: \"eee8e670-a743-4284-bde0-5a8a77d8058e\") " pod="openstack/dnsmasq-dns-7f99bcbd6f-lkgpt"
Mar 20 16:02:47 crc kubenswrapper[4730]: I0320 16:02:47.116170    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eee8e670-a743-4284-bde0-5a8a77d8058e-ovsdbserver-sb\") pod \"dnsmasq-dns-7f99bcbd6f-lkgpt\" (UID: \"eee8e670-a743-4284-bde0-5a8a77d8058e\") " pod="openstack/dnsmasq-dns-7f99bcbd6f-lkgpt"
Mar 20 16:02:47 crc kubenswrapper[4730]: I0320 16:02:47.116373    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eee8e670-a743-4284-bde0-5a8a77d8058e-dns-swift-storage-0\") pod \"dnsmasq-dns-7f99bcbd6f-lkgpt\" (UID: \"eee8e670-a743-4284-bde0-5a8a77d8058e\") " pod="openstack/dnsmasq-dns-7f99bcbd6f-lkgpt"
Mar 20 16:02:47 crc kubenswrapper[4730]: I0320 16:02:47.116438    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eee8e670-a743-4284-bde0-5a8a77d8058e-dns-svc\") pod \"dnsmasq-dns-7f99bcbd6f-lkgpt\" (UID: \"eee8e670-a743-4284-bde0-5a8a77d8058e\") " pod="openstack/dnsmasq-dns-7f99bcbd6f-lkgpt"
Mar 20 16:02:47 crc kubenswrapper[4730]: I0320 16:02:47.116464    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eee8e670-a743-4284-bde0-5a8a77d8058e-ovsdbserver-nb\") pod \"dnsmasq-dns-7f99bcbd6f-lkgpt\" (UID: \"eee8e670-a743-4284-bde0-5a8a77d8058e\") " pod="openstack/dnsmasq-dns-7f99bcbd6f-lkgpt"
Mar 20 16:02:47 crc kubenswrapper[4730]: I0320 16:02:47.116488    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eee8e670-a743-4284-bde0-5a8a77d8058e-config\") pod \"dnsmasq-dns-7f99bcbd6f-lkgpt\" (UID: \"eee8e670-a743-4284-bde0-5a8a77d8058e\") " pod="openstack/dnsmasq-dns-7f99bcbd6f-lkgpt"
Mar 20 16:02:47 crc kubenswrapper[4730]: I0320 16:02:47.117155    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eee8e670-a743-4284-bde0-5a8a77d8058e-dns-svc\") pod \"dnsmasq-dns-7f99bcbd6f-lkgpt\" (UID: \"eee8e670-a743-4284-bde0-5a8a77d8058e\") " pod="openstack/dnsmasq-dns-7f99bcbd6f-lkgpt"
Mar 20 16:02:47 crc kubenswrapper[4730]: I0320 16:02:47.117321    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eee8e670-a743-4284-bde0-5a8a77d8058e-dns-swift-storage-0\") pod \"dnsmasq-dns-7f99bcbd6f-lkgpt\" (UID: \"eee8e670-a743-4284-bde0-5a8a77d8058e\") " pod="openstack/dnsmasq-dns-7f99bcbd6f-lkgpt"
Mar 20 16:02:47 crc kubenswrapper[4730]: I0320 16:02:47.117744    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eee8e670-a743-4284-bde0-5a8a77d8058e-config\") pod \"dnsmasq-dns-7f99bcbd6f-lkgpt\" (UID: \"eee8e670-a743-4284-bde0-5a8a77d8058e\") " pod="openstack/dnsmasq-dns-7f99bcbd6f-lkgpt"
Mar 20 16:02:47 crc kubenswrapper[4730]: I0320 16:02:47.117929    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eee8e670-a743-4284-bde0-5a8a77d8058e-ovsdbserver-nb\") pod \"dnsmasq-dns-7f99bcbd6f-lkgpt\" (UID: \"eee8e670-a743-4284-bde0-5a8a77d8058e\") " pod="openstack/dnsmasq-dns-7f99bcbd6f-lkgpt"
Mar 20 16:02:47 crc kubenswrapper[4730]: I0320 16:02:47.135286    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmzv5\" (UniqueName: \"kubernetes.io/projected/eee8e670-a743-4284-bde0-5a8a77d8058e-kube-api-access-jmzv5\") pod \"dnsmasq-dns-7f99bcbd6f-lkgpt\" (UID: \"eee8e670-a743-4284-bde0-5a8a77d8058e\") " pod="openstack/dnsmasq-dns-7f99bcbd6f-lkgpt"
Mar 20 16:02:47 crc kubenswrapper[4730]: I0320 16:02:47.294350    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f99bcbd6f-lkgpt"
Mar 20 16:02:47 crc kubenswrapper[4730]: I0320 16:02:47.560101    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f1bcc8c-7598-4c25-aaa7-0a9636c0729c" path="/var/lib/kubelet/pods/1f1bcc8c-7598-4c25-aaa7-0a9636c0729c/volumes"
Mar 20 16:02:47 crc kubenswrapper[4730]: I0320 16:02:47.753198    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6586493e-e5d0-4504-b516-ebaac5defd79","Type":"ContainerStarted","Data":"87b4a3a716a45fcaa15c2114e60d8ca03a4b2288d8a98246deaad952b89c93a2"}
Mar 20 16:02:47 crc kubenswrapper[4730]: I0320 16:02:47.770960    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.770943321 podStartE2EDuration="2.770943321s" podCreationTimestamp="2026-03-20 16:02:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:02:47.770498888 +0000 UTC m=+1426.983870257" watchObservedRunningTime="2026-03-20 16:02:47.770943321 +0000 UTC m=+1426.984314690"
Mar 20 16:02:48 crc kubenswrapper[4730]: I0320 16:02:48.008990    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f99bcbd6f-lkgpt"]
Mar 20 16:02:48 crc kubenswrapper[4730]: I0320 16:02:48.762385    4730 generic.go:334] "Generic (PLEG): container finished" podID="eee8e670-a743-4284-bde0-5a8a77d8058e" containerID="eb503bcef68144de38c1262fa118798748090de6157da609f58bcca7e2bbdbcc" exitCode=0
Mar 20 16:02:48 crc kubenswrapper[4730]: I0320 16:02:48.763721    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f99bcbd6f-lkgpt" event={"ID":"eee8e670-a743-4284-bde0-5a8a77d8058e","Type":"ContainerDied","Data":"eb503bcef68144de38c1262fa118798748090de6157da609f58bcca7e2bbdbcc"}
Mar 20 16:02:48 crc kubenswrapper[4730]: I0320 16:02:48.763786    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f99bcbd6f-lkgpt" event={"ID":"eee8e670-a743-4284-bde0-5a8a77d8058e","Type":"ContainerStarted","Data":"0440fe1ccefaeb24724ee16cfedb746eead11f6c1b7d84e1056a70f539a4c85b"}
Mar 20 16:02:49 crc kubenswrapper[4730]: I0320 16:02:49.625887    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:02:49 crc kubenswrapper[4730]: I0320 16:02:49.774388    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fc78da88-5699-44ed-af14-7627ea6191f9" containerName="nova-api-log" containerID="cri-o://42f9304ba2a3640bc2a608f72633df51d75830859b8b512c72f883ee6ebbc449" gracePeriod=30
Mar 20 16:02:49 crc kubenswrapper[4730]: I0320 16:02:49.776370    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f99bcbd6f-lkgpt" event={"ID":"eee8e670-a743-4284-bde0-5a8a77d8058e","Type":"ContainerStarted","Data":"b9b68041f5e6d75af1b6ac50a20d8efa0a3cf0ab36fd4d2c2cf619950c911910"}
Mar 20 16:02:49 crc kubenswrapper[4730]: I0320 16:02:49.776476    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f99bcbd6f-lkgpt"
Mar 20 16:02:49 crc kubenswrapper[4730]: I0320 16:02:49.776976    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fc78da88-5699-44ed-af14-7627ea6191f9" containerName="nova-api-api" containerID="cri-o://b7766e5f23e174441651086dc567c68965531e447de145ebeb0b1414c28ef4e6" gracePeriod=30
Mar 20 16:02:49 crc kubenswrapper[4730]: I0320 16:02:49.810596    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f99bcbd6f-lkgpt" podStartSLOduration=3.81057767 podStartE2EDuration="3.81057767s" podCreationTimestamp="2026-03-20 16:02:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:02:49.798208094 +0000 UTC m=+1429.011579463" watchObservedRunningTime="2026-03-20 16:02:49.81057767 +0000 UTC m=+1429.023949039"
Mar 20 16:02:49 crc kubenswrapper[4730]: I0320 16:02:49.975706    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:02:49 crc kubenswrapper[4730]: I0320 16:02:49.976343    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c192a384-369a-4011-bbd0-af10cf958010" containerName="ceilometer-central-agent" containerID="cri-o://92ca70d767554dd7f5a3c2261603595e47200f286bd84d37815a7187cc55a125" gracePeriod=30
Mar 20 16:02:49 crc kubenswrapper[4730]: I0320 16:02:49.976385    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c192a384-369a-4011-bbd0-af10cf958010" containerName="proxy-httpd" containerID="cri-o://2e40439ca8128c74c4bde82b12fa3c90a57c5ca9aa2ec703f4bb34d1783c4a41" gracePeriod=30
Mar 20 16:02:49 crc kubenswrapper[4730]: I0320 16:02:49.976418    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c192a384-369a-4011-bbd0-af10cf958010" containerName="sg-core" containerID="cri-o://67e70a210d20ab3fb6eb013ac35769c758e3dbe555ae4869a1f308e1ce50d12f" gracePeriod=30
Mar 20 16:02:49 crc kubenswrapper[4730]: I0320 16:02:49.976403    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c192a384-369a-4011-bbd0-af10cf958010" containerName="ceilometer-notification-agent" containerID="cri-o://3ed2c729cfba6a845986581f09c8deca053ba44823152224340ca15c6f3b1018" gracePeriod=30
Mar 20 16:02:50 crc kubenswrapper[4730]: I0320 16:02:50.789584    4730 generic.go:334] "Generic (PLEG): container finished" podID="c192a384-369a-4011-bbd0-af10cf958010" containerID="2e40439ca8128c74c4bde82b12fa3c90a57c5ca9aa2ec703f4bb34d1783c4a41" exitCode=0
Mar 20 16:02:50 crc kubenswrapper[4730]: I0320 16:02:50.790045    4730 generic.go:334] "Generic (PLEG): container finished" podID="c192a384-369a-4011-bbd0-af10cf958010" containerID="67e70a210d20ab3fb6eb013ac35769c758e3dbe555ae4869a1f308e1ce50d12f" exitCode=2
Mar 20 16:02:50 crc kubenswrapper[4730]: I0320 16:02:50.790104    4730 generic.go:334] "Generic (PLEG): container finished" podID="c192a384-369a-4011-bbd0-af10cf958010" containerID="92ca70d767554dd7f5a3c2261603595e47200f286bd84d37815a7187cc55a125" exitCode=0
Mar 20 16:02:50 crc kubenswrapper[4730]: I0320 16:02:50.789695    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c192a384-369a-4011-bbd0-af10cf958010","Type":"ContainerDied","Data":"2e40439ca8128c74c4bde82b12fa3c90a57c5ca9aa2ec703f4bb34d1783c4a41"}
Mar 20 16:02:50 crc kubenswrapper[4730]: I0320 16:02:50.790381    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c192a384-369a-4011-bbd0-af10cf958010","Type":"ContainerDied","Data":"67e70a210d20ab3fb6eb013ac35769c758e3dbe555ae4869a1f308e1ce50d12f"}
Mar 20 16:02:50 crc kubenswrapper[4730]: I0320 16:02:50.790463    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c192a384-369a-4011-bbd0-af10cf958010","Type":"ContainerDied","Data":"92ca70d767554dd7f5a3c2261603595e47200f286bd84d37815a7187cc55a125"}
Mar 20 16:02:50 crc kubenswrapper[4730]: I0320 16:02:50.796732    4730 generic.go:334] "Generic (PLEG): container finished" podID="fc78da88-5699-44ed-af14-7627ea6191f9" containerID="42f9304ba2a3640bc2a608f72633df51d75830859b8b512c72f883ee6ebbc449" exitCode=143
Mar 20 16:02:50 crc kubenswrapper[4730]: I0320 16:02:50.796804    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fc78da88-5699-44ed-af14-7627ea6191f9","Type":"ContainerDied","Data":"42f9304ba2a3640bc2a608f72633df51d75830859b8b512c72f883ee6ebbc449"}
Mar 20 16:02:51 crc kubenswrapper[4730]: I0320 16:02:51.136202    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:51 crc kubenswrapper[4730]: I0320 16:02:51.861725    4730 generic.go:334] "Generic (PLEG): container finished" podID="fc78da88-5699-44ed-af14-7627ea6191f9" containerID="b7766e5f23e174441651086dc567c68965531e447de145ebeb0b1414c28ef4e6" exitCode=0
Mar 20 16:02:51 crc kubenswrapper[4730]: I0320 16:02:51.862005    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fc78da88-5699-44ed-af14-7627ea6191f9","Type":"ContainerDied","Data":"b7766e5f23e174441651086dc567c68965531e447de145ebeb0b1414c28ef4e6"}
Mar 20 16:02:51 crc kubenswrapper[4730]: E0320 16:02:51.987371    4730 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod279d2368_abe1_465a_9007_68542e5dbfc4.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod279d2368_abe1_465a_9007_68542e5dbfc4.slice/crio-da2c214fcdb33fae608237a0fb1d01559481f0b67a4afa1e6c930298a64b75a2\": RecentStats: unable to find data in memory cache]"
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.029900    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.132938    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm4nm\" (UniqueName: \"kubernetes.io/projected/fc78da88-5699-44ed-af14-7627ea6191f9-kube-api-access-gm4nm\") pod \"fc78da88-5699-44ed-af14-7627ea6191f9\" (UID: \"fc78da88-5699-44ed-af14-7627ea6191f9\") "
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.133076    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc78da88-5699-44ed-af14-7627ea6191f9-config-data\") pod \"fc78da88-5699-44ed-af14-7627ea6191f9\" (UID: \"fc78da88-5699-44ed-af14-7627ea6191f9\") "
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.133109    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc78da88-5699-44ed-af14-7627ea6191f9-combined-ca-bundle\") pod \"fc78da88-5699-44ed-af14-7627ea6191f9\" (UID: \"fc78da88-5699-44ed-af14-7627ea6191f9\") "
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.133187    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc78da88-5699-44ed-af14-7627ea6191f9-logs\") pod \"fc78da88-5699-44ed-af14-7627ea6191f9\" (UID: \"fc78da88-5699-44ed-af14-7627ea6191f9\") "
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.135436    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc78da88-5699-44ed-af14-7627ea6191f9-logs" (OuterVolumeSpecName: "logs") pod "fc78da88-5699-44ed-af14-7627ea6191f9" (UID: "fc78da88-5699-44ed-af14-7627ea6191f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.145505    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc78da88-5699-44ed-af14-7627ea6191f9-kube-api-access-gm4nm" (OuterVolumeSpecName: "kube-api-access-gm4nm") pod "fc78da88-5699-44ed-af14-7627ea6191f9" (UID: "fc78da88-5699-44ed-af14-7627ea6191f9"). InnerVolumeSpecName "kube-api-access-gm4nm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.179333    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc78da88-5699-44ed-af14-7627ea6191f9-config-data" (OuterVolumeSpecName: "config-data") pod "fc78da88-5699-44ed-af14-7627ea6191f9" (UID: "fc78da88-5699-44ed-af14-7627ea6191f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.184407    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc78da88-5699-44ed-af14-7627ea6191f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc78da88-5699-44ed-af14-7627ea6191f9" (UID: "fc78da88-5699-44ed-af14-7627ea6191f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.235770    4730 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc78da88-5699-44ed-af14-7627ea6191f9-logs\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.235818    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm4nm\" (UniqueName: \"kubernetes.io/projected/fc78da88-5699-44ed-af14-7627ea6191f9-kube-api-access-gm4nm\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.235833    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc78da88-5699-44ed-af14-7627ea6191f9-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.235846    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc78da88-5699-44ed-af14-7627ea6191f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.897568    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.908639    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fc78da88-5699-44ed-af14-7627ea6191f9","Type":"ContainerDied","Data":"bcfa21becb5d8093cfdbab459412302a96e0a7cd61a8f2880830161d043b5f7d"}
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.908699    4730 scope.go:117] "RemoveContainer" containerID="b7766e5f23e174441651086dc567c68965531e447de145ebeb0b1414c28ef4e6"
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.908871    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.936630    4730 generic.go:334] "Generic (PLEG): container finished" podID="c192a384-369a-4011-bbd0-af10cf958010" containerID="3ed2c729cfba6a845986581f09c8deca053ba44823152224340ca15c6f3b1018" exitCode=0
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.936674    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c192a384-369a-4011-bbd0-af10cf958010","Type":"ContainerDied","Data":"3ed2c729cfba6a845986581f09c8deca053ba44823152224340ca15c6f3b1018"}
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.936703    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c192a384-369a-4011-bbd0-af10cf958010","Type":"ContainerDied","Data":"fc7185d335545d2bc3730a5a21059470bf6bc2bcae0a41d7ca4c2027aa011079"}
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.936862    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.984508    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c192a384-369a-4011-bbd0-af10cf958010-log-httpd\") pod \"c192a384-369a-4011-bbd0-af10cf958010\" (UID: \"c192a384-369a-4011-bbd0-af10cf958010\") "
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.984638    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c192a384-369a-4011-bbd0-af10cf958010-sg-core-conf-yaml\") pod \"c192a384-369a-4011-bbd0-af10cf958010\" (UID: \"c192a384-369a-4011-bbd0-af10cf958010\") "
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.984678    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c192a384-369a-4011-bbd0-af10cf958010-run-httpd\") pod \"c192a384-369a-4011-bbd0-af10cf958010\" (UID: \"c192a384-369a-4011-bbd0-af10cf958010\") "
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.984702    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c192a384-369a-4011-bbd0-af10cf958010-config-data\") pod \"c192a384-369a-4011-bbd0-af10cf958010\" (UID: \"c192a384-369a-4011-bbd0-af10cf958010\") "
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.984742    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c192a384-369a-4011-bbd0-af10cf958010-combined-ca-bundle\") pod \"c192a384-369a-4011-bbd0-af10cf958010\" (UID: \"c192a384-369a-4011-bbd0-af10cf958010\") "
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.984786    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c192a384-369a-4011-bbd0-af10cf958010-scripts\") pod \"c192a384-369a-4011-bbd0-af10cf958010\" (UID: \"c192a384-369a-4011-bbd0-af10cf958010\") "
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.984889    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tkk7\" (UniqueName: \"kubernetes.io/projected/c192a384-369a-4011-bbd0-af10cf958010-kube-api-access-2tkk7\") pod \"c192a384-369a-4011-bbd0-af10cf958010\" (UID: \"c192a384-369a-4011-bbd0-af10cf958010\") "
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.985481    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c192a384-369a-4011-bbd0-af10cf958010-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c192a384-369a-4011-bbd0-af10cf958010" (UID: "c192a384-369a-4011-bbd0-af10cf958010"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.988732    4730 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c192a384-369a-4011-bbd0-af10cf958010-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.990450    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c192a384-369a-4011-bbd0-af10cf958010-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c192a384-369a-4011-bbd0-af10cf958010" (UID: "c192a384-369a-4011-bbd0-af10cf958010"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.992409    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c192a384-369a-4011-bbd0-af10cf958010-scripts" (OuterVolumeSpecName: "scripts") pod "c192a384-369a-4011-bbd0-af10cf958010" (UID: "c192a384-369a-4011-bbd0-af10cf958010"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.995796    4730 scope.go:117] "RemoveContainer" containerID="42f9304ba2a3640bc2a608f72633df51d75830859b8b512c72f883ee6ebbc449"
Mar 20 16:02:52 crc kubenswrapper[4730]: I0320 16:02:52.996385    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c192a384-369a-4011-bbd0-af10cf958010-kube-api-access-2tkk7" (OuterVolumeSpecName: "kube-api-access-2tkk7") pod "c192a384-369a-4011-bbd0-af10cf958010" (UID: "c192a384-369a-4011-bbd0-af10cf958010"). InnerVolumeSpecName "kube-api-access-2tkk7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.083233    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.091191    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tkk7\" (UniqueName: \"kubernetes.io/projected/c192a384-369a-4011-bbd0-af10cf958010-kube-api-access-2tkk7\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.091226    4730 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c192a384-369a-4011-bbd0-af10cf958010-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.091240    4730 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c192a384-369a-4011-bbd0-af10cf958010-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.094481    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.095674    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c192a384-369a-4011-bbd0-af10cf958010-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c192a384-369a-4011-bbd0-af10cf958010" (UID: "c192a384-369a-4011-bbd0-af10cf958010"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.113887    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:02:53 crc kubenswrapper[4730]: E0320 16:02:53.114370    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc78da88-5699-44ed-af14-7627ea6191f9" containerName="nova-api-log"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.114387    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc78da88-5699-44ed-af14-7627ea6191f9" containerName="nova-api-log"
Mar 20 16:02:53 crc kubenswrapper[4730]: E0320 16:02:53.114413    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c192a384-369a-4011-bbd0-af10cf958010" containerName="proxy-httpd"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.114421    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="c192a384-369a-4011-bbd0-af10cf958010" containerName="proxy-httpd"
Mar 20 16:02:53 crc kubenswrapper[4730]: E0320 16:02:53.114437    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c192a384-369a-4011-bbd0-af10cf958010" containerName="ceilometer-notification-agent"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.114442    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="c192a384-369a-4011-bbd0-af10cf958010" containerName="ceilometer-notification-agent"
Mar 20 16:02:53 crc kubenswrapper[4730]: E0320 16:02:53.114455    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc78da88-5699-44ed-af14-7627ea6191f9" containerName="nova-api-api"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.114461    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc78da88-5699-44ed-af14-7627ea6191f9" containerName="nova-api-api"
Mar 20 16:02:53 crc kubenswrapper[4730]: E0320 16:02:53.114470    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c192a384-369a-4011-bbd0-af10cf958010" containerName="ceilometer-central-agent"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.114476    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="c192a384-369a-4011-bbd0-af10cf958010" containerName="ceilometer-central-agent"
Mar 20 16:02:53 crc kubenswrapper[4730]: E0320 16:02:53.114492    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c192a384-369a-4011-bbd0-af10cf958010" containerName="sg-core"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.114497    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="c192a384-369a-4011-bbd0-af10cf958010" containerName="sg-core"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.114662    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="c192a384-369a-4011-bbd0-af10cf958010" containerName="ceilometer-notification-agent"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.114675    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc78da88-5699-44ed-af14-7627ea6191f9" containerName="nova-api-api"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.114686    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc78da88-5699-44ed-af14-7627ea6191f9" containerName="nova-api-log"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.114704    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="c192a384-369a-4011-bbd0-af10cf958010" containerName="proxy-httpd"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.114714    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="c192a384-369a-4011-bbd0-af10cf958010" containerName="sg-core"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.114721    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="c192a384-369a-4011-bbd0-af10cf958010" containerName="ceilometer-central-agent"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.115770    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.121203    4730 scope.go:117] "RemoveContainer" containerID="2e40439ca8128c74c4bde82b12fa3c90a57c5ca9aa2ec703f4bb34d1783c4a41"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.121664    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.121691    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.123566    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.130983    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.166756    4730 scope.go:117] "RemoveContainer" containerID="67e70a210d20ab3fb6eb013ac35769c758e3dbe555ae4869a1f308e1ce50d12f"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.172361    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c192a384-369a-4011-bbd0-af10cf958010-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c192a384-369a-4011-bbd0-af10cf958010" (UID: "c192a384-369a-4011-bbd0-af10cf958010"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.189104    4730 scope.go:117] "RemoveContainer" containerID="3ed2c729cfba6a845986581f09c8deca053ba44823152224340ca15c6f3b1018"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.191645    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c192a384-369a-4011-bbd0-af10cf958010-config-data" (OuterVolumeSpecName: "config-data") pod "c192a384-369a-4011-bbd0-af10cf958010" (UID: "c192a384-369a-4011-bbd0-af10cf958010"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.192199    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c192a384-369a-4011-bbd0-af10cf958010-config-data\") pod \"c192a384-369a-4011-bbd0-af10cf958010\" (UID: \"c192a384-369a-4011-bbd0-af10cf958010\") "
Mar 20 16:02:53 crc kubenswrapper[4730]: W0320 16:02:53.192393    4730 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/c192a384-369a-4011-bbd0-af10cf958010/volumes/kubernetes.io~secret/config-data
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.192426    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c192a384-369a-4011-bbd0-af10cf958010-config-data" (OuterVolumeSpecName: "config-data") pod "c192a384-369a-4011-bbd0-af10cf958010" (UID: "c192a384-369a-4011-bbd0-af10cf958010"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.192860    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a875c7b4-22fb-4b91-803c-09a7a439aea1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a875c7b4-22fb-4b91-803c-09a7a439aea1\") " pod="openstack/nova-api-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.192897    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a875c7b4-22fb-4b91-803c-09a7a439aea1-config-data\") pod \"nova-api-0\" (UID: \"a875c7b4-22fb-4b91-803c-09a7a439aea1\") " pod="openstack/nova-api-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.192954    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a875c7b4-22fb-4b91-803c-09a7a439aea1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a875c7b4-22fb-4b91-803c-09a7a439aea1\") " pod="openstack/nova-api-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.193067    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf45d\" (UniqueName: \"kubernetes.io/projected/a875c7b4-22fb-4b91-803c-09a7a439aea1-kube-api-access-cf45d\") pod \"nova-api-0\" (UID: \"a875c7b4-22fb-4b91-803c-09a7a439aea1\") " pod="openstack/nova-api-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.193135    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a875c7b4-22fb-4b91-803c-09a7a439aea1-public-tls-certs\") pod \"nova-api-0\" (UID: \"a875c7b4-22fb-4b91-803c-09a7a439aea1\") " pod="openstack/nova-api-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.193230    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a875c7b4-22fb-4b91-803c-09a7a439aea1-logs\") pod \"nova-api-0\" (UID: \"a875c7b4-22fb-4b91-803c-09a7a439aea1\") " pod="openstack/nova-api-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.193455    4730 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c192a384-369a-4011-bbd0-af10cf958010-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.193471    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c192a384-369a-4011-bbd0-af10cf958010-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.193480    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c192a384-369a-4011-bbd0-af10cf958010-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.214060    4730 scope.go:117] "RemoveContainer" containerID="92ca70d767554dd7f5a3c2261603595e47200f286bd84d37815a7187cc55a125"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.233697    4730 scope.go:117] "RemoveContainer" containerID="2e40439ca8128c74c4bde82b12fa3c90a57c5ca9aa2ec703f4bb34d1783c4a41"
Mar 20 16:02:53 crc kubenswrapper[4730]: E0320 16:02:53.234366    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e40439ca8128c74c4bde82b12fa3c90a57c5ca9aa2ec703f4bb34d1783c4a41\": container with ID starting with 2e40439ca8128c74c4bde82b12fa3c90a57c5ca9aa2ec703f4bb34d1783c4a41 not found: ID does not exist" containerID="2e40439ca8128c74c4bde82b12fa3c90a57c5ca9aa2ec703f4bb34d1783c4a41"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.234399    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e40439ca8128c74c4bde82b12fa3c90a57c5ca9aa2ec703f4bb34d1783c4a41"} err="failed to get container status \"2e40439ca8128c74c4bde82b12fa3c90a57c5ca9aa2ec703f4bb34d1783c4a41\": rpc error: code = NotFound desc = could not find container \"2e40439ca8128c74c4bde82b12fa3c90a57c5ca9aa2ec703f4bb34d1783c4a41\": container with ID starting with 2e40439ca8128c74c4bde82b12fa3c90a57c5ca9aa2ec703f4bb34d1783c4a41 not found: ID does not exist"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.234420    4730 scope.go:117] "RemoveContainer" containerID="67e70a210d20ab3fb6eb013ac35769c758e3dbe555ae4869a1f308e1ce50d12f"
Mar 20 16:02:53 crc kubenswrapper[4730]: E0320 16:02:53.234822    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67e70a210d20ab3fb6eb013ac35769c758e3dbe555ae4869a1f308e1ce50d12f\": container with ID starting with 67e70a210d20ab3fb6eb013ac35769c758e3dbe555ae4869a1f308e1ce50d12f not found: ID does not exist" containerID="67e70a210d20ab3fb6eb013ac35769c758e3dbe555ae4869a1f308e1ce50d12f"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.234865    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67e70a210d20ab3fb6eb013ac35769c758e3dbe555ae4869a1f308e1ce50d12f"} err="failed to get container status \"67e70a210d20ab3fb6eb013ac35769c758e3dbe555ae4869a1f308e1ce50d12f\": rpc error: code = NotFound desc = could not find container \"67e70a210d20ab3fb6eb013ac35769c758e3dbe555ae4869a1f308e1ce50d12f\": container with ID starting with 67e70a210d20ab3fb6eb013ac35769c758e3dbe555ae4869a1f308e1ce50d12f not found: ID does not exist"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.234897    4730 scope.go:117] "RemoveContainer" containerID="3ed2c729cfba6a845986581f09c8deca053ba44823152224340ca15c6f3b1018"
Mar 20 16:02:53 crc kubenswrapper[4730]: E0320 16:02:53.235279    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ed2c729cfba6a845986581f09c8deca053ba44823152224340ca15c6f3b1018\": container with ID starting with 3ed2c729cfba6a845986581f09c8deca053ba44823152224340ca15c6f3b1018 not found: ID does not exist" containerID="3ed2c729cfba6a845986581f09c8deca053ba44823152224340ca15c6f3b1018"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.235313    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ed2c729cfba6a845986581f09c8deca053ba44823152224340ca15c6f3b1018"} err="failed to get container status \"3ed2c729cfba6a845986581f09c8deca053ba44823152224340ca15c6f3b1018\": rpc error: code = NotFound desc = could not find container \"3ed2c729cfba6a845986581f09c8deca053ba44823152224340ca15c6f3b1018\": container with ID starting with 3ed2c729cfba6a845986581f09c8deca053ba44823152224340ca15c6f3b1018 not found: ID does not exist"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.235334    4730 scope.go:117] "RemoveContainer" containerID="92ca70d767554dd7f5a3c2261603595e47200f286bd84d37815a7187cc55a125"
Mar 20 16:02:53 crc kubenswrapper[4730]: E0320 16:02:53.235629    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92ca70d767554dd7f5a3c2261603595e47200f286bd84d37815a7187cc55a125\": container with ID starting with 92ca70d767554dd7f5a3c2261603595e47200f286bd84d37815a7187cc55a125 not found: ID does not exist" containerID="92ca70d767554dd7f5a3c2261603595e47200f286bd84d37815a7187cc55a125"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.235671    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92ca70d767554dd7f5a3c2261603595e47200f286bd84d37815a7187cc55a125"} err="failed to get container status \"92ca70d767554dd7f5a3c2261603595e47200f286bd84d37815a7187cc55a125\": rpc error: code = NotFound desc = could not find container \"92ca70d767554dd7f5a3c2261603595e47200f286bd84d37815a7187cc55a125\": container with ID starting with 92ca70d767554dd7f5a3c2261603595e47200f286bd84d37815a7187cc55a125 not found: ID does not exist"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.275174    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.286705    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.297453    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a875c7b4-22fb-4b91-803c-09a7a439aea1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a875c7b4-22fb-4b91-803c-09a7a439aea1\") " pod="openstack/nova-api-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.297511    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf45d\" (UniqueName: \"kubernetes.io/projected/a875c7b4-22fb-4b91-803c-09a7a439aea1-kube-api-access-cf45d\") pod \"nova-api-0\" (UID: \"a875c7b4-22fb-4b91-803c-09a7a439aea1\") " pod="openstack/nova-api-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.297538    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a875c7b4-22fb-4b91-803c-09a7a439aea1-public-tls-certs\") pod \"nova-api-0\" (UID: \"a875c7b4-22fb-4b91-803c-09a7a439aea1\") " pod="openstack/nova-api-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.297571    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a875c7b4-22fb-4b91-803c-09a7a439aea1-logs\") pod \"nova-api-0\" (UID: \"a875c7b4-22fb-4b91-803c-09a7a439aea1\") " pod="openstack/nova-api-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.297654    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a875c7b4-22fb-4b91-803c-09a7a439aea1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a875c7b4-22fb-4b91-803c-09a7a439aea1\") " pod="openstack/nova-api-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.297676    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a875c7b4-22fb-4b91-803c-09a7a439aea1-config-data\") pod \"nova-api-0\" (UID: \"a875c7b4-22fb-4b91-803c-09a7a439aea1\") " pod="openstack/nova-api-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.298880    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a875c7b4-22fb-4b91-803c-09a7a439aea1-logs\") pod \"nova-api-0\" (UID: \"a875c7b4-22fb-4b91-803c-09a7a439aea1\") " pod="openstack/nova-api-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.301368    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a875c7b4-22fb-4b91-803c-09a7a439aea1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a875c7b4-22fb-4b91-803c-09a7a439aea1\") " pod="openstack/nova-api-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.301548    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a875c7b4-22fb-4b91-803c-09a7a439aea1-config-data\") pod \"nova-api-0\" (UID: \"a875c7b4-22fb-4b91-803c-09a7a439aea1\") " pod="openstack/nova-api-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.302485    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a875c7b4-22fb-4b91-803c-09a7a439aea1-public-tls-certs\") pod \"nova-api-0\" (UID: \"a875c7b4-22fb-4b91-803c-09a7a439aea1\") " pod="openstack/nova-api-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.302511    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a875c7b4-22fb-4b91-803c-09a7a439aea1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a875c7b4-22fb-4b91-803c-09a7a439aea1\") " pod="openstack/nova-api-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.316751    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.325995    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf45d\" (UniqueName: \"kubernetes.io/projected/a875c7b4-22fb-4b91-803c-09a7a439aea1-kube-api-access-cf45d\") pod \"nova-api-0\" (UID: \"a875c7b4-22fb-4b91-803c-09a7a439aea1\") " pod="openstack/nova-api-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.326389    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.332186    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.332386    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.356710    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.399518    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfa18b95-4067-42bf-82ac-ace10629e4bf-log-httpd\") pod \"ceilometer-0\" (UID: \"bfa18b95-4067-42bf-82ac-ace10629e4bf\") " pod="openstack/ceilometer-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.399593    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfa18b95-4067-42bf-82ac-ace10629e4bf-config-data\") pod \"ceilometer-0\" (UID: \"bfa18b95-4067-42bf-82ac-ace10629e4bf\") " pod="openstack/ceilometer-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.399641    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfa18b95-4067-42bf-82ac-ace10629e4bf-run-httpd\") pod \"ceilometer-0\" (UID: \"bfa18b95-4067-42bf-82ac-ace10629e4bf\") " pod="openstack/ceilometer-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.399706    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bfa18b95-4067-42bf-82ac-ace10629e4bf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bfa18b95-4067-42bf-82ac-ace10629e4bf\") " pod="openstack/ceilometer-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.399734    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfa18b95-4067-42bf-82ac-ace10629e4bf-scripts\") pod \"ceilometer-0\" (UID: \"bfa18b95-4067-42bf-82ac-ace10629e4bf\") " pod="openstack/ceilometer-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.399763    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfa18b95-4067-42bf-82ac-ace10629e4bf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bfa18b95-4067-42bf-82ac-ace10629e4bf\") " pod="openstack/ceilometer-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.399890    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvg5x\" (UniqueName: \"kubernetes.io/projected/bfa18b95-4067-42bf-82ac-ace10629e4bf-kube-api-access-vvg5x\") pod \"ceilometer-0\" (UID: \"bfa18b95-4067-42bf-82ac-ace10629e4bf\") " pod="openstack/ceilometer-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.433079    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.491734    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:02:53 crc kubenswrapper[4730]: E0320 16:02:53.492607    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-vvg5x log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="bfa18b95-4067-42bf-82ac-ace10629e4bf"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.501969    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfa18b95-4067-42bf-82ac-ace10629e4bf-run-httpd\") pod \"ceilometer-0\" (UID: \"bfa18b95-4067-42bf-82ac-ace10629e4bf\") " pod="openstack/ceilometer-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.502041    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bfa18b95-4067-42bf-82ac-ace10629e4bf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bfa18b95-4067-42bf-82ac-ace10629e4bf\") " pod="openstack/ceilometer-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.502071    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfa18b95-4067-42bf-82ac-ace10629e4bf-scripts\") pod \"ceilometer-0\" (UID: \"bfa18b95-4067-42bf-82ac-ace10629e4bf\") " pod="openstack/ceilometer-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.502097    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfa18b95-4067-42bf-82ac-ace10629e4bf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bfa18b95-4067-42bf-82ac-ace10629e4bf\") " pod="openstack/ceilometer-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.502232    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvg5x\" (UniqueName: \"kubernetes.io/projected/bfa18b95-4067-42bf-82ac-ace10629e4bf-kube-api-access-vvg5x\") pod \"ceilometer-0\" (UID: \"bfa18b95-4067-42bf-82ac-ace10629e4bf\") " pod="openstack/ceilometer-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.502319    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfa18b95-4067-42bf-82ac-ace10629e4bf-log-httpd\") pod \"ceilometer-0\" (UID: \"bfa18b95-4067-42bf-82ac-ace10629e4bf\") " pod="openstack/ceilometer-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.502365    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfa18b95-4067-42bf-82ac-ace10629e4bf-config-data\") pod \"ceilometer-0\" (UID: \"bfa18b95-4067-42bf-82ac-ace10629e4bf\") " pod="openstack/ceilometer-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.502899    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfa18b95-4067-42bf-82ac-ace10629e4bf-log-httpd\") pod \"ceilometer-0\" (UID: \"bfa18b95-4067-42bf-82ac-ace10629e4bf\") " pod="openstack/ceilometer-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.503119    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfa18b95-4067-42bf-82ac-ace10629e4bf-run-httpd\") pod \"ceilometer-0\" (UID: \"bfa18b95-4067-42bf-82ac-ace10629e4bf\") " pod="openstack/ceilometer-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.505890    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bfa18b95-4067-42bf-82ac-ace10629e4bf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bfa18b95-4067-42bf-82ac-ace10629e4bf\") " pod="openstack/ceilometer-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.506356    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfa18b95-4067-42bf-82ac-ace10629e4bf-config-data\") pod \"ceilometer-0\" (UID: \"bfa18b95-4067-42bf-82ac-ace10629e4bf\") " pod="openstack/ceilometer-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.507932    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfa18b95-4067-42bf-82ac-ace10629e4bf-scripts\") pod \"ceilometer-0\" (UID: \"bfa18b95-4067-42bf-82ac-ace10629e4bf\") " pod="openstack/ceilometer-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.511863    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfa18b95-4067-42bf-82ac-ace10629e4bf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bfa18b95-4067-42bf-82ac-ace10629e4bf\") " pod="openstack/ceilometer-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.525952    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvg5x\" (UniqueName: \"kubernetes.io/projected/bfa18b95-4067-42bf-82ac-ace10629e4bf-kube-api-access-vvg5x\") pod \"ceilometer-0\" (UID: \"bfa18b95-4067-42bf-82ac-ace10629e4bf\") " pod="openstack/ceilometer-0"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.549845    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c192a384-369a-4011-bbd0-af10cf958010" path="/var/lib/kubelet/pods/c192a384-369a-4011-bbd0-af10cf958010/volumes"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.550610    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc78da88-5699-44ed-af14-7627ea6191f9" path="/var/lib/kubelet/pods/fc78da88-5699-44ed-af14-7627ea6191f9/volumes"
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.551207    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.551437    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="4938ac0e-1226-4f20-8f23-763b62b863c4" containerName="kube-state-metrics" containerID="cri-o://49b97730a0d8925f4b80deb7db0aeea30943ec9b5135c0345bbaf48837573e5f" gracePeriod=30
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.958046    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:02:53 crc kubenswrapper[4730]: W0320 16:02:53.962127    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda875c7b4_22fb_4b91_803c_09a7a439aea1.slice/crio-b02119858ae3b92ae86beccde670787e102f4edc1a93a7bf0ae26e53cf88792e WatchSource:0}: Error finding container b02119858ae3b92ae86beccde670787e102f4edc1a93a7bf0ae26e53cf88792e: Status 404 returned error can't find the container with id b02119858ae3b92ae86beccde670787e102f4edc1a93a7bf0ae26e53cf88792e
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.969697    4730 generic.go:334] "Generic (PLEG): container finished" podID="4938ac0e-1226-4f20-8f23-763b62b863c4" containerID="49b97730a0d8925f4b80deb7db0aeea30943ec9b5135c0345bbaf48837573e5f" exitCode=2
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.970052    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4938ac0e-1226-4f20-8f23-763b62b863c4","Type":"ContainerDied","Data":"49b97730a0d8925f4b80deb7db0aeea30943ec9b5135c0345bbaf48837573e5f"}
Mar 20 16:02:53 crc kubenswrapper[4730]: I0320 16:02:53.978429    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.078581    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.082863    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.095906    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.157594    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0"
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.224045    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfa18b95-4067-42bf-82ac-ace10629e4bf-config-data\") pod \"bfa18b95-4067-42bf-82ac-ace10629e4bf\" (UID: \"bfa18b95-4067-42bf-82ac-ace10629e4bf\") "
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.224118    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfa18b95-4067-42bf-82ac-ace10629e4bf-combined-ca-bundle\") pod \"bfa18b95-4067-42bf-82ac-ace10629e4bf\" (UID: \"bfa18b95-4067-42bf-82ac-ace10629e4bf\") "
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.224150    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfa18b95-4067-42bf-82ac-ace10629e4bf-scripts\") pod \"bfa18b95-4067-42bf-82ac-ace10629e4bf\" (UID: \"bfa18b95-4067-42bf-82ac-ace10629e4bf\") "
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.224166    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bfa18b95-4067-42bf-82ac-ace10629e4bf-sg-core-conf-yaml\") pod \"bfa18b95-4067-42bf-82ac-ace10629e4bf\" (UID: \"bfa18b95-4067-42bf-82ac-ace10629e4bf\") "
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.224277    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvg5x\" (UniqueName: \"kubernetes.io/projected/bfa18b95-4067-42bf-82ac-ace10629e4bf-kube-api-access-vvg5x\") pod \"bfa18b95-4067-42bf-82ac-ace10629e4bf\" (UID: \"bfa18b95-4067-42bf-82ac-ace10629e4bf\") "
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.224323    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfa18b95-4067-42bf-82ac-ace10629e4bf-log-httpd\") pod \"bfa18b95-4067-42bf-82ac-ace10629e4bf\" (UID: \"bfa18b95-4067-42bf-82ac-ace10629e4bf\") "
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.224340    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6bgz\" (UniqueName: \"kubernetes.io/projected/4938ac0e-1226-4f20-8f23-763b62b863c4-kube-api-access-d6bgz\") pod \"4938ac0e-1226-4f20-8f23-763b62b863c4\" (UID: \"4938ac0e-1226-4f20-8f23-763b62b863c4\") "
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.224367    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfa18b95-4067-42bf-82ac-ace10629e4bf-run-httpd\") pod \"bfa18b95-4067-42bf-82ac-ace10629e4bf\" (UID: \"bfa18b95-4067-42bf-82ac-ace10629e4bf\") "
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.226377    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfa18b95-4067-42bf-82ac-ace10629e4bf-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bfa18b95-4067-42bf-82ac-ace10629e4bf" (UID: "bfa18b95-4067-42bf-82ac-ace10629e4bf"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.233654    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfa18b95-4067-42bf-82ac-ace10629e4bf-kube-api-access-vvg5x" (OuterVolumeSpecName: "kube-api-access-vvg5x") pod "bfa18b95-4067-42bf-82ac-ace10629e4bf" (UID: "bfa18b95-4067-42bf-82ac-ace10629e4bf"). InnerVolumeSpecName "kube-api-access-vvg5x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.235677    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfa18b95-4067-42bf-82ac-ace10629e4bf-config-data" (OuterVolumeSpecName: "config-data") pod "bfa18b95-4067-42bf-82ac-ace10629e4bf" (UID: "bfa18b95-4067-42bf-82ac-ace10629e4bf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.237029    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfa18b95-4067-42bf-82ac-ace10629e4bf-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bfa18b95-4067-42bf-82ac-ace10629e4bf" (UID: "bfa18b95-4067-42bf-82ac-ace10629e4bf"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.237686    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfa18b95-4067-42bf-82ac-ace10629e4bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bfa18b95-4067-42bf-82ac-ace10629e4bf" (UID: "bfa18b95-4067-42bf-82ac-ace10629e4bf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.238788    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4938ac0e-1226-4f20-8f23-763b62b863c4-kube-api-access-d6bgz" (OuterVolumeSpecName: "kube-api-access-d6bgz") pod "4938ac0e-1226-4f20-8f23-763b62b863c4" (UID: "4938ac0e-1226-4f20-8f23-763b62b863c4"). InnerVolumeSpecName "kube-api-access-d6bgz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.239572    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfa18b95-4067-42bf-82ac-ace10629e4bf-scripts" (OuterVolumeSpecName: "scripts") pod "bfa18b95-4067-42bf-82ac-ace10629e4bf" (UID: "bfa18b95-4067-42bf-82ac-ace10629e4bf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.244866    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfa18b95-4067-42bf-82ac-ace10629e4bf-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bfa18b95-4067-42bf-82ac-ace10629e4bf" (UID: "bfa18b95-4067-42bf-82ac-ace10629e4bf"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.326552    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfa18b95-4067-42bf-82ac-ace10629e4bf-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.326847    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfa18b95-4067-42bf-82ac-ace10629e4bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.326862    4730 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfa18b95-4067-42bf-82ac-ace10629e4bf-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.326874    4730 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bfa18b95-4067-42bf-82ac-ace10629e4bf-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.326884    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvg5x\" (UniqueName: \"kubernetes.io/projected/bfa18b95-4067-42bf-82ac-ace10629e4bf-kube-api-access-vvg5x\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.326896    4730 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfa18b95-4067-42bf-82ac-ace10629e4bf-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.326906    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6bgz\" (UniqueName: \"kubernetes.io/projected/4938ac0e-1226-4f20-8f23-763b62b863c4-kube-api-access-d6bgz\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.326916    4730 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfa18b95-4067-42bf-82ac-ace10629e4bf-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.989926    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4938ac0e-1226-4f20-8f23-763b62b863c4","Type":"ContainerDied","Data":"7d7a0da71b781876c622a4a6bd339b425ce64e7ebb40d9b345bbc6e39de452eb"}
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.989991    4730 scope.go:117] "RemoveContainer" containerID="49b97730a0d8925f4b80deb7db0aeea30943ec9b5135c0345bbaf48837573e5f"
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.989951    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.992838    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.992836    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a875c7b4-22fb-4b91-803c-09a7a439aea1","Type":"ContainerStarted","Data":"8ad5725e9930431c50bd1862d3f763c6380098c554ce816d54c8f41bdc9481c4"}
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.993893    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0"
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.993926    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a875c7b4-22fb-4b91-803c-09a7a439aea1","Type":"ContainerStarted","Data":"5b5163f015343a503b8bb1265422ad47154d41b97a20358ae2eebcfb9b02968f"}
Mar 20 16:02:54 crc kubenswrapper[4730]: I0320 16:02:54.993943    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a875c7b4-22fb-4b91-803c-09a7a439aea1","Type":"ContainerStarted","Data":"b02119858ae3b92ae86beccde670787e102f4edc1a93a7bf0ae26e53cf88792e"}
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.040465    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.040441582 podStartE2EDuration="2.040441582s" podCreationTimestamp="2026-03-20 16:02:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:02:55.029547509 +0000 UTC m=+1434.242918878" watchObservedRunningTime="2026-03-20 16:02:55.040441582 +0000 UTC m=+1434.253812951"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.047420    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.108683    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.125315    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.146295    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.162303    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.176680    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:02:55 crc kubenswrapper[4730]: E0320 16:02:55.177161    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4938ac0e-1226-4f20-8f23-763b62b863c4" containerName="kube-state-metrics"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.177177    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="4938ac0e-1226-4f20-8f23-763b62b863c4" containerName="kube-state-metrics"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.177382    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="4938ac0e-1226-4f20-8f23-763b62b863c4" containerName="kube-state-metrics"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.179148    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.181437    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-wwqfb"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.183072    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.183531    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.198388    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.226310    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.227641    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.242668    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.242858    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.266422    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-log-httpd\") pod \"ceilometer-0\" (UID: \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\") " pod="openstack/ceilometer-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.266515    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\") " pod="openstack/ceilometer-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.266614    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-scripts\") pod \"ceilometer-0\" (UID: \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\") " pod="openstack/ceilometer-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.266689    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-run-httpd\") pod \"ceilometer-0\" (UID: \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\") " pod="openstack/ceilometer-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.266708    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-config-data\") pod \"ceilometer-0\" (UID: \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\") " pod="openstack/ceilometer-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.266752    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\") " pod="openstack/ceilometer-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.266803    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddg7t\" (UniqueName: \"kubernetes.io/projected/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-kube-api-access-ddg7t\") pod \"ceilometer-0\" (UID: \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\") " pod="openstack/ceilometer-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.271721    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.368818    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\") " pod="openstack/ceilometer-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.368887    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-scripts\") pod \"ceilometer-0\" (UID: \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\") " pod="openstack/ceilometer-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.368927    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-run-httpd\") pod \"ceilometer-0\" (UID: \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\") " pod="openstack/ceilometer-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.368943    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-config-data\") pod \"ceilometer-0\" (UID: \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\") " pod="openstack/ceilometer-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.368965    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2455b53b-7716-45b9-ac24-cd0bd892fbb9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2455b53b-7716-45b9-ac24-cd0bd892fbb9\") " pod="openstack/kube-state-metrics-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.368991    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\") " pod="openstack/ceilometer-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.369050    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddg7t\" (UniqueName: \"kubernetes.io/projected/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-kube-api-access-ddg7t\") pod \"ceilometer-0\" (UID: \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\") " pod="openstack/ceilometer-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.369096    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2455b53b-7716-45b9-ac24-cd0bd892fbb9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2455b53b-7716-45b9-ac24-cd0bd892fbb9\") " pod="openstack/kube-state-metrics-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.369150    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf8lr\" (UniqueName: \"kubernetes.io/projected/2455b53b-7716-45b9-ac24-cd0bd892fbb9-kube-api-access-tf8lr\") pod \"kube-state-metrics-0\" (UID: \"2455b53b-7716-45b9-ac24-cd0bd892fbb9\") " pod="openstack/kube-state-metrics-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.369195    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2455b53b-7716-45b9-ac24-cd0bd892fbb9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2455b53b-7716-45b9-ac24-cd0bd892fbb9\") " pod="openstack/kube-state-metrics-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.369239    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-log-httpd\") pod \"ceilometer-0\" (UID: \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\") " pod="openstack/ceilometer-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.369768    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-log-httpd\") pod \"ceilometer-0\" (UID: \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\") " pod="openstack/ceilometer-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.370206    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-run-httpd\") pod \"ceilometer-0\" (UID: \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\") " pod="openstack/ceilometer-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.374922    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\") " pod="openstack/ceilometer-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.375064    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\") " pod="openstack/ceilometer-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.379802    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-scripts\") pod \"ceilometer-0\" (UID: \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\") " pod="openstack/ceilometer-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.380152    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-config-data\") pod \"ceilometer-0\" (UID: \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\") " pod="openstack/ceilometer-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.388186    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddg7t\" (UniqueName: \"kubernetes.io/projected/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-kube-api-access-ddg7t\") pod \"ceilometer-0\" (UID: \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\") " pod="openstack/ceilometer-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.470690    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2455b53b-7716-45b9-ac24-cd0bd892fbb9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2455b53b-7716-45b9-ac24-cd0bd892fbb9\") " pod="openstack/kube-state-metrics-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.470793    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2455b53b-7716-45b9-ac24-cd0bd892fbb9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2455b53b-7716-45b9-ac24-cd0bd892fbb9\") " pod="openstack/kube-state-metrics-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.470839    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf8lr\" (UniqueName: \"kubernetes.io/projected/2455b53b-7716-45b9-ac24-cd0bd892fbb9-kube-api-access-tf8lr\") pod \"kube-state-metrics-0\" (UID: \"2455b53b-7716-45b9-ac24-cd0bd892fbb9\") " pod="openstack/kube-state-metrics-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.470879    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2455b53b-7716-45b9-ac24-cd0bd892fbb9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2455b53b-7716-45b9-ac24-cd0bd892fbb9\") " pod="openstack/kube-state-metrics-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.474714    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2455b53b-7716-45b9-ac24-cd0bd892fbb9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2455b53b-7716-45b9-ac24-cd0bd892fbb9\") " pod="openstack/kube-state-metrics-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.475212    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2455b53b-7716-45b9-ac24-cd0bd892fbb9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2455b53b-7716-45b9-ac24-cd0bd892fbb9\") " pod="openstack/kube-state-metrics-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.477076    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2455b53b-7716-45b9-ac24-cd0bd892fbb9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2455b53b-7716-45b9-ac24-cd0bd892fbb9\") " pod="openstack/kube-state-metrics-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.488572    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf8lr\" (UniqueName: \"kubernetes.io/projected/2455b53b-7716-45b9-ac24-cd0bd892fbb9-kube-api-access-tf8lr\") pod \"kube-state-metrics-0\" (UID: \"2455b53b-7716-45b9-ac24-cd0bd892fbb9\") " pod="openstack/kube-state-metrics-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.523084    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.545384    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4938ac0e-1226-4f20-8f23-763b62b863c4" path="/var/lib/kubelet/pods/4938ac0e-1226-4f20-8f23-763b62b863c4/volumes"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.546065    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfa18b95-4067-42bf-82ac-ace10629e4bf" path="/var/lib/kubelet/pods/bfa18b95-4067-42bf-82ac-ace10629e4bf/volumes"
Mar 20 16:02:55 crc kubenswrapper[4730]: I0320 16:02:55.587175    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 20 16:02:56 crc kubenswrapper[4730]: I0320 16:02:56.028725    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:02:56 crc kubenswrapper[4730]: I0320 16:02:56.069628    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:02:56 crc kubenswrapper[4730]: I0320 16:02:56.136542    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:56 crc kubenswrapper[4730]: W0320 16:02:56.147464    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2455b53b_7716_45b9_ac24_cd0bd892fbb9.slice/crio-1e53b5bd2b518ed47526b3f05e87f049e3b47ee7d766a73b95963da9b3a0646a WatchSource:0}: Error finding container 1e53b5bd2b518ed47526b3f05e87f049e3b47ee7d766a73b95963da9b3a0646a: Status 404 returned error can't find the container with id 1e53b5bd2b518ed47526b3f05e87f049e3b47ee7d766a73b95963da9b3a0646a
Mar 20 16:02:56 crc kubenswrapper[4730]: I0320 16:02:56.151601    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 20 16:02:56 crc kubenswrapper[4730]: I0320 16:02:56.157544    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:57 crc kubenswrapper[4730]: I0320 16:02:57.043191    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2455b53b-7716-45b9-ac24-cd0bd892fbb9","Type":"ContainerStarted","Data":"1e53b5bd2b518ed47526b3f05e87f049e3b47ee7d766a73b95963da9b3a0646a"}
Mar 20 16:02:57 crc kubenswrapper[4730]: I0320 16:02:57.048662    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8","Type":"ContainerStarted","Data":"60a8e4234dec296e07fa2e6b25c8cd3d8400e9e7059761ba834d2524bf4e7bc9"}
Mar 20 16:02:57 crc kubenswrapper[4730]: I0320 16:02:57.048693    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8","Type":"ContainerStarted","Data":"7a8aa45fa61f1cac40b9b89dc3a2959bc5ab43046293be849c2cd09eec70c60a"}
Mar 20 16:02:57 crc kubenswrapper[4730]: I0320 16:02:57.048703    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8","Type":"ContainerStarted","Data":"2d283fdd4c1c73728f70c718eab039966732590eed0ec087942b19f09e1c496d"}
Mar 20 16:02:57 crc kubenswrapper[4730]: I0320 16:02:57.064454    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 16:02:57 crc kubenswrapper[4730]: I0320 16:02:57.269106    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-5j2w4"]
Mar 20 16:02:57 crc kubenswrapper[4730]: I0320 16:02:57.270424    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5j2w4"
Mar 20 16:02:57 crc kubenswrapper[4730]: I0320 16:02:57.272761    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Mar 20 16:02:57 crc kubenswrapper[4730]: I0320 16:02:57.273067    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Mar 20 16:02:57 crc kubenswrapper[4730]: I0320 16:02:57.299901    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f99bcbd6f-lkgpt"
Mar 20 16:02:57 crc kubenswrapper[4730]: I0320 16:02:57.338342    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-5j2w4"]
Mar 20 16:02:57 crc kubenswrapper[4730]: I0320 16:02:57.414494    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-654455944c-qph9q"]
Mar 20 16:02:57 crc kubenswrapper[4730]: I0320 16:02:57.414720    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-654455944c-qph9q" podUID="84937d37-8276-4014-b1ae-bb84547384af" containerName="dnsmasq-dns" containerID="cri-o://61e03f987a4e7a22fe8153ac2d6c60c19cb72977de582edcb281b2f18aeef951" gracePeriod=10
Mar 20 16:02:57 crc kubenswrapper[4730]: I0320 16:02:57.433930    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6941d556-3020-4344-b185-5d79cf68187c-scripts\") pod \"nova-cell1-cell-mapping-5j2w4\" (UID: \"6941d556-3020-4344-b185-5d79cf68187c\") " pod="openstack/nova-cell1-cell-mapping-5j2w4"
Mar 20 16:02:57 crc kubenswrapper[4730]: I0320 16:02:57.434131    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6941d556-3020-4344-b185-5d79cf68187c-config-data\") pod \"nova-cell1-cell-mapping-5j2w4\" (UID: \"6941d556-3020-4344-b185-5d79cf68187c\") " pod="openstack/nova-cell1-cell-mapping-5j2w4"
Mar 20 16:02:57 crc kubenswrapper[4730]: I0320 16:02:57.434167    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6rml\" (UniqueName: \"kubernetes.io/projected/6941d556-3020-4344-b185-5d79cf68187c-kube-api-access-p6rml\") pod \"nova-cell1-cell-mapping-5j2w4\" (UID: \"6941d556-3020-4344-b185-5d79cf68187c\") " pod="openstack/nova-cell1-cell-mapping-5j2w4"
Mar 20 16:02:57 crc kubenswrapper[4730]: I0320 16:02:57.434240    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6941d556-3020-4344-b185-5d79cf68187c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5j2w4\" (UID: \"6941d556-3020-4344-b185-5d79cf68187c\") " pod="openstack/nova-cell1-cell-mapping-5j2w4"
Mar 20 16:02:57 crc kubenswrapper[4730]: I0320 16:02:57.535591    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6941d556-3020-4344-b185-5d79cf68187c-config-data\") pod \"nova-cell1-cell-mapping-5j2w4\" (UID: \"6941d556-3020-4344-b185-5d79cf68187c\") " pod="openstack/nova-cell1-cell-mapping-5j2w4"
Mar 20 16:02:57 crc kubenswrapper[4730]: I0320 16:02:57.535989    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6rml\" (UniqueName: \"kubernetes.io/projected/6941d556-3020-4344-b185-5d79cf68187c-kube-api-access-p6rml\") pod \"nova-cell1-cell-mapping-5j2w4\" (UID: \"6941d556-3020-4344-b185-5d79cf68187c\") " pod="openstack/nova-cell1-cell-mapping-5j2w4"
Mar 20 16:02:57 crc kubenswrapper[4730]: I0320 16:02:57.536078    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6941d556-3020-4344-b185-5d79cf68187c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5j2w4\" (UID: \"6941d556-3020-4344-b185-5d79cf68187c\") " pod="openstack/nova-cell1-cell-mapping-5j2w4"
Mar 20 16:02:57 crc kubenswrapper[4730]: I0320 16:02:57.536173    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6941d556-3020-4344-b185-5d79cf68187c-scripts\") pod \"nova-cell1-cell-mapping-5j2w4\" (UID: \"6941d556-3020-4344-b185-5d79cf68187c\") " pod="openstack/nova-cell1-cell-mapping-5j2w4"
Mar 20 16:02:57 crc kubenswrapper[4730]: I0320 16:02:57.541375    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6941d556-3020-4344-b185-5d79cf68187c-config-data\") pod \"nova-cell1-cell-mapping-5j2w4\" (UID: \"6941d556-3020-4344-b185-5d79cf68187c\") " pod="openstack/nova-cell1-cell-mapping-5j2w4"
Mar 20 16:02:57 crc kubenswrapper[4730]: I0320 16:02:57.541813    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6941d556-3020-4344-b185-5d79cf68187c-scripts\") pod \"nova-cell1-cell-mapping-5j2w4\" (UID: \"6941d556-3020-4344-b185-5d79cf68187c\") " pod="openstack/nova-cell1-cell-mapping-5j2w4"
Mar 20 16:02:57 crc kubenswrapper[4730]: I0320 16:02:57.553166    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6941d556-3020-4344-b185-5d79cf68187c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5j2w4\" (UID: \"6941d556-3020-4344-b185-5d79cf68187c\") " pod="openstack/nova-cell1-cell-mapping-5j2w4"
Mar 20 16:02:57 crc kubenswrapper[4730]: I0320 16:02:57.564974    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6rml\" (UniqueName: \"kubernetes.io/projected/6941d556-3020-4344-b185-5d79cf68187c-kube-api-access-p6rml\") pod \"nova-cell1-cell-mapping-5j2w4\" (UID: \"6941d556-3020-4344-b185-5d79cf68187c\") " pod="openstack/nova-cell1-cell-mapping-5j2w4"
Mar 20 16:02:57 crc kubenswrapper[4730]: I0320 16:02:57.666414    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5j2w4"
Mar 20 16:02:57 crc kubenswrapper[4730]: I0320 16:02:57.959577    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-654455944c-qph9q"
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.051023    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84937d37-8276-4014-b1ae-bb84547384af-ovsdbserver-nb\") pod \"84937d37-8276-4014-b1ae-bb84547384af\" (UID: \"84937d37-8276-4014-b1ae-bb84547384af\") "
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.051073    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84937d37-8276-4014-b1ae-bb84547384af-dns-svc\") pod \"84937d37-8276-4014-b1ae-bb84547384af\" (UID: \"84937d37-8276-4014-b1ae-bb84547384af\") "
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.051111    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/84937d37-8276-4014-b1ae-bb84547384af-dns-swift-storage-0\") pod \"84937d37-8276-4014-b1ae-bb84547384af\" (UID: \"84937d37-8276-4014-b1ae-bb84547384af\") "
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.051358    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84937d37-8276-4014-b1ae-bb84547384af-config\") pod \"84937d37-8276-4014-b1ae-bb84547384af\" (UID: \"84937d37-8276-4014-b1ae-bb84547384af\") "
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.051395    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84937d37-8276-4014-b1ae-bb84547384af-ovsdbserver-sb\") pod \"84937d37-8276-4014-b1ae-bb84547384af\" (UID: \"84937d37-8276-4014-b1ae-bb84547384af\") "
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.051475    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwnnn\" (UniqueName: \"kubernetes.io/projected/84937d37-8276-4014-b1ae-bb84547384af-kube-api-access-fwnnn\") pod \"84937d37-8276-4014-b1ae-bb84547384af\" (UID: \"84937d37-8276-4014-b1ae-bb84547384af\") "
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.072880    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84937d37-8276-4014-b1ae-bb84547384af-kube-api-access-fwnnn" (OuterVolumeSpecName: "kube-api-access-fwnnn") pod "84937d37-8276-4014-b1ae-bb84547384af" (UID: "84937d37-8276-4014-b1ae-bb84547384af"). InnerVolumeSpecName "kube-api-access-fwnnn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.082522    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8","Type":"ContainerStarted","Data":"51c72e38c5d67c6135d394c4c7da074fc0fe8aedaa4b2e3b893209505e547395"}
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.086868    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2455b53b-7716-45b9-ac24-cd0bd892fbb9","Type":"ContainerStarted","Data":"51e9bde4b97cf7cfa7abc37598cc92c008ea5779caaf5ece80947fdb05a2e0e9"}
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.090272    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.090428    4730 generic.go:334] "Generic (PLEG): container finished" podID="84937d37-8276-4014-b1ae-bb84547384af" containerID="61e03f987a4e7a22fe8153ac2d6c60c19cb72977de582edcb281b2f18aeef951" exitCode=0
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.091041    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-654455944c-qph9q" event={"ID":"84937d37-8276-4014-b1ae-bb84547384af","Type":"ContainerDied","Data":"61e03f987a4e7a22fe8153ac2d6c60c19cb72977de582edcb281b2f18aeef951"}
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.091076    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-654455944c-qph9q" event={"ID":"84937d37-8276-4014-b1ae-bb84547384af","Type":"ContainerDied","Data":"0a2eb8594326a606d45ae0159cba2be47a452aea3e7bbc81624368815408532b"}
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.091098    4730 scope.go:117] "RemoveContainer" containerID="61e03f987a4e7a22fe8153ac2d6c60c19cb72977de582edcb281b2f18aeef951"
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.090859    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-654455944c-qph9q"
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.112094    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.559597944 podStartE2EDuration="3.112067705s" podCreationTimestamp="2026-03-20 16:02:55 +0000 UTC" firstStartedPulling="2026-03-20 16:02:56.150370986 +0000 UTC m=+1435.363742355" lastFinishedPulling="2026-03-20 16:02:56.702840747 +0000 UTC m=+1435.916212116" observedRunningTime="2026-03-20 16:02:58.101841891 +0000 UTC m=+1437.315213260" watchObservedRunningTime="2026-03-20 16:02:58.112067705 +0000 UTC m=+1437.325439074"
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.133445    4730 scope.go:117] "RemoveContainer" containerID="a42a5cae6611c2fe3d170915f157931f092b64200d58d7b15c079992b8c3bd9a"
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.133955    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84937d37-8276-4014-b1ae-bb84547384af-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "84937d37-8276-4014-b1ae-bb84547384af" (UID: "84937d37-8276-4014-b1ae-bb84547384af"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.136150    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84937d37-8276-4014-b1ae-bb84547384af-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "84937d37-8276-4014-b1ae-bb84547384af" (UID: "84937d37-8276-4014-b1ae-bb84547384af"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.149131    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84937d37-8276-4014-b1ae-bb84547384af-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "84937d37-8276-4014-b1ae-bb84547384af" (UID: "84937d37-8276-4014-b1ae-bb84547384af"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.155360    4730 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84937d37-8276-4014-b1ae-bb84547384af-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.155394    4730 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84937d37-8276-4014-b1ae-bb84547384af-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.155405    4730 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/84937d37-8276-4014-b1ae-bb84547384af-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.155416    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwnnn\" (UniqueName: \"kubernetes.io/projected/84937d37-8276-4014-b1ae-bb84547384af-kube-api-access-fwnnn\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.186268    4730 scope.go:117] "RemoveContainer" containerID="61e03f987a4e7a22fe8153ac2d6c60c19cb72977de582edcb281b2f18aeef951"
Mar 20 16:02:58 crc kubenswrapper[4730]: E0320 16:02:58.186991    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61e03f987a4e7a22fe8153ac2d6c60c19cb72977de582edcb281b2f18aeef951\": container with ID starting with 61e03f987a4e7a22fe8153ac2d6c60c19cb72977de582edcb281b2f18aeef951 not found: ID does not exist" containerID="61e03f987a4e7a22fe8153ac2d6c60c19cb72977de582edcb281b2f18aeef951"
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.187042    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61e03f987a4e7a22fe8153ac2d6c60c19cb72977de582edcb281b2f18aeef951"} err="failed to get container status \"61e03f987a4e7a22fe8153ac2d6c60c19cb72977de582edcb281b2f18aeef951\": rpc error: code = NotFound desc = could not find container \"61e03f987a4e7a22fe8153ac2d6c60c19cb72977de582edcb281b2f18aeef951\": container with ID starting with 61e03f987a4e7a22fe8153ac2d6c60c19cb72977de582edcb281b2f18aeef951 not found: ID does not exist"
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.187077    4730 scope.go:117] "RemoveContainer" containerID="a42a5cae6611c2fe3d170915f157931f092b64200d58d7b15c079992b8c3bd9a"
Mar 20 16:02:58 crc kubenswrapper[4730]: E0320 16:02:58.187452    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a42a5cae6611c2fe3d170915f157931f092b64200d58d7b15c079992b8c3bd9a\": container with ID starting with a42a5cae6611c2fe3d170915f157931f092b64200d58d7b15c079992b8c3bd9a not found: ID does not exist" containerID="a42a5cae6611c2fe3d170915f157931f092b64200d58d7b15c079992b8c3bd9a"
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.187487    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a42a5cae6611c2fe3d170915f157931f092b64200d58d7b15c079992b8c3bd9a"} err="failed to get container status \"a42a5cae6611c2fe3d170915f157931f092b64200d58d7b15c079992b8c3bd9a\": rpc error: code = NotFound desc = could not find container \"a42a5cae6611c2fe3d170915f157931f092b64200d58d7b15c079992b8c3bd9a\": container with ID starting with a42a5cae6611c2fe3d170915f157931f092b64200d58d7b15c079992b8c3bd9a not found: ID does not exist"
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.187743    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84937d37-8276-4014-b1ae-bb84547384af-config" (OuterVolumeSpecName: "config") pod "84937d37-8276-4014-b1ae-bb84547384af" (UID: "84937d37-8276-4014-b1ae-bb84547384af"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.188260    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84937d37-8276-4014-b1ae-bb84547384af-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "84937d37-8276-4014-b1ae-bb84547384af" (UID: "84937d37-8276-4014-b1ae-bb84547384af"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.224992    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-5j2w4"]
Mar 20 16:02:58 crc kubenswrapper[4730]: W0320 16:02:58.226815    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6941d556_3020_4344_b185_5d79cf68187c.slice/crio-dfc712c046f53bbab42da7c72012f52d748e148acd1b54785e8a3397492fd3b6 WatchSource:0}: Error finding container dfc712c046f53bbab42da7c72012f52d748e148acd1b54785e8a3397492fd3b6: Status 404 returned error can't find the container with id dfc712c046f53bbab42da7c72012f52d748e148acd1b54785e8a3397492fd3b6
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.262061    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84937d37-8276-4014-b1ae-bb84547384af-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.262242    4730 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84937d37-8276-4014-b1ae-bb84547384af-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.433308    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-654455944c-qph9q"]
Mar 20 16:02:58 crc kubenswrapper[4730]: I0320 16:02:58.443739    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-654455944c-qph9q"]
Mar 20 16:02:59 crc kubenswrapper[4730]: I0320 16:02:59.108860    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5j2w4" event={"ID":"6941d556-3020-4344-b185-5d79cf68187c","Type":"ContainerStarted","Data":"c002961b60e7f958b6eac722566b65c8c9c5ccb02bb7acae14d9879bae50e4f2"}
Mar 20 16:02:59 crc kubenswrapper[4730]: I0320 16:02:59.109219    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5j2w4" event={"ID":"6941d556-3020-4344-b185-5d79cf68187c","Type":"ContainerStarted","Data":"dfc712c046f53bbab42da7c72012f52d748e148acd1b54785e8a3397492fd3b6"}
Mar 20 16:02:59 crc kubenswrapper[4730]: I0320 16:02:59.130053    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-5j2w4" podStartSLOduration=2.130031286 podStartE2EDuration="2.130031286s" podCreationTimestamp="2026-03-20 16:02:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:02:59.12944848 +0000 UTC m=+1438.342819859" watchObservedRunningTime="2026-03-20 16:02:59.130031286 +0000 UTC m=+1438.343402655"
Mar 20 16:02:59 crc kubenswrapper[4730]: I0320 16:02:59.548458    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84937d37-8276-4014-b1ae-bb84547384af" path="/var/lib/kubelet/pods/84937d37-8276-4014-b1ae-bb84547384af/volumes"
Mar 20 16:03:00 crc kubenswrapper[4730]: I0320 16:03:00.120650    4730 generic.go:334] "Generic (PLEG): container finished" podID="15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8" containerID="a2770542ad305768cf6cad907c5fcf1dddba97a517e618481d482e8c5742f864" exitCode=1
Mar 20 16:03:00 crc kubenswrapper[4730]: I0320 16:03:00.121049    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8" containerName="ceilometer-notification-agent" containerID="cri-o://60a8e4234dec296e07fa2e6b25c8cd3d8400e9e7059761ba834d2524bf4e7bc9" gracePeriod=30
Mar 20 16:03:00 crc kubenswrapper[4730]: I0320 16:03:00.120691    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8","Type":"ContainerDied","Data":"a2770542ad305768cf6cad907c5fcf1dddba97a517e618481d482e8c5742f864"}
Mar 20 16:03:00 crc kubenswrapper[4730]: I0320 16:03:00.120997    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8" containerName="sg-core" containerID="cri-o://51c72e38c5d67c6135d394c4c7da074fc0fe8aedaa4b2e3b893209505e547395" gracePeriod=30
Mar 20 16:03:00 crc kubenswrapper[4730]: I0320 16:03:00.120837    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8" containerName="ceilometer-central-agent" containerID="cri-o://7a8aa45fa61f1cac40b9b89dc3a2959bc5ab43046293be849c2cd09eec70c60a" gracePeriod=30
Mar 20 16:03:01 crc kubenswrapper[4730]: I0320 16:03:01.135218    4730 generic.go:334] "Generic (PLEG): container finished" podID="15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8" containerID="51c72e38c5d67c6135d394c4c7da074fc0fe8aedaa4b2e3b893209505e547395" exitCode=2
Mar 20 16:03:01 crc kubenswrapper[4730]: I0320 16:03:01.135330    4730 generic.go:334] "Generic (PLEG): container finished" podID="15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8" containerID="60a8e4234dec296e07fa2e6b25c8cd3d8400e9e7059761ba834d2524bf4e7bc9" exitCode=0
Mar 20 16:03:01 crc kubenswrapper[4730]: I0320 16:03:01.135295    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8","Type":"ContainerDied","Data":"51c72e38c5d67c6135d394c4c7da074fc0fe8aedaa4b2e3b893209505e547395"}
Mar 20 16:03:01 crc kubenswrapper[4730]: I0320 16:03:01.135364    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8","Type":"ContainerDied","Data":"60a8e4234dec296e07fa2e6b25c8cd3d8400e9e7059761ba834d2524bf4e7bc9"}
Mar 20 16:03:02 crc kubenswrapper[4730]: I0320 16:03:02.938291    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.066588    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-config-data\") pod \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\" (UID: \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\") "
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.066641    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-scripts\") pod \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\" (UID: \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\") "
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.066836    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-log-httpd\") pod \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\" (UID: \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\") "
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.066873    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddg7t\" (UniqueName: \"kubernetes.io/projected/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-kube-api-access-ddg7t\") pod \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\" (UID: \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\") "
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.066905    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-run-httpd\") pod \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\" (UID: \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\") "
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.066947    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-sg-core-conf-yaml\") pod \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\" (UID: \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\") "
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.067016    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-combined-ca-bundle\") pod \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\" (UID: \"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8\") "
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.072015    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8" (UID: "15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.072132    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-scripts" (OuterVolumeSpecName: "scripts") pod "15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8" (UID: "15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.072880    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8" (UID: "15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.083445    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-kube-api-access-ddg7t" (OuterVolumeSpecName: "kube-api-access-ddg7t") pod "15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8" (UID: "15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8"). InnerVolumeSpecName "kube-api-access-ddg7t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.114565    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8" (UID: "15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.145096    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8" (UID: "15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.159352    4730 generic.go:334] "Generic (PLEG): container finished" podID="15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8" containerID="7a8aa45fa61f1cac40b9b89dc3a2959bc5ab43046293be849c2cd09eec70c60a" exitCode=0
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.159403    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8","Type":"ContainerDied","Data":"7a8aa45fa61f1cac40b9b89dc3a2959bc5ab43046293be849c2cd09eec70c60a"}
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.159436    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8","Type":"ContainerDied","Data":"2d283fdd4c1c73728f70c718eab039966732590eed0ec087942b19f09e1c496d"}
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.159456    4730 scope.go:117] "RemoveContainer" containerID="a2770542ad305768cf6cad907c5fcf1dddba97a517e618481d482e8c5742f864"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.159653    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.170119    4730 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.170156    4730 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.170173    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddg7t\" (UniqueName: \"kubernetes.io/projected/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-kube-api-access-ddg7t\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.170186    4730 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.170198    4730 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.170209    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.182583    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-config-data" (OuterVolumeSpecName: "config-data") pod "15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8" (UID: "15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.187643    4730 scope.go:117] "RemoveContainer" containerID="51c72e38c5d67c6135d394c4c7da074fc0fe8aedaa4b2e3b893209505e547395"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.206678    4730 scope.go:117] "RemoveContainer" containerID="60a8e4234dec296e07fa2e6b25c8cd3d8400e9e7059761ba834d2524bf4e7bc9"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.227702    4730 scope.go:117] "RemoveContainer" containerID="7a8aa45fa61f1cac40b9b89dc3a2959bc5ab43046293be849c2cd09eec70c60a"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.246782    4730 scope.go:117] "RemoveContainer" containerID="a2770542ad305768cf6cad907c5fcf1dddba97a517e618481d482e8c5742f864"
Mar 20 16:03:03 crc kubenswrapper[4730]: E0320 16:03:03.247144    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2770542ad305768cf6cad907c5fcf1dddba97a517e618481d482e8c5742f864\": container with ID starting with a2770542ad305768cf6cad907c5fcf1dddba97a517e618481d482e8c5742f864 not found: ID does not exist" containerID="a2770542ad305768cf6cad907c5fcf1dddba97a517e618481d482e8c5742f864"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.247181    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2770542ad305768cf6cad907c5fcf1dddba97a517e618481d482e8c5742f864"} err="failed to get container status \"a2770542ad305768cf6cad907c5fcf1dddba97a517e618481d482e8c5742f864\": rpc error: code = NotFound desc = could not find container \"a2770542ad305768cf6cad907c5fcf1dddba97a517e618481d482e8c5742f864\": container with ID starting with a2770542ad305768cf6cad907c5fcf1dddba97a517e618481d482e8c5742f864 not found: ID does not exist"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.247203    4730 scope.go:117] "RemoveContainer" containerID="51c72e38c5d67c6135d394c4c7da074fc0fe8aedaa4b2e3b893209505e547395"
Mar 20 16:03:03 crc kubenswrapper[4730]: E0320 16:03:03.247519    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51c72e38c5d67c6135d394c4c7da074fc0fe8aedaa4b2e3b893209505e547395\": container with ID starting with 51c72e38c5d67c6135d394c4c7da074fc0fe8aedaa4b2e3b893209505e547395 not found: ID does not exist" containerID="51c72e38c5d67c6135d394c4c7da074fc0fe8aedaa4b2e3b893209505e547395"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.247568    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51c72e38c5d67c6135d394c4c7da074fc0fe8aedaa4b2e3b893209505e547395"} err="failed to get container status \"51c72e38c5d67c6135d394c4c7da074fc0fe8aedaa4b2e3b893209505e547395\": rpc error: code = NotFound desc = could not find container \"51c72e38c5d67c6135d394c4c7da074fc0fe8aedaa4b2e3b893209505e547395\": container with ID starting with 51c72e38c5d67c6135d394c4c7da074fc0fe8aedaa4b2e3b893209505e547395 not found: ID does not exist"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.247600    4730 scope.go:117] "RemoveContainer" containerID="60a8e4234dec296e07fa2e6b25c8cd3d8400e9e7059761ba834d2524bf4e7bc9"
Mar 20 16:03:03 crc kubenswrapper[4730]: E0320 16:03:03.247858    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60a8e4234dec296e07fa2e6b25c8cd3d8400e9e7059761ba834d2524bf4e7bc9\": container with ID starting with 60a8e4234dec296e07fa2e6b25c8cd3d8400e9e7059761ba834d2524bf4e7bc9 not found: ID does not exist" containerID="60a8e4234dec296e07fa2e6b25c8cd3d8400e9e7059761ba834d2524bf4e7bc9"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.247886    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60a8e4234dec296e07fa2e6b25c8cd3d8400e9e7059761ba834d2524bf4e7bc9"} err="failed to get container status \"60a8e4234dec296e07fa2e6b25c8cd3d8400e9e7059761ba834d2524bf4e7bc9\": rpc error: code = NotFound desc = could not find container \"60a8e4234dec296e07fa2e6b25c8cd3d8400e9e7059761ba834d2524bf4e7bc9\": container with ID starting with 60a8e4234dec296e07fa2e6b25c8cd3d8400e9e7059761ba834d2524bf4e7bc9 not found: ID does not exist"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.247902    4730 scope.go:117] "RemoveContainer" containerID="7a8aa45fa61f1cac40b9b89dc3a2959bc5ab43046293be849c2cd09eec70c60a"
Mar 20 16:03:03 crc kubenswrapper[4730]: E0320 16:03:03.248070    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a8aa45fa61f1cac40b9b89dc3a2959bc5ab43046293be849c2cd09eec70c60a\": container with ID starting with 7a8aa45fa61f1cac40b9b89dc3a2959bc5ab43046293be849c2cd09eec70c60a not found: ID does not exist" containerID="7a8aa45fa61f1cac40b9b89dc3a2959bc5ab43046293be849c2cd09eec70c60a"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.248112    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a8aa45fa61f1cac40b9b89dc3a2959bc5ab43046293be849c2cd09eec70c60a"} err="failed to get container status \"7a8aa45fa61f1cac40b9b89dc3a2959bc5ab43046293be849c2cd09eec70c60a\": rpc error: code = NotFound desc = could not find container \"7a8aa45fa61f1cac40b9b89dc3a2959bc5ab43046293be849c2cd09eec70c60a\": container with ID starting with 7a8aa45fa61f1cac40b9b89dc3a2959bc5ab43046293be849c2cd09eec70c60a not found: ID does not exist"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.271748    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.433800    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.433840    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.512873    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.526458    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.573480    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8" path="/var/lib/kubelet/pods/15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8/volumes"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.574904    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:03:03 crc kubenswrapper[4730]: E0320 16:03:03.575653    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84937d37-8276-4014-b1ae-bb84547384af" containerName="init"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.575676    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="84937d37-8276-4014-b1ae-bb84547384af" containerName="init"
Mar 20 16:03:03 crc kubenswrapper[4730]: E0320 16:03:03.575689    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84937d37-8276-4014-b1ae-bb84547384af" containerName="dnsmasq-dns"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.575697    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="84937d37-8276-4014-b1ae-bb84547384af" containerName="dnsmasq-dns"
Mar 20 16:03:03 crc kubenswrapper[4730]: E0320 16:03:03.575713    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8" containerName="sg-core"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.575721    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8" containerName="sg-core"
Mar 20 16:03:03 crc kubenswrapper[4730]: E0320 16:03:03.575734    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8" containerName="proxy-httpd"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.575742    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8" containerName="proxy-httpd"
Mar 20 16:03:03 crc kubenswrapper[4730]: E0320 16:03:03.575775    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8" containerName="ceilometer-notification-agent"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.575783    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8" containerName="ceilometer-notification-agent"
Mar 20 16:03:03 crc kubenswrapper[4730]: E0320 16:03:03.575802    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8" containerName="ceilometer-central-agent"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.575810    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8" containerName="ceilometer-central-agent"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.576032    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8" containerName="ceilometer-notification-agent"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.576047    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8" containerName="ceilometer-central-agent"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.576099    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8" containerName="proxy-httpd"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.576124    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="84937d37-8276-4014-b1ae-bb84547384af" containerName="dnsmasq-dns"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.576143    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="15b8ac2f-7010-4b62-a0ed-a5fadb57b0a8" containerName="sg-core"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.578548    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.578656    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.581848    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.581877    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.581848    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.679350    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2333c0f3-d6ce-405f-b8c8-755be42ba74b-run-httpd\") pod \"ceilometer-0\" (UID: \"2333c0f3-d6ce-405f-b8c8-755be42ba74b\") " pod="openstack/ceilometer-0"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.679493    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95n5w\" (UniqueName: \"kubernetes.io/projected/2333c0f3-d6ce-405f-b8c8-755be42ba74b-kube-api-access-95n5w\") pod \"ceilometer-0\" (UID: \"2333c0f3-d6ce-405f-b8c8-755be42ba74b\") " pod="openstack/ceilometer-0"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.679558    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2333c0f3-d6ce-405f-b8c8-755be42ba74b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2333c0f3-d6ce-405f-b8c8-755be42ba74b\") " pod="openstack/ceilometer-0"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.679645    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2333c0f3-d6ce-405f-b8c8-755be42ba74b-log-httpd\") pod \"ceilometer-0\" (UID: \"2333c0f3-d6ce-405f-b8c8-755be42ba74b\") " pod="openstack/ceilometer-0"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.679744    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2333c0f3-d6ce-405f-b8c8-755be42ba74b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2333c0f3-d6ce-405f-b8c8-755be42ba74b\") " pod="openstack/ceilometer-0"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.679768    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2333c0f3-d6ce-405f-b8c8-755be42ba74b-config-data\") pod \"ceilometer-0\" (UID: \"2333c0f3-d6ce-405f-b8c8-755be42ba74b\") " pod="openstack/ceilometer-0"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.679807    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2333c0f3-d6ce-405f-b8c8-755be42ba74b-scripts\") pod \"ceilometer-0\" (UID: \"2333c0f3-d6ce-405f-b8c8-755be42ba74b\") " pod="openstack/ceilometer-0"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.679851    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2333c0f3-d6ce-405f-b8c8-755be42ba74b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2333c0f3-d6ce-405f-b8c8-755be42ba74b\") " pod="openstack/ceilometer-0"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.781864    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95n5w\" (UniqueName: \"kubernetes.io/projected/2333c0f3-d6ce-405f-b8c8-755be42ba74b-kube-api-access-95n5w\") pod \"ceilometer-0\" (UID: \"2333c0f3-d6ce-405f-b8c8-755be42ba74b\") " pod="openstack/ceilometer-0"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.782133    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2333c0f3-d6ce-405f-b8c8-755be42ba74b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2333c0f3-d6ce-405f-b8c8-755be42ba74b\") " pod="openstack/ceilometer-0"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.782187    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2333c0f3-d6ce-405f-b8c8-755be42ba74b-log-httpd\") pod \"ceilometer-0\" (UID: \"2333c0f3-d6ce-405f-b8c8-755be42ba74b\") " pod="openstack/ceilometer-0"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.782229    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2333c0f3-d6ce-405f-b8c8-755be42ba74b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2333c0f3-d6ce-405f-b8c8-755be42ba74b\") " pod="openstack/ceilometer-0"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.782261    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2333c0f3-d6ce-405f-b8c8-755be42ba74b-config-data\") pod \"ceilometer-0\" (UID: \"2333c0f3-d6ce-405f-b8c8-755be42ba74b\") " pod="openstack/ceilometer-0"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.782292    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2333c0f3-d6ce-405f-b8c8-755be42ba74b-scripts\") pod \"ceilometer-0\" (UID: \"2333c0f3-d6ce-405f-b8c8-755be42ba74b\") " pod="openstack/ceilometer-0"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.782323    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2333c0f3-d6ce-405f-b8c8-755be42ba74b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2333c0f3-d6ce-405f-b8c8-755be42ba74b\") " pod="openstack/ceilometer-0"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.782347    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2333c0f3-d6ce-405f-b8c8-755be42ba74b-run-httpd\") pod \"ceilometer-0\" (UID: \"2333c0f3-d6ce-405f-b8c8-755be42ba74b\") " pod="openstack/ceilometer-0"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.782698    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2333c0f3-d6ce-405f-b8c8-755be42ba74b-run-httpd\") pod \"ceilometer-0\" (UID: \"2333c0f3-d6ce-405f-b8c8-755be42ba74b\") " pod="openstack/ceilometer-0"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.782922    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2333c0f3-d6ce-405f-b8c8-755be42ba74b-log-httpd\") pod \"ceilometer-0\" (UID: \"2333c0f3-d6ce-405f-b8c8-755be42ba74b\") " pod="openstack/ceilometer-0"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.787406    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2333c0f3-d6ce-405f-b8c8-755be42ba74b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2333c0f3-d6ce-405f-b8c8-755be42ba74b\") " pod="openstack/ceilometer-0"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.788043    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2333c0f3-d6ce-405f-b8c8-755be42ba74b-scripts\") pod \"ceilometer-0\" (UID: \"2333c0f3-d6ce-405f-b8c8-755be42ba74b\") " pod="openstack/ceilometer-0"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.788757    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2333c0f3-d6ce-405f-b8c8-755be42ba74b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2333c0f3-d6ce-405f-b8c8-755be42ba74b\") " pod="openstack/ceilometer-0"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.802772    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95n5w\" (UniqueName: \"kubernetes.io/projected/2333c0f3-d6ce-405f-b8c8-755be42ba74b-kube-api-access-95n5w\") pod \"ceilometer-0\" (UID: \"2333c0f3-d6ce-405f-b8c8-755be42ba74b\") " pod="openstack/ceilometer-0"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.804007    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2333c0f3-d6ce-405f-b8c8-755be42ba74b-config-data\") pod \"ceilometer-0\" (UID: \"2333c0f3-d6ce-405f-b8c8-755be42ba74b\") " pod="openstack/ceilometer-0"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.811975    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2333c0f3-d6ce-405f-b8c8-755be42ba74b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2333c0f3-d6ce-405f-b8c8-755be42ba74b\") " pod="openstack/ceilometer-0"
Mar 20 16:03:03 crc kubenswrapper[4730]: I0320 16:03:03.987269    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 16:03:04 crc kubenswrapper[4730]: I0320 16:03:04.186571    4730 generic.go:334] "Generic (PLEG): container finished" podID="6941d556-3020-4344-b185-5d79cf68187c" containerID="c002961b60e7f958b6eac722566b65c8c9c5ccb02bb7acae14d9879bae50e4f2" exitCode=0
Mar 20 16:03:04 crc kubenswrapper[4730]: I0320 16:03:04.186859    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5j2w4" event={"ID":"6941d556-3020-4344-b185-5d79cf68187c","Type":"ContainerDied","Data":"c002961b60e7f958b6eac722566b65c8c9c5ccb02bb7acae14d9879bae50e4f2"}
Mar 20 16:03:04 crc kubenswrapper[4730]: I0320 16:03:04.450536    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a875c7b4-22fb-4b91-803c-09a7a439aea1" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.221:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 20 16:03:04 crc kubenswrapper[4730]: I0320 16:03:04.450570    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a875c7b4-22fb-4b91-803c-09a7a439aea1" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.221:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 20 16:03:04 crc kubenswrapper[4730]: I0320 16:03:04.466064    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 16:03:04 crc kubenswrapper[4730]: W0320 16:03:04.468939    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2333c0f3_d6ce_405f_b8c8_755be42ba74b.slice/crio-1299e2d7df514e1a97295b01275fe6d255ee41e4697438fd7f4e6ddef4421c83 WatchSource:0}: Error finding container 1299e2d7df514e1a97295b01275fe6d255ee41e4697438fd7f4e6ddef4421c83: Status 404 returned error can't find the container with id 1299e2d7df514e1a97295b01275fe6d255ee41e4697438fd7f4e6ddef4421c83
Mar 20 16:03:05 crc kubenswrapper[4730]: I0320 16:03:05.198410    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2333c0f3-d6ce-405f-b8c8-755be42ba74b","Type":"ContainerStarted","Data":"96c436e4c4ad2205424a867b0300ce8ea915a1f696fca311e0994be16783f1d9"}
Mar 20 16:03:05 crc kubenswrapper[4730]: I0320 16:03:05.198518    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2333c0f3-d6ce-405f-b8c8-755be42ba74b","Type":"ContainerStarted","Data":"05214ba2af35aa37b13a0a689fa73f69aedb3ffa941dcd2d56500efb0971fd16"}
Mar 20 16:03:05 crc kubenswrapper[4730]: I0320 16:03:05.198540    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2333c0f3-d6ce-405f-b8c8-755be42ba74b","Type":"ContainerStarted","Data":"1299e2d7df514e1a97295b01275fe6d255ee41e4697438fd7f4e6ddef4421c83"}
Mar 20 16:03:05 crc kubenswrapper[4730]: I0320 16:03:05.522321    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5j2w4"
Mar 20 16:03:05 crc kubenswrapper[4730]: I0320 16:03:05.606850    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Mar 20 16:03:05 crc kubenswrapper[4730]: I0320 16:03:05.618095    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6rml\" (UniqueName: \"kubernetes.io/projected/6941d556-3020-4344-b185-5d79cf68187c-kube-api-access-p6rml\") pod \"6941d556-3020-4344-b185-5d79cf68187c\" (UID: \"6941d556-3020-4344-b185-5d79cf68187c\") "
Mar 20 16:03:05 crc kubenswrapper[4730]: I0320 16:03:05.618171    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6941d556-3020-4344-b185-5d79cf68187c-scripts\") pod \"6941d556-3020-4344-b185-5d79cf68187c\" (UID: \"6941d556-3020-4344-b185-5d79cf68187c\") "
Mar 20 16:03:05 crc kubenswrapper[4730]: I0320 16:03:05.618199    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6941d556-3020-4344-b185-5d79cf68187c-config-data\") pod \"6941d556-3020-4344-b185-5d79cf68187c\" (UID: \"6941d556-3020-4344-b185-5d79cf68187c\") "
Mar 20 16:03:05 crc kubenswrapper[4730]: I0320 16:03:05.618338    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6941d556-3020-4344-b185-5d79cf68187c-combined-ca-bundle\") pod \"6941d556-3020-4344-b185-5d79cf68187c\" (UID: \"6941d556-3020-4344-b185-5d79cf68187c\") "
Mar 20 16:03:05 crc kubenswrapper[4730]: I0320 16:03:05.642035    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6941d556-3020-4344-b185-5d79cf68187c-scripts" (OuterVolumeSpecName: "scripts") pod "6941d556-3020-4344-b185-5d79cf68187c" (UID: "6941d556-3020-4344-b185-5d79cf68187c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:03:05 crc kubenswrapper[4730]: I0320 16:03:05.645820    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6941d556-3020-4344-b185-5d79cf68187c-kube-api-access-p6rml" (OuterVolumeSpecName: "kube-api-access-p6rml") pod "6941d556-3020-4344-b185-5d79cf68187c" (UID: "6941d556-3020-4344-b185-5d79cf68187c"). InnerVolumeSpecName "kube-api-access-p6rml". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:03:05 crc kubenswrapper[4730]: I0320 16:03:05.661531    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6941d556-3020-4344-b185-5d79cf68187c-config-data" (OuterVolumeSpecName: "config-data") pod "6941d556-3020-4344-b185-5d79cf68187c" (UID: "6941d556-3020-4344-b185-5d79cf68187c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:03:05 crc kubenswrapper[4730]: I0320 16:03:05.680926    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6941d556-3020-4344-b185-5d79cf68187c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6941d556-3020-4344-b185-5d79cf68187c" (UID: "6941d556-3020-4344-b185-5d79cf68187c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:03:05 crc kubenswrapper[4730]: I0320 16:03:05.721040    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6rml\" (UniqueName: \"kubernetes.io/projected/6941d556-3020-4344-b185-5d79cf68187c-kube-api-access-p6rml\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:05 crc kubenswrapper[4730]: I0320 16:03:05.721071    4730 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6941d556-3020-4344-b185-5d79cf68187c-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:05 crc kubenswrapper[4730]: I0320 16:03:05.721081    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6941d556-3020-4344-b185-5d79cf68187c-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:05 crc kubenswrapper[4730]: I0320 16:03:05.721090    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6941d556-3020-4344-b185-5d79cf68187c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:06 crc kubenswrapper[4730]: I0320 16:03:06.211061    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5j2w4" event={"ID":"6941d556-3020-4344-b185-5d79cf68187c","Type":"ContainerDied","Data":"dfc712c046f53bbab42da7c72012f52d748e148acd1b54785e8a3397492fd3b6"}
Mar 20 16:03:06 crc kubenswrapper[4730]: I0320 16:03:06.211428    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfc712c046f53bbab42da7c72012f52d748e148acd1b54785e8a3397492fd3b6"
Mar 20 16:03:06 crc kubenswrapper[4730]: I0320 16:03:06.211176    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5j2w4"
Mar 20 16:03:06 crc kubenswrapper[4730]: I0320 16:03:06.393194    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:03:06 crc kubenswrapper[4730]: I0320 16:03:06.393625    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a875c7b4-22fb-4b91-803c-09a7a439aea1" containerName="nova-api-log" containerID="cri-o://5b5163f015343a503b8bb1265422ad47154d41b97a20358ae2eebcfb9b02968f" gracePeriod=30
Mar 20 16:03:06 crc kubenswrapper[4730]: I0320 16:03:06.394131    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a875c7b4-22fb-4b91-803c-09a7a439aea1" containerName="nova-api-api" containerID="cri-o://8ad5725e9930431c50bd1862d3f763c6380098c554ce816d54c8f41bdc9481c4" gracePeriod=30
Mar 20 16:03:06 crc kubenswrapper[4730]: I0320 16:03:06.405953    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 16:03:06 crc kubenswrapper[4730]: I0320 16:03:06.408058    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="3edc02b8-2451-4edf-a79d-fc86a078de83" containerName="nova-scheduler-scheduler" containerID="cri-o://f01bb1ce3fdb271555994504b12501ddebf5c6f44d1ae92e8481bfe809a7796f" gracePeriod=30
Mar 20 16:03:06 crc kubenswrapper[4730]: I0320 16:03:06.427225    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 16:03:06 crc kubenswrapper[4730]: I0320 16:03:06.427738    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c9bfb6c0-4971-4a58-aacc-17636a95b8a4" containerName="nova-metadata-log" containerID="cri-o://a81bd5c7f3aa448b067be4239c5895b4e08b8ed1cbb446ef78371ac5389a73a9" gracePeriod=30
Mar 20 16:03:06 crc kubenswrapper[4730]: I0320 16:03:06.427732    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c9bfb6c0-4971-4a58-aacc-17636a95b8a4" containerName="nova-metadata-metadata" containerID="cri-o://4ea91441433de2c9d6a90ef65cf2c1e62d0cba102e976dcb6c1e329ad4173188" gracePeriod=30
Mar 20 16:03:07 crc kubenswrapper[4730]: I0320 16:03:07.233063    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2333c0f3-d6ce-405f-b8c8-755be42ba74b","Type":"ContainerStarted","Data":"6346e0485e2a6b37dd5fec8edf65ac491ec6f80b862b5bb6f83d9906e4e8fe7e"}
Mar 20 16:03:07 crc kubenswrapper[4730]: I0320 16:03:07.239177    4730 generic.go:334] "Generic (PLEG): container finished" podID="c9bfb6c0-4971-4a58-aacc-17636a95b8a4" containerID="a81bd5c7f3aa448b067be4239c5895b4e08b8ed1cbb446ef78371ac5389a73a9" exitCode=143
Mar 20 16:03:07 crc kubenswrapper[4730]: I0320 16:03:07.239258    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c9bfb6c0-4971-4a58-aacc-17636a95b8a4","Type":"ContainerDied","Data":"a81bd5c7f3aa448b067be4239c5895b4e08b8ed1cbb446ef78371ac5389a73a9"}
Mar 20 16:03:07 crc kubenswrapper[4730]: I0320 16:03:07.242457    4730 generic.go:334] "Generic (PLEG): container finished" podID="a875c7b4-22fb-4b91-803c-09a7a439aea1" containerID="5b5163f015343a503b8bb1265422ad47154d41b97a20358ae2eebcfb9b02968f" exitCode=143
Mar 20 16:03:07 crc kubenswrapper[4730]: I0320 16:03:07.242503    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a875c7b4-22fb-4b91-803c-09a7a439aea1","Type":"ContainerDied","Data":"5b5163f015343a503b8bb1265422ad47154d41b97a20358ae2eebcfb9b02968f"}
Mar 20 16:03:07 crc kubenswrapper[4730]: I0320 16:03:07.803701    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 20 16:03:07 crc kubenswrapper[4730]: I0320 16:03:07.809500    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 16:03:07 crc kubenswrapper[4730]: I0320 16:03:07.970952    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3edc02b8-2451-4edf-a79d-fc86a078de83-config-data\") pod \"3edc02b8-2451-4edf-a79d-fc86a078de83\" (UID: \"3edc02b8-2451-4edf-a79d-fc86a078de83\") "
Mar 20 16:03:07 crc kubenswrapper[4730]: I0320 16:03:07.971106    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edc02b8-2451-4edf-a79d-fc86a078de83-combined-ca-bundle\") pod \"3edc02b8-2451-4edf-a79d-fc86a078de83\" (UID: \"3edc02b8-2451-4edf-a79d-fc86a078de83\") "
Mar 20 16:03:07 crc kubenswrapper[4730]: I0320 16:03:07.971156    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9bfb6c0-4971-4a58-aacc-17636a95b8a4-config-data\") pod \"c9bfb6c0-4971-4a58-aacc-17636a95b8a4\" (UID: \"c9bfb6c0-4971-4a58-aacc-17636a95b8a4\") "
Mar 20 16:03:07 crc kubenswrapper[4730]: I0320 16:03:07.971327    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9bfb6c0-4971-4a58-aacc-17636a95b8a4-logs\") pod \"c9bfb6c0-4971-4a58-aacc-17636a95b8a4\" (UID: \"c9bfb6c0-4971-4a58-aacc-17636a95b8a4\") "
Mar 20 16:03:07 crc kubenswrapper[4730]: I0320 16:03:07.971369    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xws9p\" (UniqueName: \"kubernetes.io/projected/c9bfb6c0-4971-4a58-aacc-17636a95b8a4-kube-api-access-xws9p\") pod \"c9bfb6c0-4971-4a58-aacc-17636a95b8a4\" (UID: \"c9bfb6c0-4971-4a58-aacc-17636a95b8a4\") "
Mar 20 16:03:07 crc kubenswrapper[4730]: I0320 16:03:07.971401    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9bfb6c0-4971-4a58-aacc-17636a95b8a4-combined-ca-bundle\") pod \"c9bfb6c0-4971-4a58-aacc-17636a95b8a4\" (UID: \"c9bfb6c0-4971-4a58-aacc-17636a95b8a4\") "
Mar 20 16:03:07 crc kubenswrapper[4730]: I0320 16:03:07.971435    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9bfb6c0-4971-4a58-aacc-17636a95b8a4-nova-metadata-tls-certs\") pod \"c9bfb6c0-4971-4a58-aacc-17636a95b8a4\" (UID: \"c9bfb6c0-4971-4a58-aacc-17636a95b8a4\") "
Mar 20 16:03:07 crc kubenswrapper[4730]: I0320 16:03:07.971453    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrnlf\" (UniqueName: \"kubernetes.io/projected/3edc02b8-2451-4edf-a79d-fc86a078de83-kube-api-access-qrnlf\") pod \"3edc02b8-2451-4edf-a79d-fc86a078de83\" (UID: \"3edc02b8-2451-4edf-a79d-fc86a078de83\") "
Mar 20 16:03:07 crc kubenswrapper[4730]: I0320 16:03:07.972945    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9bfb6c0-4971-4a58-aacc-17636a95b8a4-logs" (OuterVolumeSpecName: "logs") pod "c9bfb6c0-4971-4a58-aacc-17636a95b8a4" (UID: "c9bfb6c0-4971-4a58-aacc-17636a95b8a4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:03:07 crc kubenswrapper[4730]: I0320 16:03:07.978514    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3edc02b8-2451-4edf-a79d-fc86a078de83-kube-api-access-qrnlf" (OuterVolumeSpecName: "kube-api-access-qrnlf") pod "3edc02b8-2451-4edf-a79d-fc86a078de83" (UID: "3edc02b8-2451-4edf-a79d-fc86a078de83"). InnerVolumeSpecName "kube-api-access-qrnlf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:03:07 crc kubenswrapper[4730]: I0320 16:03:07.980723    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9bfb6c0-4971-4a58-aacc-17636a95b8a4-kube-api-access-xws9p" (OuterVolumeSpecName: "kube-api-access-xws9p") pod "c9bfb6c0-4971-4a58-aacc-17636a95b8a4" (UID: "c9bfb6c0-4971-4a58-aacc-17636a95b8a4"). InnerVolumeSpecName "kube-api-access-xws9p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.012194    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9bfb6c0-4971-4a58-aacc-17636a95b8a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9bfb6c0-4971-4a58-aacc-17636a95b8a4" (UID: "c9bfb6c0-4971-4a58-aacc-17636a95b8a4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.012228    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3edc02b8-2451-4edf-a79d-fc86a078de83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3edc02b8-2451-4edf-a79d-fc86a078de83" (UID: "3edc02b8-2451-4edf-a79d-fc86a078de83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.016199    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9bfb6c0-4971-4a58-aacc-17636a95b8a4-config-data" (OuterVolumeSpecName: "config-data") pod "c9bfb6c0-4971-4a58-aacc-17636a95b8a4" (UID: "c9bfb6c0-4971-4a58-aacc-17636a95b8a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.016190    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3edc02b8-2451-4edf-a79d-fc86a078de83-config-data" (OuterVolumeSpecName: "config-data") pod "3edc02b8-2451-4edf-a79d-fc86a078de83" (UID: "3edc02b8-2451-4edf-a79d-fc86a078de83"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.037486    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9bfb6c0-4971-4a58-aacc-17636a95b8a4-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c9bfb6c0-4971-4a58-aacc-17636a95b8a4" (UID: "c9bfb6c0-4971-4a58-aacc-17636a95b8a4"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.073428    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edc02b8-2451-4edf-a79d-fc86a078de83-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.073469    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9bfb6c0-4971-4a58-aacc-17636a95b8a4-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.073479    4730 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9bfb6c0-4971-4a58-aacc-17636a95b8a4-logs\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.073489    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xws9p\" (UniqueName: \"kubernetes.io/projected/c9bfb6c0-4971-4a58-aacc-17636a95b8a4-kube-api-access-xws9p\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.073500    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9bfb6c0-4971-4a58-aacc-17636a95b8a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.073511    4730 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9bfb6c0-4971-4a58-aacc-17636a95b8a4-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.073521    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrnlf\" (UniqueName: \"kubernetes.io/projected/3edc02b8-2451-4edf-a79d-fc86a078de83-kube-api-access-qrnlf\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.073529    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3edc02b8-2451-4edf-a79d-fc86a078de83-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.256220    4730 generic.go:334] "Generic (PLEG): container finished" podID="3edc02b8-2451-4edf-a79d-fc86a078de83" containerID="f01bb1ce3fdb271555994504b12501ddebf5c6f44d1ae92e8481bfe809a7796f" exitCode=0
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.256283    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3edc02b8-2451-4edf-a79d-fc86a078de83","Type":"ContainerDied","Data":"f01bb1ce3fdb271555994504b12501ddebf5c6f44d1ae92e8481bfe809a7796f"}
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.256328    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3edc02b8-2451-4edf-a79d-fc86a078de83","Type":"ContainerDied","Data":"b529fd31ef154a115120a0df54b6ded5ddefe9a8e3d6d2b8b099bc3bd1c28517"}
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.256351    4730 scope.go:117] "RemoveContainer" containerID="f01bb1ce3fdb271555994504b12501ddebf5c6f44d1ae92e8481bfe809a7796f"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.257495    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.259178    4730 generic.go:334] "Generic (PLEG): container finished" podID="c9bfb6c0-4971-4a58-aacc-17636a95b8a4" containerID="4ea91441433de2c9d6a90ef65cf2c1e62d0cba102e976dcb6c1e329ad4173188" exitCode=0
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.259210    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c9bfb6c0-4971-4a58-aacc-17636a95b8a4","Type":"ContainerDied","Data":"4ea91441433de2c9d6a90ef65cf2c1e62d0cba102e976dcb6c1e329ad4173188"}
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.259234    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c9bfb6c0-4971-4a58-aacc-17636a95b8a4","Type":"ContainerDied","Data":"26008e0cde0b10d5cf635c86bad21a3a5eba1afc3f588c8674fe357301fcf0ca"}
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.259295    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.282346    4730 scope.go:117] "RemoveContainer" containerID="f01bb1ce3fdb271555994504b12501ddebf5c6f44d1ae92e8481bfe809a7796f"
Mar 20 16:03:08 crc kubenswrapper[4730]: E0320 16:03:08.285856    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f01bb1ce3fdb271555994504b12501ddebf5c6f44d1ae92e8481bfe809a7796f\": container with ID starting with f01bb1ce3fdb271555994504b12501ddebf5c6f44d1ae92e8481bfe809a7796f not found: ID does not exist" containerID="f01bb1ce3fdb271555994504b12501ddebf5c6f44d1ae92e8481bfe809a7796f"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.285942    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f01bb1ce3fdb271555994504b12501ddebf5c6f44d1ae92e8481bfe809a7796f"} err="failed to get container status \"f01bb1ce3fdb271555994504b12501ddebf5c6f44d1ae92e8481bfe809a7796f\": rpc error: code = NotFound desc = could not find container \"f01bb1ce3fdb271555994504b12501ddebf5c6f44d1ae92e8481bfe809a7796f\": container with ID starting with f01bb1ce3fdb271555994504b12501ddebf5c6f44d1ae92e8481bfe809a7796f not found: ID does not exist"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.286021    4730 scope.go:117] "RemoveContainer" containerID="4ea91441433de2c9d6a90ef65cf2c1e62d0cba102e976dcb6c1e329ad4173188"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.296170    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.309035    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.326425    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.338425    4730 scope.go:117] "RemoveContainer" containerID="a81bd5c7f3aa448b067be4239c5895b4e08b8ed1cbb446ef78371ac5389a73a9"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.342797    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.359781    4730 scope.go:117] "RemoveContainer" containerID="4ea91441433de2c9d6a90ef65cf2c1e62d0cba102e976dcb6c1e329ad4173188"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.359881    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 16:03:08 crc kubenswrapper[4730]: E0320 16:03:08.360282    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3edc02b8-2451-4edf-a79d-fc86a078de83" containerName="nova-scheduler-scheduler"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.360298    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="3edc02b8-2451-4edf-a79d-fc86a078de83" containerName="nova-scheduler-scheduler"
Mar 20 16:03:08 crc kubenswrapper[4730]: E0320 16:03:08.360307    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9bfb6c0-4971-4a58-aacc-17636a95b8a4" containerName="nova-metadata-log"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.360313    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9bfb6c0-4971-4a58-aacc-17636a95b8a4" containerName="nova-metadata-log"
Mar 20 16:03:08 crc kubenswrapper[4730]: E0320 16:03:08.360326    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9bfb6c0-4971-4a58-aacc-17636a95b8a4" containerName="nova-metadata-metadata"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.360332    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9bfb6c0-4971-4a58-aacc-17636a95b8a4" containerName="nova-metadata-metadata"
Mar 20 16:03:08 crc kubenswrapper[4730]: E0320 16:03:08.360345    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6941d556-3020-4344-b185-5d79cf68187c" containerName="nova-manage"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.360352    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="6941d556-3020-4344-b185-5d79cf68187c" containerName="nova-manage"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.360538    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9bfb6c0-4971-4a58-aacc-17636a95b8a4" containerName="nova-metadata-metadata"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.360549    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="6941d556-3020-4344-b185-5d79cf68187c" containerName="nova-manage"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.360572    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9bfb6c0-4971-4a58-aacc-17636a95b8a4" containerName="nova-metadata-log"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.360580    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="3edc02b8-2451-4edf-a79d-fc86a078de83" containerName="nova-scheduler-scheduler"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.361584    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.363696    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.363944    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 20 16:03:08 crc kubenswrapper[4730]: E0320 16:03:08.369557    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ea91441433de2c9d6a90ef65cf2c1e62d0cba102e976dcb6c1e329ad4173188\": container with ID starting with 4ea91441433de2c9d6a90ef65cf2c1e62d0cba102e976dcb6c1e329ad4173188 not found: ID does not exist" containerID="4ea91441433de2c9d6a90ef65cf2c1e62d0cba102e976dcb6c1e329ad4173188"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.369614    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ea91441433de2c9d6a90ef65cf2c1e62d0cba102e976dcb6c1e329ad4173188"} err="failed to get container status \"4ea91441433de2c9d6a90ef65cf2c1e62d0cba102e976dcb6c1e329ad4173188\": rpc error: code = NotFound desc = could not find container \"4ea91441433de2c9d6a90ef65cf2c1e62d0cba102e976dcb6c1e329ad4173188\": container with ID starting with 4ea91441433de2c9d6a90ef65cf2c1e62d0cba102e976dcb6c1e329ad4173188 not found: ID does not exist"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.369647    4730 scope.go:117] "RemoveContainer" containerID="a81bd5c7f3aa448b067be4239c5895b4e08b8ed1cbb446ef78371ac5389a73a9"
Mar 20 16:03:08 crc kubenswrapper[4730]: E0320 16:03:08.370368    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a81bd5c7f3aa448b067be4239c5895b4e08b8ed1cbb446ef78371ac5389a73a9\": container with ID starting with a81bd5c7f3aa448b067be4239c5895b4e08b8ed1cbb446ef78371ac5389a73a9 not found: ID does not exist" containerID="a81bd5c7f3aa448b067be4239c5895b4e08b8ed1cbb446ef78371ac5389a73a9"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.370426    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a81bd5c7f3aa448b067be4239c5895b4e08b8ed1cbb446ef78371ac5389a73a9"} err="failed to get container status \"a81bd5c7f3aa448b067be4239c5895b4e08b8ed1cbb446ef78371ac5389a73a9\": rpc error: code = NotFound desc = could not find container \"a81bd5c7f3aa448b067be4239c5895b4e08b8ed1cbb446ef78371ac5389a73a9\": container with ID starting with a81bd5c7f3aa448b067be4239c5895b4e08b8ed1cbb446ef78371ac5389a73a9 not found: ID does not exist"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.379861    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.389229    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.393682    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.412700    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.426472    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.490624    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a58f453e-84d8-47b1-8740-406f92c4ca79-logs\") pod \"nova-metadata-0\" (UID: \"a58f453e-84d8-47b1-8740-406f92c4ca79\") " pod="openstack/nova-metadata-0"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.490743    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4deff063-ecb8-4cf2-8e94-45ab62a613bc-config-data\") pod \"nova-scheduler-0\" (UID: \"4deff063-ecb8-4cf2-8e94-45ab62a613bc\") " pod="openstack/nova-scheduler-0"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.490767    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a58f453e-84d8-47b1-8740-406f92c4ca79-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a58f453e-84d8-47b1-8740-406f92c4ca79\") " pod="openstack/nova-metadata-0"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.490784    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a58f453e-84d8-47b1-8740-406f92c4ca79-config-data\") pod \"nova-metadata-0\" (UID: \"a58f453e-84d8-47b1-8740-406f92c4ca79\") " pod="openstack/nova-metadata-0"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.490956    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a58f453e-84d8-47b1-8740-406f92c4ca79-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a58f453e-84d8-47b1-8740-406f92c4ca79\") " pod="openstack/nova-metadata-0"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.491004    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlrsq\" (UniqueName: \"kubernetes.io/projected/a58f453e-84d8-47b1-8740-406f92c4ca79-kube-api-access-vlrsq\") pod \"nova-metadata-0\" (UID: \"a58f453e-84d8-47b1-8740-406f92c4ca79\") " pod="openstack/nova-metadata-0"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.491101    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4deff063-ecb8-4cf2-8e94-45ab62a613bc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4deff063-ecb8-4cf2-8e94-45ab62a613bc\") " pod="openstack/nova-scheduler-0"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.491307    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kqgd\" (UniqueName: \"kubernetes.io/projected/4deff063-ecb8-4cf2-8e94-45ab62a613bc-kube-api-access-5kqgd\") pod \"nova-scheduler-0\" (UID: \"4deff063-ecb8-4cf2-8e94-45ab62a613bc\") " pod="openstack/nova-scheduler-0"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.593458    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4deff063-ecb8-4cf2-8e94-45ab62a613bc-config-data\") pod \"nova-scheduler-0\" (UID: \"4deff063-ecb8-4cf2-8e94-45ab62a613bc\") " pod="openstack/nova-scheduler-0"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.593519    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a58f453e-84d8-47b1-8740-406f92c4ca79-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a58f453e-84d8-47b1-8740-406f92c4ca79\") " pod="openstack/nova-metadata-0"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.593546    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a58f453e-84d8-47b1-8740-406f92c4ca79-config-data\") pod \"nova-metadata-0\" (UID: \"a58f453e-84d8-47b1-8740-406f92c4ca79\") " pod="openstack/nova-metadata-0"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.593600    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a58f453e-84d8-47b1-8740-406f92c4ca79-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a58f453e-84d8-47b1-8740-406f92c4ca79\") " pod="openstack/nova-metadata-0"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.593625    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlrsq\" (UniqueName: \"kubernetes.io/projected/a58f453e-84d8-47b1-8740-406f92c4ca79-kube-api-access-vlrsq\") pod \"nova-metadata-0\" (UID: \"a58f453e-84d8-47b1-8740-406f92c4ca79\") " pod="openstack/nova-metadata-0"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.593674    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4deff063-ecb8-4cf2-8e94-45ab62a613bc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4deff063-ecb8-4cf2-8e94-45ab62a613bc\") " pod="openstack/nova-scheduler-0"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.593746    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kqgd\" (UniqueName: \"kubernetes.io/projected/4deff063-ecb8-4cf2-8e94-45ab62a613bc-kube-api-access-5kqgd\") pod \"nova-scheduler-0\" (UID: \"4deff063-ecb8-4cf2-8e94-45ab62a613bc\") " pod="openstack/nova-scheduler-0"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.593849    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a58f453e-84d8-47b1-8740-406f92c4ca79-logs\") pod \"nova-metadata-0\" (UID: \"a58f453e-84d8-47b1-8740-406f92c4ca79\") " pod="openstack/nova-metadata-0"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.594504    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a58f453e-84d8-47b1-8740-406f92c4ca79-logs\") pod \"nova-metadata-0\" (UID: \"a58f453e-84d8-47b1-8740-406f92c4ca79\") " pod="openstack/nova-metadata-0"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.597592    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a58f453e-84d8-47b1-8740-406f92c4ca79-config-data\") pod \"nova-metadata-0\" (UID: \"a58f453e-84d8-47b1-8740-406f92c4ca79\") " pod="openstack/nova-metadata-0"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.598736    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4deff063-ecb8-4cf2-8e94-45ab62a613bc-config-data\") pod \"nova-scheduler-0\" (UID: \"4deff063-ecb8-4cf2-8e94-45ab62a613bc\") " pod="openstack/nova-scheduler-0"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.599070    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a58f453e-84d8-47b1-8740-406f92c4ca79-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a58f453e-84d8-47b1-8740-406f92c4ca79\") " pod="openstack/nova-metadata-0"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.600820    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4deff063-ecb8-4cf2-8e94-45ab62a613bc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4deff063-ecb8-4cf2-8e94-45ab62a613bc\") " pod="openstack/nova-scheduler-0"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.601740    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a58f453e-84d8-47b1-8740-406f92c4ca79-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a58f453e-84d8-47b1-8740-406f92c4ca79\") " pod="openstack/nova-metadata-0"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.617927    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlrsq\" (UniqueName: \"kubernetes.io/projected/a58f453e-84d8-47b1-8740-406f92c4ca79-kube-api-access-vlrsq\") pod \"nova-metadata-0\" (UID: \"a58f453e-84d8-47b1-8740-406f92c4ca79\") " pod="openstack/nova-metadata-0"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.618037    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kqgd\" (UniqueName: \"kubernetes.io/projected/4deff063-ecb8-4cf2-8e94-45ab62a613bc-kube-api-access-5kqgd\") pod \"nova-scheduler-0\" (UID: \"4deff063-ecb8-4cf2-8e94-45ab62a613bc\") " pod="openstack/nova-scheduler-0"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.694023    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 16:03:08 crc kubenswrapper[4730]: I0320 16:03:08.714155    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 20 16:03:09 crc kubenswrapper[4730]: I0320 16:03:09.178869    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 16:03:09 crc kubenswrapper[4730]: W0320 16:03:09.188443    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda58f453e_84d8_47b1_8740_406f92c4ca79.slice/crio-8c4113a3a395f61d82a4e3f3327cc72aaec3ee3b7ff952100b967b5e75bad9d9 WatchSource:0}: Error finding container 8c4113a3a395f61d82a4e3f3327cc72aaec3ee3b7ff952100b967b5e75bad9d9: Status 404 returned error can't find the container with id 8c4113a3a395f61d82a4e3f3327cc72aaec3ee3b7ff952100b967b5e75bad9d9
Mar 20 16:03:09 crc kubenswrapper[4730]: I0320 16:03:09.259321    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 16:03:09 crc kubenswrapper[4730]: I0320 16:03:09.273890    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a58f453e-84d8-47b1-8740-406f92c4ca79","Type":"ContainerStarted","Data":"8c4113a3a395f61d82a4e3f3327cc72aaec3ee3b7ff952100b967b5e75bad9d9"}
Mar 20 16:03:09 crc kubenswrapper[4730]: W0320 16:03:09.279704    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4deff063_ecb8_4cf2_8e94_45ab62a613bc.slice/crio-21158f74c1175b4725450a9060010e126ee9a4812de984c4772223cfd04c4b44 WatchSource:0}: Error finding container 21158f74c1175b4725450a9060010e126ee9a4812de984c4772223cfd04c4b44: Status 404 returned error can't find the container with id 21158f74c1175b4725450a9060010e126ee9a4812de984c4772223cfd04c4b44
Mar 20 16:03:09 crc kubenswrapper[4730]: I0320 16:03:09.552308    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3edc02b8-2451-4edf-a79d-fc86a078de83" path="/var/lib/kubelet/pods/3edc02b8-2451-4edf-a79d-fc86a078de83/volumes"
Mar 20 16:03:09 crc kubenswrapper[4730]: I0320 16:03:09.553175    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9bfb6c0-4971-4a58-aacc-17636a95b8a4" path="/var/lib/kubelet/pods/c9bfb6c0-4971-4a58-aacc-17636a95b8a4/volumes"
Mar 20 16:03:09 crc kubenswrapper[4730]: I0320 16:03:09.730092    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 16:03:09 crc kubenswrapper[4730]: I0320 16:03:09.826551    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a875c7b4-22fb-4b91-803c-09a7a439aea1-public-tls-certs\") pod \"a875c7b4-22fb-4b91-803c-09a7a439aea1\" (UID: \"a875c7b4-22fb-4b91-803c-09a7a439aea1\") "
Mar 20 16:03:09 crc kubenswrapper[4730]: I0320 16:03:09.826624    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a875c7b4-22fb-4b91-803c-09a7a439aea1-config-data\") pod \"a875c7b4-22fb-4b91-803c-09a7a439aea1\" (UID: \"a875c7b4-22fb-4b91-803c-09a7a439aea1\") "
Mar 20 16:03:09 crc kubenswrapper[4730]: I0320 16:03:09.826665    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a875c7b4-22fb-4b91-803c-09a7a439aea1-combined-ca-bundle\") pod \"a875c7b4-22fb-4b91-803c-09a7a439aea1\" (UID: \"a875c7b4-22fb-4b91-803c-09a7a439aea1\") "
Mar 20 16:03:09 crc kubenswrapper[4730]: I0320 16:03:09.826699    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a875c7b4-22fb-4b91-803c-09a7a439aea1-logs\") pod \"a875c7b4-22fb-4b91-803c-09a7a439aea1\" (UID: \"a875c7b4-22fb-4b91-803c-09a7a439aea1\") "
Mar 20 16:03:09 crc kubenswrapper[4730]: I0320 16:03:09.826773    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cf45d\" (UniqueName: \"kubernetes.io/projected/a875c7b4-22fb-4b91-803c-09a7a439aea1-kube-api-access-cf45d\") pod \"a875c7b4-22fb-4b91-803c-09a7a439aea1\" (UID: \"a875c7b4-22fb-4b91-803c-09a7a439aea1\") "
Mar 20 16:03:09 crc kubenswrapper[4730]: I0320 16:03:09.826884    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a875c7b4-22fb-4b91-803c-09a7a439aea1-internal-tls-certs\") pod \"a875c7b4-22fb-4b91-803c-09a7a439aea1\" (UID: \"a875c7b4-22fb-4b91-803c-09a7a439aea1\") "
Mar 20 16:03:09 crc kubenswrapper[4730]: I0320 16:03:09.827333    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a875c7b4-22fb-4b91-803c-09a7a439aea1-logs" (OuterVolumeSpecName: "logs") pod "a875c7b4-22fb-4b91-803c-09a7a439aea1" (UID: "a875c7b4-22fb-4b91-803c-09a7a439aea1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:03:09 crc kubenswrapper[4730]: I0320 16:03:09.827660    4730 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a875c7b4-22fb-4b91-803c-09a7a439aea1-logs\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:09 crc kubenswrapper[4730]: I0320 16:03:09.833623    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a875c7b4-22fb-4b91-803c-09a7a439aea1-kube-api-access-cf45d" (OuterVolumeSpecName: "kube-api-access-cf45d") pod "a875c7b4-22fb-4b91-803c-09a7a439aea1" (UID: "a875c7b4-22fb-4b91-803c-09a7a439aea1"). InnerVolumeSpecName "kube-api-access-cf45d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:03:09 crc kubenswrapper[4730]: I0320 16:03:09.862197    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a875c7b4-22fb-4b91-803c-09a7a439aea1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a875c7b4-22fb-4b91-803c-09a7a439aea1" (UID: "a875c7b4-22fb-4b91-803c-09a7a439aea1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:03:09 crc kubenswrapper[4730]: I0320 16:03:09.884383    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a875c7b4-22fb-4b91-803c-09a7a439aea1-config-data" (OuterVolumeSpecName: "config-data") pod "a875c7b4-22fb-4b91-803c-09a7a439aea1" (UID: "a875c7b4-22fb-4b91-803c-09a7a439aea1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:03:09 crc kubenswrapper[4730]: I0320 16:03:09.886331    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a875c7b4-22fb-4b91-803c-09a7a439aea1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a875c7b4-22fb-4b91-803c-09a7a439aea1" (UID: "a875c7b4-22fb-4b91-803c-09a7a439aea1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:03:09 crc kubenswrapper[4730]: I0320 16:03:09.919376    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a875c7b4-22fb-4b91-803c-09a7a439aea1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a875c7b4-22fb-4b91-803c-09a7a439aea1" (UID: "a875c7b4-22fb-4b91-803c-09a7a439aea1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:03:09 crc kubenswrapper[4730]: I0320 16:03:09.930366    4730 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a875c7b4-22fb-4b91-803c-09a7a439aea1-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:09 crc kubenswrapper[4730]: I0320 16:03:09.930389    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a875c7b4-22fb-4b91-803c-09a7a439aea1-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:09 crc kubenswrapper[4730]: I0320 16:03:09.930398    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a875c7b4-22fb-4b91-803c-09a7a439aea1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:09 crc kubenswrapper[4730]: I0320 16:03:09.930407    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cf45d\" (UniqueName: \"kubernetes.io/projected/a875c7b4-22fb-4b91-803c-09a7a439aea1-kube-api-access-cf45d\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:09 crc kubenswrapper[4730]: I0320 16:03:09.930434    4730 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a875c7b4-22fb-4b91-803c-09a7a439aea1-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.296011    4730 generic.go:334] "Generic (PLEG): container finished" podID="a875c7b4-22fb-4b91-803c-09a7a439aea1" containerID="8ad5725e9930431c50bd1862d3f763c6380098c554ce816d54c8f41bdc9481c4" exitCode=0
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.296095    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.296118    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a875c7b4-22fb-4b91-803c-09a7a439aea1","Type":"ContainerDied","Data":"8ad5725e9930431c50bd1862d3f763c6380098c554ce816d54c8f41bdc9481c4"}
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.299416    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a875c7b4-22fb-4b91-803c-09a7a439aea1","Type":"ContainerDied","Data":"b02119858ae3b92ae86beccde670787e102f4edc1a93a7bf0ae26e53cf88792e"}
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.299441    4730 scope.go:117] "RemoveContainer" containerID="8ad5725e9930431c50bd1862d3f763c6380098c554ce816d54c8f41bdc9481c4"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.305879    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a58f453e-84d8-47b1-8740-406f92c4ca79","Type":"ContainerStarted","Data":"28085b89ffc4ce452ccb90eb0e4869438568347b071b197104cf4fffebe470c3"}
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.305925    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a58f453e-84d8-47b1-8740-406f92c4ca79","Type":"ContainerStarted","Data":"5b95a18c7e8212d8f5c84cac299872f1049c51407e0f916a9069755e43ecc51f"}
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.309023    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4deff063-ecb8-4cf2-8e94-45ab62a613bc","Type":"ContainerStarted","Data":"d2a3b9bce5ad193f1702a85ebe0d22aa011263d2d3114ec660ae0fb7d74ed20b"}
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.309064    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4deff063-ecb8-4cf2-8e94-45ab62a613bc","Type":"ContainerStarted","Data":"21158f74c1175b4725450a9060010e126ee9a4812de984c4772223cfd04c4b44"}
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.340984    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.340962723 podStartE2EDuration="2.340962723s" podCreationTimestamp="2026-03-20 16:03:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:03:10.329887704 +0000 UTC m=+1449.543259153" watchObservedRunningTime="2026-03-20 16:03:10.340962723 +0000 UTC m=+1449.554334092"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.343462    4730 scope.go:117] "RemoveContainer" containerID="5b5163f015343a503b8bb1265422ad47154d41b97a20358ae2eebcfb9b02968f"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.359852    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.384623    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.390731    4730 scope.go:117] "RemoveContainer" containerID="8ad5725e9930431c50bd1862d3f763c6380098c554ce816d54c8f41bdc9481c4"
Mar 20 16:03:10 crc kubenswrapper[4730]: E0320 16:03:10.391191    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ad5725e9930431c50bd1862d3f763c6380098c554ce816d54c8f41bdc9481c4\": container with ID starting with 8ad5725e9930431c50bd1862d3f763c6380098c554ce816d54c8f41bdc9481c4 not found: ID does not exist" containerID="8ad5725e9930431c50bd1862d3f763c6380098c554ce816d54c8f41bdc9481c4"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.391319    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ad5725e9930431c50bd1862d3f763c6380098c554ce816d54c8f41bdc9481c4"} err="failed to get container status \"8ad5725e9930431c50bd1862d3f763c6380098c554ce816d54c8f41bdc9481c4\": rpc error: code = NotFound desc = could not find container \"8ad5725e9930431c50bd1862d3f763c6380098c554ce816d54c8f41bdc9481c4\": container with ID starting with 8ad5725e9930431c50bd1862d3f763c6380098c554ce816d54c8f41bdc9481c4 not found: ID does not exist"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.391351    4730 scope.go:117] "RemoveContainer" containerID="5b5163f015343a503b8bb1265422ad47154d41b97a20358ae2eebcfb9b02968f"
Mar 20 16:03:10 crc kubenswrapper[4730]: E0320 16:03:10.392653    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b5163f015343a503b8bb1265422ad47154d41b97a20358ae2eebcfb9b02968f\": container with ID starting with 5b5163f015343a503b8bb1265422ad47154d41b97a20358ae2eebcfb9b02968f not found: ID does not exist" containerID="5b5163f015343a503b8bb1265422ad47154d41b97a20358ae2eebcfb9b02968f"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.392676    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b5163f015343a503b8bb1265422ad47154d41b97a20358ae2eebcfb9b02968f"} err="failed to get container status \"5b5163f015343a503b8bb1265422ad47154d41b97a20358ae2eebcfb9b02968f\": rpc error: code = NotFound desc = could not find container \"5b5163f015343a503b8bb1265422ad47154d41b97a20358ae2eebcfb9b02968f\": container with ID starting with 5b5163f015343a503b8bb1265422ad47154d41b97a20358ae2eebcfb9b02968f not found: ID does not exist"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.397105    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:03:10 crc kubenswrapper[4730]: E0320 16:03:10.397514    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a875c7b4-22fb-4b91-803c-09a7a439aea1" containerName="nova-api-api"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.397532    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="a875c7b4-22fb-4b91-803c-09a7a439aea1" containerName="nova-api-api"
Mar 20 16:03:10 crc kubenswrapper[4730]: E0320 16:03:10.397564    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a875c7b4-22fb-4b91-803c-09a7a439aea1" containerName="nova-api-log"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.397571    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="a875c7b4-22fb-4b91-803c-09a7a439aea1" containerName="nova-api-log"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.397773    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="a875c7b4-22fb-4b91-803c-09a7a439aea1" containerName="nova-api-api"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.397806    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="a875c7b4-22fb-4b91-803c-09a7a439aea1" containerName="nova-api-log"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.398827    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.403077    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.404639    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.404835    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.410517    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.410494271 podStartE2EDuration="2.410494271s" podCreationTimestamp="2026-03-20 16:03:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:03:10.377427391 +0000 UTC m=+1449.590798770" watchObservedRunningTime="2026-03-20 16:03:10.410494271 +0000 UTC m=+1449.623865650"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.434288    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.543742    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2380321d-63e0-40a2-8ca4-5780cba46259-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2380321d-63e0-40a2-8ca4-5780cba46259\") " pod="openstack/nova-api-0"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.543963    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2380321d-63e0-40a2-8ca4-5780cba46259-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2380321d-63e0-40a2-8ca4-5780cba46259\") " pod="openstack/nova-api-0"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.544111    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2380321d-63e0-40a2-8ca4-5780cba46259-logs\") pod \"nova-api-0\" (UID: \"2380321d-63e0-40a2-8ca4-5780cba46259\") " pod="openstack/nova-api-0"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.544280    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trrdt\" (UniqueName: \"kubernetes.io/projected/2380321d-63e0-40a2-8ca4-5780cba46259-kube-api-access-trrdt\") pod \"nova-api-0\" (UID: \"2380321d-63e0-40a2-8ca4-5780cba46259\") " pod="openstack/nova-api-0"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.544400    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2380321d-63e0-40a2-8ca4-5780cba46259-public-tls-certs\") pod \"nova-api-0\" (UID: \"2380321d-63e0-40a2-8ca4-5780cba46259\") " pod="openstack/nova-api-0"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.544480    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2380321d-63e0-40a2-8ca4-5780cba46259-config-data\") pod \"nova-api-0\" (UID: \"2380321d-63e0-40a2-8ca4-5780cba46259\") " pod="openstack/nova-api-0"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.646123    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2380321d-63e0-40a2-8ca4-5780cba46259-public-tls-certs\") pod \"nova-api-0\" (UID: \"2380321d-63e0-40a2-8ca4-5780cba46259\") " pod="openstack/nova-api-0"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.646428    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2380321d-63e0-40a2-8ca4-5780cba46259-config-data\") pod \"nova-api-0\" (UID: \"2380321d-63e0-40a2-8ca4-5780cba46259\") " pod="openstack/nova-api-0"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.646701    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2380321d-63e0-40a2-8ca4-5780cba46259-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2380321d-63e0-40a2-8ca4-5780cba46259\") " pod="openstack/nova-api-0"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.646919    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2380321d-63e0-40a2-8ca4-5780cba46259-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2380321d-63e0-40a2-8ca4-5780cba46259\") " pod="openstack/nova-api-0"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.647428    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2380321d-63e0-40a2-8ca4-5780cba46259-logs\") pod \"nova-api-0\" (UID: \"2380321d-63e0-40a2-8ca4-5780cba46259\") " pod="openstack/nova-api-0"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.647770    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trrdt\" (UniqueName: \"kubernetes.io/projected/2380321d-63e0-40a2-8ca4-5780cba46259-kube-api-access-trrdt\") pod \"nova-api-0\" (UID: \"2380321d-63e0-40a2-8ca4-5780cba46259\") " pod="openstack/nova-api-0"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.648808    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2380321d-63e0-40a2-8ca4-5780cba46259-logs\") pod \"nova-api-0\" (UID: \"2380321d-63e0-40a2-8ca4-5780cba46259\") " pod="openstack/nova-api-0"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.649924    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2380321d-63e0-40a2-8ca4-5780cba46259-public-tls-certs\") pod \"nova-api-0\" (UID: \"2380321d-63e0-40a2-8ca4-5780cba46259\") " pod="openstack/nova-api-0"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.652775    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2380321d-63e0-40a2-8ca4-5780cba46259-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2380321d-63e0-40a2-8ca4-5780cba46259\") " pod="openstack/nova-api-0"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.654402    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2380321d-63e0-40a2-8ca4-5780cba46259-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2380321d-63e0-40a2-8ca4-5780cba46259\") " pod="openstack/nova-api-0"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.656233    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2380321d-63e0-40a2-8ca4-5780cba46259-config-data\") pod \"nova-api-0\" (UID: \"2380321d-63e0-40a2-8ca4-5780cba46259\") " pod="openstack/nova-api-0"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.668043    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trrdt\" (UniqueName: \"kubernetes.io/projected/2380321d-63e0-40a2-8ca4-5780cba46259-kube-api-access-trrdt\") pod \"nova-api-0\" (UID: \"2380321d-63e0-40a2-8ca4-5780cba46259\") " pod="openstack/nova-api-0"
Mar 20 16:03:10 crc kubenswrapper[4730]: I0320 16:03:10.725706    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 16:03:11 crc kubenswrapper[4730]: I0320 16:03:11.204821    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 20 16:03:11 crc kubenswrapper[4730]: W0320 16:03:11.206108    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2380321d_63e0_40a2_8ca4_5780cba46259.slice/crio-cdfcbf8df97ae3fb78b268f7faedaffc2024b946380448f550e841b6a860cf3e WatchSource:0}: Error finding container cdfcbf8df97ae3fb78b268f7faedaffc2024b946380448f550e841b6a860cf3e: Status 404 returned error can't find the container with id cdfcbf8df97ae3fb78b268f7faedaffc2024b946380448f550e841b6a860cf3e
Mar 20 16:03:11 crc kubenswrapper[4730]: I0320 16:03:11.329862    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2333c0f3-d6ce-405f-b8c8-755be42ba74b","Type":"ContainerStarted","Data":"beb9c38840f81abf1794ad7a37e5012911824dfd945cddb58ee8a2a4d957578d"}
Mar 20 16:03:11 crc kubenswrapper[4730]: I0320 16:03:11.330340    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 20 16:03:11 crc kubenswrapper[4730]: I0320 16:03:11.332910    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2380321d-63e0-40a2-8ca4-5780cba46259","Type":"ContainerStarted","Data":"cdfcbf8df97ae3fb78b268f7faedaffc2024b946380448f550e841b6a860cf3e"}
Mar 20 16:03:11 crc kubenswrapper[4730]: I0320 16:03:11.365831    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.318650157 podStartE2EDuration="8.365813702s" podCreationTimestamp="2026-03-20 16:03:03 +0000 UTC" firstStartedPulling="2026-03-20 16:03:04.471970169 +0000 UTC m=+1443.685341528" lastFinishedPulling="2026-03-20 16:03:10.519133714 +0000 UTC m=+1449.732505073" observedRunningTime="2026-03-20 16:03:11.357882214 +0000 UTC m=+1450.571253583" watchObservedRunningTime="2026-03-20 16:03:11.365813702 +0000 UTC m=+1450.579185071"
Mar 20 16:03:11 crc kubenswrapper[4730]: I0320 16:03:11.559559    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a875c7b4-22fb-4b91-803c-09a7a439aea1" path="/var/lib/kubelet/pods/a875c7b4-22fb-4b91-803c-09a7a439aea1/volumes"
Mar 20 16:03:12 crc kubenswrapper[4730]: I0320 16:03:12.343777    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2380321d-63e0-40a2-8ca4-5780cba46259","Type":"ContainerStarted","Data":"ba5e35851950f100dad51028dcbb21bbfef539fc12fbfe9a41f99f5a8cf0301b"}
Mar 20 16:03:12 crc kubenswrapper[4730]: I0320 16:03:12.344130    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2380321d-63e0-40a2-8ca4-5780cba46259","Type":"ContainerStarted","Data":"ffc44d330d32f88758a8bfabb235a3b27c595d5f1479c1e6ef4eaf45b82a0cd6"}
Mar 20 16:03:12 crc kubenswrapper[4730]: I0320 16:03:12.373729    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.373709084 podStartE2EDuration="2.373709084s" podCreationTimestamp="2026-03-20 16:03:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:03:12.364059407 +0000 UTC m=+1451.577430786" watchObservedRunningTime="2026-03-20 16:03:12.373709084 +0000 UTC m=+1451.587080443"
Mar 20 16:03:13 crc kubenswrapper[4730]: I0320 16:03:13.714752    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 20 16:03:18 crc kubenswrapper[4730]: I0320 16:03:18.695093    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 20 16:03:18 crc kubenswrapper[4730]: I0320 16:03:18.695622    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 20 16:03:18 crc kubenswrapper[4730]: I0320 16:03:18.714414    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 20 16:03:18 crc kubenswrapper[4730]: I0320 16:03:18.740677    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 20 16:03:19 crc kubenswrapper[4730]: I0320 16:03:19.474615    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 20 16:03:19 crc kubenswrapper[4730]: I0320 16:03:19.709365    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a58f453e-84d8-47b1-8740-406f92c4ca79" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.227:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 20 16:03:19 crc kubenswrapper[4730]: I0320 16:03:19.709411    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a58f453e-84d8-47b1-8740-406f92c4ca79" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.227:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 20 16:03:20 crc kubenswrapper[4730]: I0320 16:03:20.726655    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 20 16:03:20 crc kubenswrapper[4730]: I0320 16:03:20.726999    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 20 16:03:21 crc kubenswrapper[4730]: I0320 16:03:21.740424    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2380321d-63e0-40a2-8ca4-5780cba46259" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.229:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 20 16:03:21 crc kubenswrapper[4730]: I0320 16:03:21.740470    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2380321d-63e0-40a2-8ca4-5780cba46259" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.229:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 20 16:03:26 crc kubenswrapper[4730]: I0320 16:03:26.695169    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 20 16:03:26 crc kubenswrapper[4730]: I0320 16:03:26.695929    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 20 16:03:28 crc kubenswrapper[4730]: I0320 16:03:28.700194    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 20 16:03:28 crc kubenswrapper[4730]: I0320 16:03:28.702552    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 20 16:03:28 crc kubenswrapper[4730]: I0320 16:03:28.706194    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 20 16:03:28 crc kubenswrapper[4730]: I0320 16:03:28.725892    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 20 16:03:28 crc kubenswrapper[4730]: I0320 16:03:28.726870    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 20 16:03:29 crc kubenswrapper[4730]: I0320 16:03:29.554001    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 20 16:03:30 crc kubenswrapper[4730]: I0320 16:03:30.736654    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 20 16:03:30 crc kubenswrapper[4730]: I0320 16:03:30.737180    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 20 16:03:30 crc kubenswrapper[4730]: I0320 16:03:30.748431    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 20 16:03:31 crc kubenswrapper[4730]: I0320 16:03:31.646267    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 20 16:03:34 crc kubenswrapper[4730]: I0320 16:03:34.003073    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 20 16:03:43 crc kubenswrapper[4730]: I0320 16:03:43.762430    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 20 16:03:44 crc kubenswrapper[4730]: I0320 16:03:44.673490    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 20 16:03:47 crc kubenswrapper[4730]: I0320 16:03:47.128146    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="dfd9111c-a9f4-4874-91fc-c0ef68ae09a3" containerName="rabbitmq" containerID="cri-o://3a783d296547ab247634b62ed131b57fa9392453e5aadc95036d56c15ea1686f" gracePeriod=604797
Mar 20 16:03:47 crc kubenswrapper[4730]: I0320 16:03:47.825212    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="8043f69c-832c-4afa-a9b9-211507664805" containerName="rabbitmq" containerID="cri-o://f4ff5614730f4bee870729b9dbea193a82cb2fbf2b64a4650757a12a3469fc3b" gracePeriod=604797
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.752450    4730 generic.go:334] "Generic (PLEG): container finished" podID="dfd9111c-a9f4-4874-91fc-c0ef68ae09a3" containerID="3a783d296547ab247634b62ed131b57fa9392453e5aadc95036d56c15ea1686f" exitCode=0
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.752547    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3","Type":"ContainerDied","Data":"3a783d296547ab247634b62ed131b57fa9392453e5aadc95036d56c15ea1686f"}
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.752870    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3","Type":"ContainerDied","Data":"ddda948591892d2a138c1c22bc7cf5e93ad382615cc9ff618b810cf784bacaf9"}
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.752903    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddda948591892d2a138c1c22bc7cf5e93ad382615cc9ff618b810cf784bacaf9"
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.807567    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.857838    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-config-data\") pod \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") "
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.857918    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") "
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.857981    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-pod-info\") pod \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") "
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.858005    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-rabbitmq-plugins\") pod \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") "
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.858027    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-rabbitmq-confd\") pod \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") "
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.858066    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-rabbitmq-tls\") pod \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") "
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.858127    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-erlang-cookie-secret\") pod \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") "
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.858153    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-plugins-conf\") pod \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") "
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.858200    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-server-conf\") pod \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") "
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.858233    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-rabbitmq-erlang-cookie\") pod \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") "
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.858269    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4jnf\" (UniqueName: \"kubernetes.io/projected/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-kube-api-access-c4jnf\") pod \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\" (UID: \"dfd9111c-a9f4-4874-91fc-c0ef68ae09a3\") "
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.860456    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "dfd9111c-a9f4-4874-91fc-c0ef68ae09a3" (UID: "dfd9111c-a9f4-4874-91fc-c0ef68ae09a3"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.861614    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "dfd9111c-a9f4-4874-91fc-c0ef68ae09a3" (UID: "dfd9111c-a9f4-4874-91fc-c0ef68ae09a3"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.861668    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "dfd9111c-a9f4-4874-91fc-c0ef68ae09a3" (UID: "dfd9111c-a9f4-4874-91fc-c0ef68ae09a3"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.866733    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-kube-api-access-c4jnf" (OuterVolumeSpecName: "kube-api-access-c4jnf") pod "dfd9111c-a9f4-4874-91fc-c0ef68ae09a3" (UID: "dfd9111c-a9f4-4874-91fc-c0ef68ae09a3"). InnerVolumeSpecName "kube-api-access-c4jnf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.867061    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "dfd9111c-a9f4-4874-91fc-c0ef68ae09a3" (UID: "dfd9111c-a9f4-4874-91fc-c0ef68ae09a3"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.868366    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-pod-info" (OuterVolumeSpecName: "pod-info") pod "dfd9111c-a9f4-4874-91fc-c0ef68ae09a3" (UID: "dfd9111c-a9f4-4874-91fc-c0ef68ae09a3"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.883236    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "dfd9111c-a9f4-4874-91fc-c0ef68ae09a3" (UID: "dfd9111c-a9f4-4874-91fc-c0ef68ae09a3"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.892501    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "dfd9111c-a9f4-4874-91fc-c0ef68ae09a3" (UID: "dfd9111c-a9f4-4874-91fc-c0ef68ae09a3"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.898911    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-config-data" (OuterVolumeSpecName: "config-data") pod "dfd9111c-a9f4-4874-91fc-c0ef68ae09a3" (UID: "dfd9111c-a9f4-4874-91fc-c0ef68ae09a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.960853    4730 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" "
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.960885    4730 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-pod-info\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.960895    4730 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.960904    4730 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.960914    4730 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.960925    4730 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-plugins-conf\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.960933    4730 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.960942    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4jnf\" (UniqueName: \"kubernetes.io/projected/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-kube-api-access-c4jnf\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.960950    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.973472    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-server-conf" (OuterVolumeSpecName: "server-conf") pod "dfd9111c-a9f4-4874-91fc-c0ef68ae09a3" (UID: "dfd9111c-a9f4-4874-91fc-c0ef68ae09a3"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:03:48 crc kubenswrapper[4730]: I0320 16:03:48.982963    4730 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.062726    4730 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.063042    4730 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-server-conf\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.065930    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "dfd9111c-a9f4-4874-91fc-c0ef68ae09a3" (UID: "dfd9111c-a9f4-4874-91fc-c0ef68ae09a3"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.164902    4730 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.337719    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.499020    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8043f69c-832c-4afa-a9b9-211507664805-rabbitmq-plugins\") pod \"8043f69c-832c-4afa-a9b9-211507664805\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") "
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.499113    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8043f69c-832c-4afa-a9b9-211507664805-pod-info\") pod \"8043f69c-832c-4afa-a9b9-211507664805\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") "
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.499185    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8043f69c-832c-4afa-a9b9-211507664805-rabbitmq-confd\") pod \"8043f69c-832c-4afa-a9b9-211507664805\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") "
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.499203    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8043f69c-832c-4afa-a9b9-211507664805-plugins-conf\") pod \"8043f69c-832c-4afa-a9b9-211507664805\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") "
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.499267    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8043f69c-832c-4afa-a9b9-211507664805-rabbitmq-erlang-cookie\") pod \"8043f69c-832c-4afa-a9b9-211507664805\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") "
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.499291    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8043f69c-832c-4afa-a9b9-211507664805-rabbitmq-tls\") pod \"8043f69c-832c-4afa-a9b9-211507664805\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") "
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.499339    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8043f69c-832c-4afa-a9b9-211507664805-erlang-cookie-secret\") pod \"8043f69c-832c-4afa-a9b9-211507664805\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") "
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.499363    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8043f69c-832c-4afa-a9b9-211507664805-server-conf\") pod \"8043f69c-832c-4afa-a9b9-211507664805\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") "
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.499380    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"8043f69c-832c-4afa-a9b9-211507664805\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") "
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.499423    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vth2k\" (UniqueName: \"kubernetes.io/projected/8043f69c-832c-4afa-a9b9-211507664805-kube-api-access-vth2k\") pod \"8043f69c-832c-4afa-a9b9-211507664805\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") "
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.499448    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8043f69c-832c-4afa-a9b9-211507664805-config-data\") pod \"8043f69c-832c-4afa-a9b9-211507664805\" (UID: \"8043f69c-832c-4afa-a9b9-211507664805\") "
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.499504    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8043f69c-832c-4afa-a9b9-211507664805-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "8043f69c-832c-4afa-a9b9-211507664805" (UID: "8043f69c-832c-4afa-a9b9-211507664805"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.499853    4730 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8043f69c-832c-4afa-a9b9-211507664805-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.500070    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8043f69c-832c-4afa-a9b9-211507664805-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "8043f69c-832c-4afa-a9b9-211507664805" (UID: "8043f69c-832c-4afa-a9b9-211507664805"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.500212    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8043f69c-832c-4afa-a9b9-211507664805-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "8043f69c-832c-4afa-a9b9-211507664805" (UID: "8043f69c-832c-4afa-a9b9-211507664805"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.503978    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8043f69c-832c-4afa-a9b9-211507664805-kube-api-access-vth2k" (OuterVolumeSpecName: "kube-api-access-vth2k") pod "8043f69c-832c-4afa-a9b9-211507664805" (UID: "8043f69c-832c-4afa-a9b9-211507664805"). InnerVolumeSpecName "kube-api-access-vth2k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.505718    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8043f69c-832c-4afa-a9b9-211507664805-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "8043f69c-832c-4afa-a9b9-211507664805" (UID: "8043f69c-832c-4afa-a9b9-211507664805"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.509215    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8043f69c-832c-4afa-a9b9-211507664805-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "8043f69c-832c-4afa-a9b9-211507664805" (UID: "8043f69c-832c-4afa-a9b9-211507664805"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.511559    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "persistence") pod "8043f69c-832c-4afa-a9b9-211507664805" (UID: "8043f69c-832c-4afa-a9b9-211507664805"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.520600    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/8043f69c-832c-4afa-a9b9-211507664805-pod-info" (OuterVolumeSpecName: "pod-info") pod "8043f69c-832c-4afa-a9b9-211507664805" (UID: "8043f69c-832c-4afa-a9b9-211507664805"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.584441    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8043f69c-832c-4afa-a9b9-211507664805-config-data" (OuterVolumeSpecName: "config-data") pod "8043f69c-832c-4afa-a9b9-211507664805" (UID: "8043f69c-832c-4afa-a9b9-211507664805"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.588821    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8043f69c-832c-4afa-a9b9-211507664805-server-conf" (OuterVolumeSpecName: "server-conf") pod "8043f69c-832c-4afa-a9b9-211507664805" (UID: "8043f69c-832c-4afa-a9b9-211507664805"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.601818    4730 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8043f69c-832c-4afa-a9b9-211507664805-pod-info\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.601861    4730 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8043f69c-832c-4afa-a9b9-211507664805-plugins-conf\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.601880    4730 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8043f69c-832c-4afa-a9b9-211507664805-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.601891    4730 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8043f69c-832c-4afa-a9b9-211507664805-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.601902    4730 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8043f69c-832c-4afa-a9b9-211507664805-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.601913    4730 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8043f69c-832c-4afa-a9b9-211507664805-server-conf\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.601937    4730 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" "
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.601952    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vth2k\" (UniqueName: \"kubernetes.io/projected/8043f69c-832c-4afa-a9b9-211507664805-kube-api-access-vth2k\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.601963    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8043f69c-832c-4afa-a9b9-211507664805-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.634935    4730 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.681012    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8043f69c-832c-4afa-a9b9-211507664805-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "8043f69c-832c-4afa-a9b9-211507664805" (UID: "8043f69c-832c-4afa-a9b9-211507664805"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.703503    4730 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8043f69c-832c-4afa-a9b9-211507664805-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.703537    4730 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\""
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.764072    4730 generic.go:334] "Generic (PLEG): container finished" podID="8043f69c-832c-4afa-a9b9-211507664805" containerID="f4ff5614730f4bee870729b9dbea193a82cb2fbf2b64a4650757a12a3469fc3b" exitCode=0
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.764127    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.764140    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8043f69c-832c-4afa-a9b9-211507664805","Type":"ContainerDied","Data":"f4ff5614730f4bee870729b9dbea193a82cb2fbf2b64a4650757a12a3469fc3b"}
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.764199    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8043f69c-832c-4afa-a9b9-211507664805","Type":"ContainerDied","Data":"8412f9f53f9d92b76fea1c48228bf1b9d922d18161acc37ea8504ee75f4ce219"}
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.764231    4730 scope.go:117] "RemoveContainer" containerID="f4ff5614730f4bee870729b9dbea193a82cb2fbf2b64a4650757a12a3469fc3b"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.764155    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.785576    4730 scope.go:117] "RemoveContainer" containerID="5873082b81a5b9253ac47bf2bf3866502e40b3ccab836a111c3bd8134e015ee5"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.813502    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.827777    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.850004    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.854767    4730 scope.go:117] "RemoveContainer" containerID="f4ff5614730f4bee870729b9dbea193a82cb2fbf2b64a4650757a12a3469fc3b"
Mar 20 16:03:49 crc kubenswrapper[4730]: E0320 16:03:49.855504    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4ff5614730f4bee870729b9dbea193a82cb2fbf2b64a4650757a12a3469fc3b\": container with ID starting with f4ff5614730f4bee870729b9dbea193a82cb2fbf2b64a4650757a12a3469fc3b not found: ID does not exist" containerID="f4ff5614730f4bee870729b9dbea193a82cb2fbf2b64a4650757a12a3469fc3b"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.855555    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4ff5614730f4bee870729b9dbea193a82cb2fbf2b64a4650757a12a3469fc3b"} err="failed to get container status \"f4ff5614730f4bee870729b9dbea193a82cb2fbf2b64a4650757a12a3469fc3b\": rpc error: code = NotFound desc = could not find container \"f4ff5614730f4bee870729b9dbea193a82cb2fbf2b64a4650757a12a3469fc3b\": container with ID starting with f4ff5614730f4bee870729b9dbea193a82cb2fbf2b64a4650757a12a3469fc3b not found: ID does not exist"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.855587    4730 scope.go:117] "RemoveContainer" containerID="5873082b81a5b9253ac47bf2bf3866502e40b3ccab836a111c3bd8134e015ee5"
Mar 20 16:03:49 crc kubenswrapper[4730]: E0320 16:03:49.856163    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5873082b81a5b9253ac47bf2bf3866502e40b3ccab836a111c3bd8134e015ee5\": container with ID starting with 5873082b81a5b9253ac47bf2bf3866502e40b3ccab836a111c3bd8134e015ee5 not found: ID does not exist" containerID="5873082b81a5b9253ac47bf2bf3866502e40b3ccab836a111c3bd8134e015ee5"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.856188    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5873082b81a5b9253ac47bf2bf3866502e40b3ccab836a111c3bd8134e015ee5"} err="failed to get container status \"5873082b81a5b9253ac47bf2bf3866502e40b3ccab836a111c3bd8134e015ee5\": rpc error: code = NotFound desc = could not find container \"5873082b81a5b9253ac47bf2bf3866502e40b3ccab836a111c3bd8134e015ee5\": container with ID starting with 5873082b81a5b9253ac47bf2bf3866502e40b3ccab836a111c3bd8134e015ee5 not found: ID does not exist"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.860885    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.873197    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 20 16:03:49 crc kubenswrapper[4730]: E0320 16:03:49.873717    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfd9111c-a9f4-4874-91fc-c0ef68ae09a3" containerName="rabbitmq"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.873746    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfd9111c-a9f4-4874-91fc-c0ef68ae09a3" containerName="rabbitmq"
Mar 20 16:03:49 crc kubenswrapper[4730]: E0320 16:03:49.873784    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfd9111c-a9f4-4874-91fc-c0ef68ae09a3" containerName="setup-container"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.873794    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfd9111c-a9f4-4874-91fc-c0ef68ae09a3" containerName="setup-container"
Mar 20 16:03:49 crc kubenswrapper[4730]: E0320 16:03:49.873815    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8043f69c-832c-4afa-a9b9-211507664805" containerName="rabbitmq"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.873823    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="8043f69c-832c-4afa-a9b9-211507664805" containerName="rabbitmq"
Mar 20 16:03:49 crc kubenswrapper[4730]: E0320 16:03:49.873843    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8043f69c-832c-4afa-a9b9-211507664805" containerName="setup-container"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.873867    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="8043f69c-832c-4afa-a9b9-211507664805" containerName="setup-container"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.874103    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfd9111c-a9f4-4874-91fc-c0ef68ae09a3" containerName="rabbitmq"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.874142    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="8043f69c-832c-4afa-a9b9-211507664805" containerName="rabbitmq"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.875480    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.887868    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.887879    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.888037    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.888227    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.888453    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.888555    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.888649    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.888813    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-rwcvj"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.908651    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.917898    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.924341    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.937809    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.941997    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.942441    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-dlhcb"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.942587    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.942636    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.942640    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Mar 20 16:03:49 crc kubenswrapper[4730]: I0320 16:03:49.973474    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.010236    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/707f8f93-76f2-4472-a015-5dccae194c5e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.010351    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/707f8f93-76f2-4472-a015-5dccae194c5e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.010395    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpvv4\" (UniqueName: \"kubernetes.io/projected/707f8f93-76f2-4472-a015-5dccae194c5e-kube-api-access-qpvv4\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.010419    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/707f8f93-76f2-4472-a015-5dccae194c5e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.010442    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b92f799a-be4e-45a1-9e2e-c93c4992c9ce-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.010590    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b92f799a-be4e-45a1-9e2e-c93c4992c9ce-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.010650    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.010707    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b92f799a-be4e-45a1-9e2e-c93c4992c9ce-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.010758    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.010919    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b92f799a-be4e-45a1-9e2e-c93c4992c9ce-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.010956    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/707f8f93-76f2-4472-a015-5dccae194c5e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.011079    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b92f799a-be4e-45a1-9e2e-c93c4992c9ce-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.011116    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b92f799a-be4e-45a1-9e2e-c93c4992c9ce-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.011137    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt52r\" (UniqueName: \"kubernetes.io/projected/b92f799a-be4e-45a1-9e2e-c93c4992c9ce-kube-api-access-kt52r\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.011159    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b92f799a-be4e-45a1-9e2e-c93c4992c9ce-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.011186    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/707f8f93-76f2-4472-a015-5dccae194c5e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.011204    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b92f799a-be4e-45a1-9e2e-c93c4992c9ce-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.011236    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/707f8f93-76f2-4472-a015-5dccae194c5e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.011309    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/707f8f93-76f2-4472-a015-5dccae194c5e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.011338    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/707f8f93-76f2-4472-a015-5dccae194c5e-config-data\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.011355    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/707f8f93-76f2-4472-a015-5dccae194c5e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.011377    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b92f799a-be4e-45a1-9e2e-c93c4992c9ce-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.113564    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b92f799a-be4e-45a1-9e2e-c93c4992c9ce-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.113611    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b92f799a-be4e-45a1-9e2e-c93c4992c9ce-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.113629    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt52r\" (UniqueName: \"kubernetes.io/projected/b92f799a-be4e-45a1-9e2e-c93c4992c9ce-kube-api-access-kt52r\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.113649    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b92f799a-be4e-45a1-9e2e-c93c4992c9ce-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.113672    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/707f8f93-76f2-4472-a015-5dccae194c5e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.113690    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b92f799a-be4e-45a1-9e2e-c93c4992c9ce-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.113704    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/707f8f93-76f2-4472-a015-5dccae194c5e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.113724    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/707f8f93-76f2-4472-a015-5dccae194c5e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.114226    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/707f8f93-76f2-4472-a015-5dccae194c5e-config-data\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.114375    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/707f8f93-76f2-4472-a015-5dccae194c5e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.114604    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b92f799a-be4e-45a1-9e2e-c93c4992c9ce-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.114756    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/707f8f93-76f2-4472-a015-5dccae194c5e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.114846    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/707f8f93-76f2-4472-a015-5dccae194c5e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.114962    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpvv4\" (UniqueName: \"kubernetes.io/projected/707f8f93-76f2-4472-a015-5dccae194c5e-kube-api-access-qpvv4\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.115041    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/707f8f93-76f2-4472-a015-5dccae194c5e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.115115    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b92f799a-be4e-45a1-9e2e-c93c4992c9ce-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.115228    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b92f799a-be4e-45a1-9e2e-c93c4992c9ce-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.115415    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.115519    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b92f799a-be4e-45a1-9e2e-c93c4992c9ce-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.115610    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.115805    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b92f799a-be4e-45a1-9e2e-c93c4992c9ce-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.115907    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/707f8f93-76f2-4472-a015-5dccae194c5e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.116427    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b92f799a-be4e-45a1-9e2e-c93c4992c9ce-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.117298    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b92f799a-be4e-45a1-9e2e-c93c4992c9ce-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.117529    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b92f799a-be4e-45a1-9e2e-c93c4992c9ce-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.117564    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/707f8f93-76f2-4472-a015-5dccae194c5e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.117646    4730 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.117662    4730 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.117953    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b92f799a-be4e-45a1-9e2e-c93c4992c9ce-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.117967    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b92f799a-be4e-45a1-9e2e-c93c4992c9ce-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.119530    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/707f8f93-76f2-4472-a015-5dccae194c5e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.117535    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/707f8f93-76f2-4472-a015-5dccae194c5e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.117661    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b92f799a-be4e-45a1-9e2e-c93c4992c9ce-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.119951    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/707f8f93-76f2-4472-a015-5dccae194c5e-config-data\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.120158    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/707f8f93-76f2-4472-a015-5dccae194c5e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.120452    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b92f799a-be4e-45a1-9e2e-c93c4992c9ce-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.121966    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/707f8f93-76f2-4472-a015-5dccae194c5e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.123993    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b92f799a-be4e-45a1-9e2e-c93c4992c9ce-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.124480    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/707f8f93-76f2-4472-a015-5dccae194c5e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.125107    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/707f8f93-76f2-4472-a015-5dccae194c5e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.125882    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/707f8f93-76f2-4472-a015-5dccae194c5e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.125988    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b92f799a-be4e-45a1-9e2e-c93c4992c9ce-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.139811    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpvv4\" (UniqueName: \"kubernetes.io/projected/707f8f93-76f2-4472-a015-5dccae194c5e-kube-api-access-qpvv4\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.139983    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt52r\" (UniqueName: \"kubernetes.io/projected/b92f799a-be4e-45a1-9e2e-c93c4992c9ce-kube-api-access-kt52r\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.164970    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b92f799a-be4e-45a1-9e2e-c93c4992c9ce\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.176818    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"707f8f93-76f2-4472-a015-5dccae194c5e\") " pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.196715    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.246114    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.803395    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 20 16:03:50 crc kubenswrapper[4730]: I0320 16:03:50.908274    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 20 16:03:51 crc kubenswrapper[4730]: I0320 16:03:51.545325    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8043f69c-832c-4afa-a9b9-211507664805" path="/var/lib/kubelet/pods/8043f69c-832c-4afa-a9b9-211507664805/volumes"
Mar 20 16:03:51 crc kubenswrapper[4730]: I0320 16:03:51.546482    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfd9111c-a9f4-4874-91fc-c0ef68ae09a3" path="/var/lib/kubelet/pods/dfd9111c-a9f4-4874-91fc-c0ef68ae09a3/volumes"
Mar 20 16:03:51 crc kubenswrapper[4730]: I0320 16:03:51.785789    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"707f8f93-76f2-4472-a015-5dccae194c5e","Type":"ContainerStarted","Data":"53e55d748e88630ace3309f85c281512f29a48784cd77391059785f2314b3d4c"}
Mar 20 16:03:51 crc kubenswrapper[4730]: I0320 16:03:51.787174    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b92f799a-be4e-45a1-9e2e-c93c4992c9ce","Type":"ContainerStarted","Data":"2d8da0f1c654d6ae5130aa10ef577327b9906a2dbf7fe48ced9b5c8e74d81bae"}
Mar 20 16:03:52 crc kubenswrapper[4730]: I0320 16:03:52.812204    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"707f8f93-76f2-4472-a015-5dccae194c5e","Type":"ContainerStarted","Data":"de03597ab9cfb6693189c9786007baca422ddc54452cd14a6d93d946dbc0292f"}
Mar 20 16:03:52 crc kubenswrapper[4730]: I0320 16:03:52.814419    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b92f799a-be4e-45a1-9e2e-c93c4992c9ce","Type":"ContainerStarted","Data":"1e29d00f6f29fa8f0367c00832e311cff753dbf51d6fb8d2e00ec1e4fe83f33b"}
Mar 20 16:03:59 crc kubenswrapper[4730]: I0320 16:03:59.439667    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-559876945-l7ht2"]
Mar 20 16:03:59 crc kubenswrapper[4730]: I0320 16:03:59.442950    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-559876945-l7ht2"
Mar 20 16:03:59 crc kubenswrapper[4730]: I0320 16:03:59.445194    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam"
Mar 20 16:03:59 crc kubenswrapper[4730]: I0320 16:03:59.491751    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-559876945-l7ht2"]
Mar 20 16:03:59 crc kubenswrapper[4730]: I0320 16:03:59.607332    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-ovsdbserver-nb\") pod \"dnsmasq-dns-559876945-l7ht2\" (UID: \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\") " pod="openstack/dnsmasq-dns-559876945-l7ht2"
Mar 20 16:03:59 crc kubenswrapper[4730]: I0320 16:03:59.607402    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-dns-svc\") pod \"dnsmasq-dns-559876945-l7ht2\" (UID: \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\") " pod="openstack/dnsmasq-dns-559876945-l7ht2"
Mar 20 16:03:59 crc kubenswrapper[4730]: I0320 16:03:59.607456    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-openstack-edpm-ipam\") pod \"dnsmasq-dns-559876945-l7ht2\" (UID: \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\") " pod="openstack/dnsmasq-dns-559876945-l7ht2"
Mar 20 16:03:59 crc kubenswrapper[4730]: I0320 16:03:59.607561    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-dns-swift-storage-0\") pod \"dnsmasq-dns-559876945-l7ht2\" (UID: \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\") " pod="openstack/dnsmasq-dns-559876945-l7ht2"
Mar 20 16:03:59 crc kubenswrapper[4730]: I0320 16:03:59.607587    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-config\") pod \"dnsmasq-dns-559876945-l7ht2\" (UID: \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\") " pod="openstack/dnsmasq-dns-559876945-l7ht2"
Mar 20 16:03:59 crc kubenswrapper[4730]: I0320 16:03:59.607622    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g8ft\" (UniqueName: \"kubernetes.io/projected/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-kube-api-access-6g8ft\") pod \"dnsmasq-dns-559876945-l7ht2\" (UID: \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\") " pod="openstack/dnsmasq-dns-559876945-l7ht2"
Mar 20 16:03:59 crc kubenswrapper[4730]: I0320 16:03:59.607695    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-ovsdbserver-sb\") pod \"dnsmasq-dns-559876945-l7ht2\" (UID: \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\") " pod="openstack/dnsmasq-dns-559876945-l7ht2"
Mar 20 16:03:59 crc kubenswrapper[4730]: I0320 16:03:59.710035    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-ovsdbserver-nb\") pod \"dnsmasq-dns-559876945-l7ht2\" (UID: \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\") " pod="openstack/dnsmasq-dns-559876945-l7ht2"
Mar 20 16:03:59 crc kubenswrapper[4730]: I0320 16:03:59.710398    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-dns-svc\") pod \"dnsmasq-dns-559876945-l7ht2\" (UID: \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\") " pod="openstack/dnsmasq-dns-559876945-l7ht2"
Mar 20 16:03:59 crc kubenswrapper[4730]: I0320 16:03:59.710531    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-openstack-edpm-ipam\") pod \"dnsmasq-dns-559876945-l7ht2\" (UID: \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\") " pod="openstack/dnsmasq-dns-559876945-l7ht2"
Mar 20 16:03:59 crc kubenswrapper[4730]: I0320 16:03:59.710681    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-dns-swift-storage-0\") pod \"dnsmasq-dns-559876945-l7ht2\" (UID: \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\") " pod="openstack/dnsmasq-dns-559876945-l7ht2"
Mar 20 16:03:59 crc kubenswrapper[4730]: I0320 16:03:59.710755    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-config\") pod \"dnsmasq-dns-559876945-l7ht2\" (UID: \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\") " pod="openstack/dnsmasq-dns-559876945-l7ht2"
Mar 20 16:03:59 crc kubenswrapper[4730]: I0320 16:03:59.710838    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g8ft\" (UniqueName: \"kubernetes.io/projected/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-kube-api-access-6g8ft\") pod \"dnsmasq-dns-559876945-l7ht2\" (UID: \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\") " pod="openstack/dnsmasq-dns-559876945-l7ht2"
Mar 20 16:03:59 crc kubenswrapper[4730]: I0320 16:03:59.710989    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-ovsdbserver-sb\") pod \"dnsmasq-dns-559876945-l7ht2\" (UID: \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\") " pod="openstack/dnsmasq-dns-559876945-l7ht2"
Mar 20 16:03:59 crc kubenswrapper[4730]: I0320 16:03:59.711563    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-ovsdbserver-nb\") pod \"dnsmasq-dns-559876945-l7ht2\" (UID: \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\") " pod="openstack/dnsmasq-dns-559876945-l7ht2"
Mar 20 16:03:59 crc kubenswrapper[4730]: I0320 16:03:59.711585    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-openstack-edpm-ipam\") pod \"dnsmasq-dns-559876945-l7ht2\" (UID: \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\") " pod="openstack/dnsmasq-dns-559876945-l7ht2"
Mar 20 16:03:59 crc kubenswrapper[4730]: I0320 16:03:59.711601    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-dns-swift-storage-0\") pod \"dnsmasq-dns-559876945-l7ht2\" (UID: \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\") " pod="openstack/dnsmasq-dns-559876945-l7ht2"
Mar 20 16:03:59 crc kubenswrapper[4730]: I0320 16:03:59.711691    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-dns-svc\") pod \"dnsmasq-dns-559876945-l7ht2\" (UID: \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\") " pod="openstack/dnsmasq-dns-559876945-l7ht2"
Mar 20 16:03:59 crc kubenswrapper[4730]: I0320 16:03:59.712613    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-ovsdbserver-sb\") pod \"dnsmasq-dns-559876945-l7ht2\" (UID: \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\") " pod="openstack/dnsmasq-dns-559876945-l7ht2"
Mar 20 16:03:59 crc kubenswrapper[4730]: I0320 16:03:59.713750    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-config\") pod \"dnsmasq-dns-559876945-l7ht2\" (UID: \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\") " pod="openstack/dnsmasq-dns-559876945-l7ht2"
Mar 20 16:03:59 crc kubenswrapper[4730]: I0320 16:03:59.730741    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g8ft\" (UniqueName: \"kubernetes.io/projected/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-kube-api-access-6g8ft\") pod \"dnsmasq-dns-559876945-l7ht2\" (UID: \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\") " pod="openstack/dnsmasq-dns-559876945-l7ht2"
Mar 20 16:03:59 crc kubenswrapper[4730]: I0320 16:03:59.801368    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-559876945-l7ht2"
Mar 20 16:04:00 crc kubenswrapper[4730]: I0320 16:04:00.130851    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567044-nw9nk"]
Mar 20 16:04:00 crc kubenswrapper[4730]: I0320 16:04:00.132929    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567044-nw9nk"
Mar 20 16:04:00 crc kubenswrapper[4730]: I0320 16:04:00.139371    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:04:00 crc kubenswrapper[4730]: I0320 16:04:00.139746    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 16:04:00 crc kubenswrapper[4730]: I0320 16:04:00.139982    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:04:00 crc kubenswrapper[4730]: I0320 16:04:00.142051    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567044-nw9nk"]
Mar 20 16:04:00 crc kubenswrapper[4730]: I0320 16:04:00.318563    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-559876945-l7ht2"]
Mar 20 16:04:00 crc kubenswrapper[4730]: I0320 16:04:00.322489    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz7j8\" (UniqueName: \"kubernetes.io/projected/44fa3d60-826d-4b59-b44a-0102f155b586-kube-api-access-mz7j8\") pod \"auto-csr-approver-29567044-nw9nk\" (UID: \"44fa3d60-826d-4b59-b44a-0102f155b586\") " pod="openshift-infra/auto-csr-approver-29567044-nw9nk"
Mar 20 16:04:00 crc kubenswrapper[4730]: I0320 16:04:00.424900    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz7j8\" (UniqueName: \"kubernetes.io/projected/44fa3d60-826d-4b59-b44a-0102f155b586-kube-api-access-mz7j8\") pod \"auto-csr-approver-29567044-nw9nk\" (UID: \"44fa3d60-826d-4b59-b44a-0102f155b586\") " pod="openshift-infra/auto-csr-approver-29567044-nw9nk"
Mar 20 16:04:00 crc kubenswrapper[4730]: I0320 16:04:00.446482    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz7j8\" (UniqueName: \"kubernetes.io/projected/44fa3d60-826d-4b59-b44a-0102f155b586-kube-api-access-mz7j8\") pod \"auto-csr-approver-29567044-nw9nk\" (UID: \"44fa3d60-826d-4b59-b44a-0102f155b586\") " pod="openshift-infra/auto-csr-approver-29567044-nw9nk"
Mar 20 16:04:00 crc kubenswrapper[4730]: I0320 16:04:00.467075    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567044-nw9nk"
Mar 20 16:04:00 crc kubenswrapper[4730]: I0320 16:04:00.891176    4730 generic.go:334] "Generic (PLEG): container finished" podID="99a6ce04-c06b-4fb6-84d6-a836cc82d87a" containerID="9d119601c72d61103c3dc16a3891cc72c3cd26d5d7d837176227b7a8dffe1c9a" exitCode=0
Mar 20 16:04:00 crc kubenswrapper[4730]: I0320 16:04:00.891219    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-559876945-l7ht2" event={"ID":"99a6ce04-c06b-4fb6-84d6-a836cc82d87a","Type":"ContainerDied","Data":"9d119601c72d61103c3dc16a3891cc72c3cd26d5d7d837176227b7a8dffe1c9a"}
Mar 20 16:04:00 crc kubenswrapper[4730]: I0320 16:04:00.891782    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-559876945-l7ht2" event={"ID":"99a6ce04-c06b-4fb6-84d6-a836cc82d87a","Type":"ContainerStarted","Data":"81e53bbf904ba77b9ef44d62d3e4f4d28f5a6cda01dd315128148262131a319a"}
Mar 20 16:04:00 crc kubenswrapper[4730]: I0320 16:04:00.954200    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567044-nw9nk"]
Mar 20 16:04:00 crc kubenswrapper[4730]: W0320 16:04:00.955421    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44fa3d60_826d_4b59_b44a_0102f155b586.slice/crio-6391bff5eb60c16800aba1e6aa157311c040dbb581d96962fb5337dd149a2c31 WatchSource:0}: Error finding container 6391bff5eb60c16800aba1e6aa157311c040dbb581d96962fb5337dd149a2c31: Status 404 returned error can't find the container with id 6391bff5eb60c16800aba1e6aa157311c040dbb581d96962fb5337dd149a2c31
Mar 20 16:04:00 crc kubenswrapper[4730]: I0320 16:04:00.957568    4730 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 16:04:01 crc kubenswrapper[4730]: I0320 16:04:01.915625    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-559876945-l7ht2" event={"ID":"99a6ce04-c06b-4fb6-84d6-a836cc82d87a","Type":"ContainerStarted","Data":"1cf3b8c0059e1e31ef949cf289521c33a2ceeb42e33377927e55f4fd8a97b5b3"}
Mar 20 16:04:01 crc kubenswrapper[4730]: I0320 16:04:01.916325    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-559876945-l7ht2"
Mar 20 16:04:01 crc kubenswrapper[4730]: I0320 16:04:01.916895    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567044-nw9nk" event={"ID":"44fa3d60-826d-4b59-b44a-0102f155b586","Type":"ContainerStarted","Data":"6391bff5eb60c16800aba1e6aa157311c040dbb581d96962fb5337dd149a2c31"}
Mar 20 16:04:01 crc kubenswrapper[4730]: I0320 16:04:01.950435    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-559876945-l7ht2" podStartSLOduration=2.950415629 podStartE2EDuration="2.950415629s" podCreationTimestamp="2026-03-20 16:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:01.940786913 +0000 UTC m=+1501.154158292" watchObservedRunningTime="2026-03-20 16:04:01.950415629 +0000 UTC m=+1501.163786998"
Mar 20 16:04:02 crc kubenswrapper[4730]: I0320 16:04:02.928042    4730 generic.go:334] "Generic (PLEG): container finished" podID="44fa3d60-826d-4b59-b44a-0102f155b586" containerID="3ad79a5f57b1a4c7b377fb15d13f7708e0e00b53bbc48929b06820bc137a571e" exitCode=0
Mar 20 16:04:02 crc kubenswrapper[4730]: I0320 16:04:02.928292    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567044-nw9nk" event={"ID":"44fa3d60-826d-4b59-b44a-0102f155b586","Type":"ContainerDied","Data":"3ad79a5f57b1a4c7b377fb15d13f7708e0e00b53bbc48929b06820bc137a571e"}
Mar 20 16:04:04 crc kubenswrapper[4730]: I0320 16:04:04.298443    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567044-nw9nk"
Mar 20 16:04:04 crc kubenswrapper[4730]: I0320 16:04:04.413083    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mz7j8\" (UniqueName: \"kubernetes.io/projected/44fa3d60-826d-4b59-b44a-0102f155b586-kube-api-access-mz7j8\") pod \"44fa3d60-826d-4b59-b44a-0102f155b586\" (UID: \"44fa3d60-826d-4b59-b44a-0102f155b586\") "
Mar 20 16:04:04 crc kubenswrapper[4730]: I0320 16:04:04.420083    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44fa3d60-826d-4b59-b44a-0102f155b586-kube-api-access-mz7j8" (OuterVolumeSpecName: "kube-api-access-mz7j8") pod "44fa3d60-826d-4b59-b44a-0102f155b586" (UID: "44fa3d60-826d-4b59-b44a-0102f155b586"). InnerVolumeSpecName "kube-api-access-mz7j8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:04:04 crc kubenswrapper[4730]: I0320 16:04:04.515526    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mz7j8\" (UniqueName: \"kubernetes.io/projected/44fa3d60-826d-4b59-b44a-0102f155b586-kube-api-access-mz7j8\") on node \"crc\" DevicePath \"\""
Mar 20 16:04:04 crc kubenswrapper[4730]: I0320 16:04:04.989652    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567044-nw9nk" event={"ID":"44fa3d60-826d-4b59-b44a-0102f155b586","Type":"ContainerDied","Data":"6391bff5eb60c16800aba1e6aa157311c040dbb581d96962fb5337dd149a2c31"}
Mar 20 16:04:04 crc kubenswrapper[4730]: I0320 16:04:04.989730    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6391bff5eb60c16800aba1e6aa157311c040dbb581d96962fb5337dd149a2c31"
Mar 20 16:04:04 crc kubenswrapper[4730]: I0320 16:04:04.989791    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567044-nw9nk"
Mar 20 16:04:05 crc kubenswrapper[4730]: I0320 16:04:05.389999    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567038-qvgqb"]
Mar 20 16:04:05 crc kubenswrapper[4730]: I0320 16:04:05.397927    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567038-qvgqb"]
Mar 20 16:04:05 crc kubenswrapper[4730]: I0320 16:04:05.546813    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67854402-4e0e-4ebe-b9d4-700669827780" path="/var/lib/kubelet/pods/67854402-4e0e-4ebe-b9d4-700669827780/volumes"
Mar 20 16:04:09 crc kubenswrapper[4730]: I0320 16:04:09.803192    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-559876945-l7ht2"
Mar 20 16:04:09 crc kubenswrapper[4730]: I0320 16:04:09.907564    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f99bcbd6f-lkgpt"]
Mar 20 16:04:09 crc kubenswrapper[4730]: I0320 16:04:09.907978    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f99bcbd6f-lkgpt" podUID="eee8e670-a743-4284-bde0-5a8a77d8058e" containerName="dnsmasq-dns" containerID="cri-o://b9b68041f5e6d75af1b6ac50a20d8efa0a3cf0ab36fd4d2c2cf619950c911910" gracePeriod=10
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.070523    4730 generic.go:334] "Generic (PLEG): container finished" podID="eee8e670-a743-4284-bde0-5a8a77d8058e" containerID="b9b68041f5e6d75af1b6ac50a20d8efa0a3cf0ab36fd4d2c2cf619950c911910" exitCode=0
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.070804    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f99bcbd6f-lkgpt" event={"ID":"eee8e670-a743-4284-bde0-5a8a77d8058e","Type":"ContainerDied","Data":"b9b68041f5e6d75af1b6ac50a20d8efa0a3cf0ab36fd4d2c2cf619950c911910"}
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.103460    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9449c877-vxfrw"]
Mar 20 16:04:10 crc kubenswrapper[4730]: E0320 16:04:10.104170    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44fa3d60-826d-4b59-b44a-0102f155b586" containerName="oc"
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.104195    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="44fa3d60-826d-4b59-b44a-0102f155b586" containerName="oc"
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.104493    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="44fa3d60-826d-4b59-b44a-0102f155b586" containerName="oc"
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.105634    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9449c877-vxfrw"
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.126033    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9449c877-vxfrw"]
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.249043    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4139a04b-4804-475f-9da3-6c40dad56690-dns-svc\") pod \"dnsmasq-dns-9449c877-vxfrw\" (UID: \"4139a04b-4804-475f-9da3-6c40dad56690\") " pod="openstack/dnsmasq-dns-9449c877-vxfrw"
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.249090    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgsp4\" (UniqueName: \"kubernetes.io/projected/4139a04b-4804-475f-9da3-6c40dad56690-kube-api-access-dgsp4\") pod \"dnsmasq-dns-9449c877-vxfrw\" (UID: \"4139a04b-4804-475f-9da3-6c40dad56690\") " pod="openstack/dnsmasq-dns-9449c877-vxfrw"
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.249109    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4139a04b-4804-475f-9da3-6c40dad56690-config\") pod \"dnsmasq-dns-9449c877-vxfrw\" (UID: \"4139a04b-4804-475f-9da3-6c40dad56690\") " pod="openstack/dnsmasq-dns-9449c877-vxfrw"
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.249127    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4139a04b-4804-475f-9da3-6c40dad56690-ovsdbserver-sb\") pod \"dnsmasq-dns-9449c877-vxfrw\" (UID: \"4139a04b-4804-475f-9da3-6c40dad56690\") " pod="openstack/dnsmasq-dns-9449c877-vxfrw"
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.249183    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4139a04b-4804-475f-9da3-6c40dad56690-openstack-edpm-ipam\") pod \"dnsmasq-dns-9449c877-vxfrw\" (UID: \"4139a04b-4804-475f-9da3-6c40dad56690\") " pod="openstack/dnsmasq-dns-9449c877-vxfrw"
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.249415    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4139a04b-4804-475f-9da3-6c40dad56690-dns-swift-storage-0\") pod \"dnsmasq-dns-9449c877-vxfrw\" (UID: \"4139a04b-4804-475f-9da3-6c40dad56690\") " pod="openstack/dnsmasq-dns-9449c877-vxfrw"
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.249484    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4139a04b-4804-475f-9da3-6c40dad56690-ovsdbserver-nb\") pod \"dnsmasq-dns-9449c877-vxfrw\" (UID: \"4139a04b-4804-475f-9da3-6c40dad56690\") " pod="openstack/dnsmasq-dns-9449c877-vxfrw"
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.351064    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4139a04b-4804-475f-9da3-6c40dad56690-openstack-edpm-ipam\") pod \"dnsmasq-dns-9449c877-vxfrw\" (UID: \"4139a04b-4804-475f-9da3-6c40dad56690\") " pod="openstack/dnsmasq-dns-9449c877-vxfrw"
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.351150    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4139a04b-4804-475f-9da3-6c40dad56690-dns-swift-storage-0\") pod \"dnsmasq-dns-9449c877-vxfrw\" (UID: \"4139a04b-4804-475f-9da3-6c40dad56690\") " pod="openstack/dnsmasq-dns-9449c877-vxfrw"
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.351177    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4139a04b-4804-475f-9da3-6c40dad56690-ovsdbserver-nb\") pod \"dnsmasq-dns-9449c877-vxfrw\" (UID: \"4139a04b-4804-475f-9da3-6c40dad56690\") " pod="openstack/dnsmasq-dns-9449c877-vxfrw"
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.351287    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4139a04b-4804-475f-9da3-6c40dad56690-dns-svc\") pod \"dnsmasq-dns-9449c877-vxfrw\" (UID: \"4139a04b-4804-475f-9da3-6c40dad56690\") " pod="openstack/dnsmasq-dns-9449c877-vxfrw"
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.351324    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgsp4\" (UniqueName: \"kubernetes.io/projected/4139a04b-4804-475f-9da3-6c40dad56690-kube-api-access-dgsp4\") pod \"dnsmasq-dns-9449c877-vxfrw\" (UID: \"4139a04b-4804-475f-9da3-6c40dad56690\") " pod="openstack/dnsmasq-dns-9449c877-vxfrw"
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.351349    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4139a04b-4804-475f-9da3-6c40dad56690-config\") pod \"dnsmasq-dns-9449c877-vxfrw\" (UID: \"4139a04b-4804-475f-9da3-6c40dad56690\") " pod="openstack/dnsmasq-dns-9449c877-vxfrw"
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.351374    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4139a04b-4804-475f-9da3-6c40dad56690-ovsdbserver-sb\") pod \"dnsmasq-dns-9449c877-vxfrw\" (UID: \"4139a04b-4804-475f-9da3-6c40dad56690\") " pod="openstack/dnsmasq-dns-9449c877-vxfrw"
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.352241    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4139a04b-4804-475f-9da3-6c40dad56690-openstack-edpm-ipam\") pod \"dnsmasq-dns-9449c877-vxfrw\" (UID: \"4139a04b-4804-475f-9da3-6c40dad56690\") " pod="openstack/dnsmasq-dns-9449c877-vxfrw"
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.352307    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4139a04b-4804-475f-9da3-6c40dad56690-ovsdbserver-nb\") pod \"dnsmasq-dns-9449c877-vxfrw\" (UID: \"4139a04b-4804-475f-9da3-6c40dad56690\") " pod="openstack/dnsmasq-dns-9449c877-vxfrw"
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.352364    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4139a04b-4804-475f-9da3-6c40dad56690-ovsdbserver-sb\") pod \"dnsmasq-dns-9449c877-vxfrw\" (UID: \"4139a04b-4804-475f-9da3-6c40dad56690\") " pod="openstack/dnsmasq-dns-9449c877-vxfrw"
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.352561    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4139a04b-4804-475f-9da3-6c40dad56690-config\") pod \"dnsmasq-dns-9449c877-vxfrw\" (UID: \"4139a04b-4804-475f-9da3-6c40dad56690\") " pod="openstack/dnsmasq-dns-9449c877-vxfrw"
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.353224    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4139a04b-4804-475f-9da3-6c40dad56690-dns-swift-storage-0\") pod \"dnsmasq-dns-9449c877-vxfrw\" (UID: \"4139a04b-4804-475f-9da3-6c40dad56690\") " pod="openstack/dnsmasq-dns-9449c877-vxfrw"
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.353313    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4139a04b-4804-475f-9da3-6c40dad56690-dns-svc\") pod \"dnsmasq-dns-9449c877-vxfrw\" (UID: \"4139a04b-4804-475f-9da3-6c40dad56690\") " pod="openstack/dnsmasq-dns-9449c877-vxfrw"
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.369871    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgsp4\" (UniqueName: \"kubernetes.io/projected/4139a04b-4804-475f-9da3-6c40dad56690-kube-api-access-dgsp4\") pod \"dnsmasq-dns-9449c877-vxfrw\" (UID: \"4139a04b-4804-475f-9da3-6c40dad56690\") " pod="openstack/dnsmasq-dns-9449c877-vxfrw"
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.434454    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9449c877-vxfrw"
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.459753    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f99bcbd6f-lkgpt"
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.564745    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmzv5\" (UniqueName: \"kubernetes.io/projected/eee8e670-a743-4284-bde0-5a8a77d8058e-kube-api-access-jmzv5\") pod \"eee8e670-a743-4284-bde0-5a8a77d8058e\" (UID: \"eee8e670-a743-4284-bde0-5a8a77d8058e\") "
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.564828    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eee8e670-a743-4284-bde0-5a8a77d8058e-config\") pod \"eee8e670-a743-4284-bde0-5a8a77d8058e\" (UID: \"eee8e670-a743-4284-bde0-5a8a77d8058e\") "
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.564878    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eee8e670-a743-4284-bde0-5a8a77d8058e-dns-svc\") pod \"eee8e670-a743-4284-bde0-5a8a77d8058e\" (UID: \"eee8e670-a743-4284-bde0-5a8a77d8058e\") "
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.564998    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eee8e670-a743-4284-bde0-5a8a77d8058e-ovsdbserver-nb\") pod \"eee8e670-a743-4284-bde0-5a8a77d8058e\" (UID: \"eee8e670-a743-4284-bde0-5a8a77d8058e\") "
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.565138    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eee8e670-a743-4284-bde0-5a8a77d8058e-ovsdbserver-sb\") pod \"eee8e670-a743-4284-bde0-5a8a77d8058e\" (UID: \"eee8e670-a743-4284-bde0-5a8a77d8058e\") "
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.565238    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eee8e670-a743-4284-bde0-5a8a77d8058e-dns-swift-storage-0\") pod \"eee8e670-a743-4284-bde0-5a8a77d8058e\" (UID: \"eee8e670-a743-4284-bde0-5a8a77d8058e\") "
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.591089    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eee8e670-a743-4284-bde0-5a8a77d8058e-kube-api-access-jmzv5" (OuterVolumeSpecName: "kube-api-access-jmzv5") pod "eee8e670-a743-4284-bde0-5a8a77d8058e" (UID: "eee8e670-a743-4284-bde0-5a8a77d8058e"). InnerVolumeSpecName "kube-api-access-jmzv5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.650860    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eee8e670-a743-4284-bde0-5a8a77d8058e-config" (OuterVolumeSpecName: "config") pod "eee8e670-a743-4284-bde0-5a8a77d8058e" (UID: "eee8e670-a743-4284-bde0-5a8a77d8058e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.664018    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eee8e670-a743-4284-bde0-5a8a77d8058e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "eee8e670-a743-4284-bde0-5a8a77d8058e" (UID: "eee8e670-a743-4284-bde0-5a8a77d8058e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.669121    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eee8e670-a743-4284-bde0-5a8a77d8058e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "eee8e670-a743-4284-bde0-5a8a77d8058e" (UID: "eee8e670-a743-4284-bde0-5a8a77d8058e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.674214    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eee8e670-a743-4284-bde0-5a8a77d8058e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eee8e670-a743-4284-bde0-5a8a77d8058e" (UID: "eee8e670-a743-4284-bde0-5a8a77d8058e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.680130    4730 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eee8e670-a743-4284-bde0-5a8a77d8058e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.680241    4730 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eee8e670-a743-4284-bde0-5a8a77d8058e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.680317    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmzv5\" (UniqueName: \"kubernetes.io/projected/eee8e670-a743-4284-bde0-5a8a77d8058e-kube-api-access-jmzv5\") on node \"crc\" DevicePath \"\""
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.680374    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eee8e670-a743-4284-bde0-5a8a77d8058e-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.680431    4730 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eee8e670-a743-4284-bde0-5a8a77d8058e-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.689396    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eee8e670-a743-4284-bde0-5a8a77d8058e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "eee8e670-a743-4284-bde0-5a8a77d8058e" (UID: "eee8e670-a743-4284-bde0-5a8a77d8058e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:04:10 crc kubenswrapper[4730]: I0320 16:04:10.783061    4730 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eee8e670-a743-4284-bde0-5a8a77d8058e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 16:04:11 crc kubenswrapper[4730]: W0320 16:04:11.061983    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4139a04b_4804_475f_9da3_6c40dad56690.slice/crio-7bcd0ba2a7c520ed08c0538dc5a73015e36d26a4b3b4bc92dbbb7058fce4f6a5 WatchSource:0}: Error finding container 7bcd0ba2a7c520ed08c0538dc5a73015e36d26a4b3b4bc92dbbb7058fce4f6a5: Status 404 returned error can't find the container with id 7bcd0ba2a7c520ed08c0538dc5a73015e36d26a4b3b4bc92dbbb7058fce4f6a5
Mar 20 16:04:11 crc kubenswrapper[4730]: I0320 16:04:11.066053    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9449c877-vxfrw"]
Mar 20 16:04:11 crc kubenswrapper[4730]: I0320 16:04:11.085506    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f99bcbd6f-lkgpt" event={"ID":"eee8e670-a743-4284-bde0-5a8a77d8058e","Type":"ContainerDied","Data":"0440fe1ccefaeb24724ee16cfedb746eead11f6c1b7d84e1056a70f539a4c85b"}
Mar 20 16:04:11 crc kubenswrapper[4730]: I0320 16:04:11.085564    4730 scope.go:117] "RemoveContainer" containerID="b9b68041f5e6d75af1b6ac50a20d8efa0a3cf0ab36fd4d2c2cf619950c911910"
Mar 20 16:04:11 crc kubenswrapper[4730]: I0320 16:04:11.085578    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f99bcbd6f-lkgpt"
Mar 20 16:04:11 crc kubenswrapper[4730]: I0320 16:04:11.087176    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9449c877-vxfrw" event={"ID":"4139a04b-4804-475f-9da3-6c40dad56690","Type":"ContainerStarted","Data":"7bcd0ba2a7c520ed08c0538dc5a73015e36d26a4b3b4bc92dbbb7058fce4f6a5"}
Mar 20 16:04:11 crc kubenswrapper[4730]: I0320 16:04:11.247087    4730 scope.go:117] "RemoveContainer" containerID="eb503bcef68144de38c1262fa118798748090de6157da609f58bcca7e2bbdbcc"
Mar 20 16:04:11 crc kubenswrapper[4730]: I0320 16:04:11.278608    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f99bcbd6f-lkgpt"]
Mar 20 16:04:11 crc kubenswrapper[4730]: I0320 16:04:11.291385    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f99bcbd6f-lkgpt"]
Mar 20 16:04:11 crc kubenswrapper[4730]: I0320 16:04:11.546048    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eee8e670-a743-4284-bde0-5a8a77d8058e" path="/var/lib/kubelet/pods/eee8e670-a743-4284-bde0-5a8a77d8058e/volumes"
Mar 20 16:04:12 crc kubenswrapper[4730]: I0320 16:04:12.099547    4730 generic.go:334] "Generic (PLEG): container finished" podID="4139a04b-4804-475f-9da3-6c40dad56690" containerID="65ca3621b3f61bde9805bb83282e2da0767cc223325f4fd92eaef5aa91036539" exitCode=0
Mar 20 16:04:12 crc kubenswrapper[4730]: I0320 16:04:12.099590    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9449c877-vxfrw" event={"ID":"4139a04b-4804-475f-9da3-6c40dad56690","Type":"ContainerDied","Data":"65ca3621b3f61bde9805bb83282e2da0767cc223325f4fd92eaef5aa91036539"}
Mar 20 16:04:12 crc kubenswrapper[4730]: I0320 16:04:12.880571    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:04:12 crc kubenswrapper[4730]: I0320 16:04:12.880620    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:04:13 crc kubenswrapper[4730]: I0320 16:04:13.111426    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9449c877-vxfrw" event={"ID":"4139a04b-4804-475f-9da3-6c40dad56690","Type":"ContainerStarted","Data":"1ffb68f266e414b3bc77b5326ad16b0f15aca2fb7b78309a33ced5f4eef7f1db"}
Mar 20 16:04:13 crc kubenswrapper[4730]: I0320 16:04:13.111581    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9449c877-vxfrw"
Mar 20 16:04:13 crc kubenswrapper[4730]: I0320 16:04:13.136048    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9449c877-vxfrw" podStartSLOduration=3.136029081 podStartE2EDuration="3.136029081s" podCreationTimestamp="2026-03-20 16:04:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:13.128760133 +0000 UTC m=+1512.342131522" watchObservedRunningTime="2026-03-20 16:04:13.136029081 +0000 UTC m=+1512.349400460"
Mar 20 16:04:20 crc kubenswrapper[4730]: I0320 16:04:20.436140    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9449c877-vxfrw"
Mar 20 16:04:20 crc kubenswrapper[4730]: I0320 16:04:20.540492    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-559876945-l7ht2"]
Mar 20 16:04:20 crc kubenswrapper[4730]: I0320 16:04:20.540756    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-559876945-l7ht2" podUID="99a6ce04-c06b-4fb6-84d6-a836cc82d87a" containerName="dnsmasq-dns" containerID="cri-o://1cf3b8c0059e1e31ef949cf289521c33a2ceeb42e33377927e55f4fd8a97b5b3" gracePeriod=10
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.034287    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-559876945-l7ht2"
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.124595    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-ovsdbserver-sb\") pod \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\" (UID: \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\") "
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.124650    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g8ft\" (UniqueName: \"kubernetes.io/projected/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-kube-api-access-6g8ft\") pod \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\" (UID: \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\") "
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.124684    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-dns-swift-storage-0\") pod \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\" (UID: \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\") "
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.124724    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-ovsdbserver-nb\") pod \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\" (UID: \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\") "
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.124764    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-config\") pod \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\" (UID: \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\") "
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.124785    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-dns-svc\") pod \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\" (UID: \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\") "
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.124807    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-openstack-edpm-ipam\") pod \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\" (UID: \"99a6ce04-c06b-4fb6-84d6-a836cc82d87a\") "
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.151140    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-kube-api-access-6g8ft" (OuterVolumeSpecName: "kube-api-access-6g8ft") pod "99a6ce04-c06b-4fb6-84d6-a836cc82d87a" (UID: "99a6ce04-c06b-4fb6-84d6-a836cc82d87a"). InnerVolumeSpecName "kube-api-access-6g8ft". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.181542    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "99a6ce04-c06b-4fb6-84d6-a836cc82d87a" (UID: "99a6ce04-c06b-4fb6-84d6-a836cc82d87a"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.189124    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "99a6ce04-c06b-4fb6-84d6-a836cc82d87a" (UID: "99a6ce04-c06b-4fb6-84d6-a836cc82d87a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.193544    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "99a6ce04-c06b-4fb6-84d6-a836cc82d87a" (UID: "99a6ce04-c06b-4fb6-84d6-a836cc82d87a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.193702    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "99a6ce04-c06b-4fb6-84d6-a836cc82d87a" (UID: "99a6ce04-c06b-4fb6-84d6-a836cc82d87a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.204725    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-config" (OuterVolumeSpecName: "config") pod "99a6ce04-c06b-4fb6-84d6-a836cc82d87a" (UID: "99a6ce04-c06b-4fb6-84d6-a836cc82d87a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.212591    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "99a6ce04-c06b-4fb6-84d6-a836cc82d87a" (UID: "99a6ce04-c06b-4fb6-84d6-a836cc82d87a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.218802    4730 generic.go:334] "Generic (PLEG): container finished" podID="99a6ce04-c06b-4fb6-84d6-a836cc82d87a" containerID="1cf3b8c0059e1e31ef949cf289521c33a2ceeb42e33377927e55f4fd8a97b5b3" exitCode=0
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.218846    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-559876945-l7ht2" event={"ID":"99a6ce04-c06b-4fb6-84d6-a836cc82d87a","Type":"ContainerDied","Data":"1cf3b8c0059e1e31ef949cf289521c33a2ceeb42e33377927e55f4fd8a97b5b3"}
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.218873    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-559876945-l7ht2" event={"ID":"99a6ce04-c06b-4fb6-84d6-a836cc82d87a","Type":"ContainerDied","Data":"81e53bbf904ba77b9ef44d62d3e4f4d28f5a6cda01dd315128148262131a319a"}
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.218890    4730 scope.go:117] "RemoveContainer" containerID="1cf3b8c0059e1e31ef949cf289521c33a2ceeb42e33377927e55f4fd8a97b5b3"
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.219034    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-559876945-l7ht2"
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.227429    4730 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.228421    4730 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.228446    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g8ft\" (UniqueName: \"kubernetes.io/projected/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-kube-api-access-6g8ft\") on node \"crc\" DevicePath \"\""
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.228460    4730 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.228470    4730 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.228480    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.228490    4730 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99a6ce04-c06b-4fb6-84d6-a836cc82d87a-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.266648    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-559876945-l7ht2"]
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.270313    4730 scope.go:117] "RemoveContainer" containerID="9d119601c72d61103c3dc16a3891cc72c3cd26d5d7d837176227b7a8dffe1c9a"
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.277703    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-559876945-l7ht2"]
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.287931    4730 scope.go:117] "RemoveContainer" containerID="1cf3b8c0059e1e31ef949cf289521c33a2ceeb42e33377927e55f4fd8a97b5b3"
Mar 20 16:04:21 crc kubenswrapper[4730]: E0320 16:04:21.288405    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cf3b8c0059e1e31ef949cf289521c33a2ceeb42e33377927e55f4fd8a97b5b3\": container with ID starting with 1cf3b8c0059e1e31ef949cf289521c33a2ceeb42e33377927e55f4fd8a97b5b3 not found: ID does not exist" containerID="1cf3b8c0059e1e31ef949cf289521c33a2ceeb42e33377927e55f4fd8a97b5b3"
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.288447    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cf3b8c0059e1e31ef949cf289521c33a2ceeb42e33377927e55f4fd8a97b5b3"} err="failed to get container status \"1cf3b8c0059e1e31ef949cf289521c33a2ceeb42e33377927e55f4fd8a97b5b3\": rpc error: code = NotFound desc = could not find container \"1cf3b8c0059e1e31ef949cf289521c33a2ceeb42e33377927e55f4fd8a97b5b3\": container with ID starting with 1cf3b8c0059e1e31ef949cf289521c33a2ceeb42e33377927e55f4fd8a97b5b3 not found: ID does not exist"
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.288475    4730 scope.go:117] "RemoveContainer" containerID="9d119601c72d61103c3dc16a3891cc72c3cd26d5d7d837176227b7a8dffe1c9a"
Mar 20 16:04:21 crc kubenswrapper[4730]: E0320 16:04:21.288916    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d119601c72d61103c3dc16a3891cc72c3cd26d5d7d837176227b7a8dffe1c9a\": container with ID starting with 9d119601c72d61103c3dc16a3891cc72c3cd26d5d7d837176227b7a8dffe1c9a not found: ID does not exist" containerID="9d119601c72d61103c3dc16a3891cc72c3cd26d5d7d837176227b7a8dffe1c9a"
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.288957    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d119601c72d61103c3dc16a3891cc72c3cd26d5d7d837176227b7a8dffe1c9a"} err="failed to get container status \"9d119601c72d61103c3dc16a3891cc72c3cd26d5d7d837176227b7a8dffe1c9a\": rpc error: code = NotFound desc = could not find container \"9d119601c72d61103c3dc16a3891cc72c3cd26d5d7d837176227b7a8dffe1c9a\": container with ID starting with 9d119601c72d61103c3dc16a3891cc72c3cd26d5d7d837176227b7a8dffe1c9a not found: ID does not exist"
Mar 20 16:04:21 crc kubenswrapper[4730]: I0320 16:04:21.547659    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99a6ce04-c06b-4fb6-84d6-a836cc82d87a" path="/var/lib/kubelet/pods/99a6ce04-c06b-4fb6-84d6-a836cc82d87a/volumes"
Mar 20 16:04:25 crc kubenswrapper[4730]: I0320 16:04:25.268637    4730 generic.go:334] "Generic (PLEG): container finished" podID="707f8f93-76f2-4472-a015-5dccae194c5e" containerID="de03597ab9cfb6693189c9786007baca422ddc54452cd14a6d93d946dbc0292f" exitCode=0
Mar 20 16:04:25 crc kubenswrapper[4730]: I0320 16:04:25.268743    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"707f8f93-76f2-4472-a015-5dccae194c5e","Type":"ContainerDied","Data":"de03597ab9cfb6693189c9786007baca422ddc54452cd14a6d93d946dbc0292f"}
Mar 20 16:04:25 crc kubenswrapper[4730]: I0320 16:04:25.272872    4730 generic.go:334] "Generic (PLEG): container finished" podID="b92f799a-be4e-45a1-9e2e-c93c4992c9ce" containerID="1e29d00f6f29fa8f0367c00832e311cff753dbf51d6fb8d2e00ec1e4fe83f33b" exitCode=0
Mar 20 16:04:25 crc kubenswrapper[4730]: I0320 16:04:25.272899    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b92f799a-be4e-45a1-9e2e-c93c4992c9ce","Type":"ContainerDied","Data":"1e29d00f6f29fa8f0367c00832e311cff753dbf51d6fb8d2e00ec1e4fe83f33b"}
Mar 20 16:04:26 crc kubenswrapper[4730]: I0320 16:04:26.282106    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"707f8f93-76f2-4472-a015-5dccae194c5e","Type":"ContainerStarted","Data":"3586ba883914e6057094df55473b2fa2d64372ac4b8f14b7b1a92b955e87ee1f"}
Mar 20 16:04:26 crc kubenswrapper[4730]: I0320 16:04:26.283312    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 20 16:04:26 crc kubenswrapper[4730]: I0320 16:04:26.283909    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b92f799a-be4e-45a1-9e2e-c93c4992c9ce","Type":"ContainerStarted","Data":"de3d681419b61f50ed45583f461818df98687096ac5479acda76b8915295f730"}
Mar 20 16:04:26 crc kubenswrapper[4730]: I0320 16:04:26.284282    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:04:26 crc kubenswrapper[4730]: I0320 16:04:26.310422    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.310405316 podStartE2EDuration="37.310405316s" podCreationTimestamp="2026-03-20 16:03:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:26.309467029 +0000 UTC m=+1525.522838398" watchObservedRunningTime="2026-03-20 16:04:26.310405316 +0000 UTC m=+1525.523776685"
Mar 20 16:04:26 crc kubenswrapper[4730]: I0320 16:04:26.345124    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.345103218 podStartE2EDuration="37.345103218s" podCreationTimestamp="2026-03-20 16:03:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:04:26.33956293 +0000 UTC m=+1525.552934299" watchObservedRunningTime="2026-03-20 16:04:26.345103218 +0000 UTC m=+1525.558474587"
Mar 20 16:04:30 crc kubenswrapper[4730]: I0320 16:04:30.956496    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xhvz9"]
Mar 20 16:04:30 crc kubenswrapper[4730]: E0320 16:04:30.957595    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee8e670-a743-4284-bde0-5a8a77d8058e" containerName="init"
Mar 20 16:04:30 crc kubenswrapper[4730]: I0320 16:04:30.957614    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee8e670-a743-4284-bde0-5a8a77d8058e" containerName="init"
Mar 20 16:04:30 crc kubenswrapper[4730]: E0320 16:04:30.957639    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99a6ce04-c06b-4fb6-84d6-a836cc82d87a" containerName="dnsmasq-dns"
Mar 20 16:04:30 crc kubenswrapper[4730]: I0320 16:04:30.957646    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="99a6ce04-c06b-4fb6-84d6-a836cc82d87a" containerName="dnsmasq-dns"
Mar 20 16:04:30 crc kubenswrapper[4730]: E0320 16:04:30.957690    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee8e670-a743-4284-bde0-5a8a77d8058e" containerName="dnsmasq-dns"
Mar 20 16:04:30 crc kubenswrapper[4730]: I0320 16:04:30.957698    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee8e670-a743-4284-bde0-5a8a77d8058e" containerName="dnsmasq-dns"
Mar 20 16:04:30 crc kubenswrapper[4730]: E0320 16:04:30.957712    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99a6ce04-c06b-4fb6-84d6-a836cc82d87a" containerName="init"
Mar 20 16:04:30 crc kubenswrapper[4730]: I0320 16:04:30.957719    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="99a6ce04-c06b-4fb6-84d6-a836cc82d87a" containerName="init"
Mar 20 16:04:30 crc kubenswrapper[4730]: I0320 16:04:30.957957    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="99a6ce04-c06b-4fb6-84d6-a836cc82d87a" containerName="dnsmasq-dns"
Mar 20 16:04:30 crc kubenswrapper[4730]: I0320 16:04:30.957995    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="eee8e670-a743-4284-bde0-5a8a77d8058e" containerName="dnsmasq-dns"
Mar 20 16:04:30 crc kubenswrapper[4730]: I0320 16:04:30.959972    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xhvz9"
Mar 20 16:04:30 crc kubenswrapper[4730]: I0320 16:04:30.973862    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xhvz9"]
Mar 20 16:04:31 crc kubenswrapper[4730]: I0320 16:04:31.114561    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xbl4\" (UniqueName: \"kubernetes.io/projected/42c5867a-e6e4-43d8-8529-e75f856fb943-kube-api-access-6xbl4\") pod \"redhat-operators-xhvz9\" (UID: \"42c5867a-e6e4-43d8-8529-e75f856fb943\") " pod="openshift-marketplace/redhat-operators-xhvz9"
Mar 20 16:04:31 crc kubenswrapper[4730]: I0320 16:04:31.115047    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42c5867a-e6e4-43d8-8529-e75f856fb943-utilities\") pod \"redhat-operators-xhvz9\" (UID: \"42c5867a-e6e4-43d8-8529-e75f856fb943\") " pod="openshift-marketplace/redhat-operators-xhvz9"
Mar 20 16:04:31 crc kubenswrapper[4730]: I0320 16:04:31.115372    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42c5867a-e6e4-43d8-8529-e75f856fb943-catalog-content\") pod \"redhat-operators-xhvz9\" (UID: \"42c5867a-e6e4-43d8-8529-e75f856fb943\") " pod="openshift-marketplace/redhat-operators-xhvz9"
Mar 20 16:04:31 crc kubenswrapper[4730]: I0320 16:04:31.217481    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42c5867a-e6e4-43d8-8529-e75f856fb943-utilities\") pod \"redhat-operators-xhvz9\" (UID: \"42c5867a-e6e4-43d8-8529-e75f856fb943\") " pod="openshift-marketplace/redhat-operators-xhvz9"
Mar 20 16:04:31 crc kubenswrapper[4730]: I0320 16:04:31.217573    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42c5867a-e6e4-43d8-8529-e75f856fb943-catalog-content\") pod \"redhat-operators-xhvz9\" (UID: \"42c5867a-e6e4-43d8-8529-e75f856fb943\") " pod="openshift-marketplace/redhat-operators-xhvz9"
Mar 20 16:04:31 crc kubenswrapper[4730]: I0320 16:04:31.217624    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xbl4\" (UniqueName: \"kubernetes.io/projected/42c5867a-e6e4-43d8-8529-e75f856fb943-kube-api-access-6xbl4\") pod \"redhat-operators-xhvz9\" (UID: \"42c5867a-e6e4-43d8-8529-e75f856fb943\") " pod="openshift-marketplace/redhat-operators-xhvz9"
Mar 20 16:04:31 crc kubenswrapper[4730]: I0320 16:04:31.217987    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42c5867a-e6e4-43d8-8529-e75f856fb943-utilities\") pod \"redhat-operators-xhvz9\" (UID: \"42c5867a-e6e4-43d8-8529-e75f856fb943\") " pod="openshift-marketplace/redhat-operators-xhvz9"
Mar 20 16:04:31 crc kubenswrapper[4730]: I0320 16:04:31.218263    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42c5867a-e6e4-43d8-8529-e75f856fb943-catalog-content\") pod \"redhat-operators-xhvz9\" (UID: \"42c5867a-e6e4-43d8-8529-e75f856fb943\") " pod="openshift-marketplace/redhat-operators-xhvz9"
Mar 20 16:04:31 crc kubenswrapper[4730]: I0320 16:04:31.246079    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xbl4\" (UniqueName: \"kubernetes.io/projected/42c5867a-e6e4-43d8-8529-e75f856fb943-kube-api-access-6xbl4\") pod \"redhat-operators-xhvz9\" (UID: \"42c5867a-e6e4-43d8-8529-e75f856fb943\") " pod="openshift-marketplace/redhat-operators-xhvz9"
Mar 20 16:04:31 crc kubenswrapper[4730]: I0320 16:04:31.323399    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xhvz9"
Mar 20 16:04:31 crc kubenswrapper[4730]: I0320 16:04:31.787773    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xhvz9"]
Mar 20 16:04:32 crc kubenswrapper[4730]: I0320 16:04:32.341734    4730 generic.go:334] "Generic (PLEG): container finished" podID="42c5867a-e6e4-43d8-8529-e75f856fb943" containerID="2e057f74260f665473cbacc2a40a935ef6c07128483ad035e7764c0000379270" exitCode=0
Mar 20 16:04:32 crc kubenswrapper[4730]: I0320 16:04:32.341910    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhvz9" event={"ID":"42c5867a-e6e4-43d8-8529-e75f856fb943","Type":"ContainerDied","Data":"2e057f74260f665473cbacc2a40a935ef6c07128483ad035e7764c0000379270"}
Mar 20 16:04:32 crc kubenswrapper[4730]: I0320 16:04:32.342108    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhvz9" event={"ID":"42c5867a-e6e4-43d8-8529-e75f856fb943","Type":"ContainerStarted","Data":"420ee018ee3dd1e22e20f8b86a79c9d53492851583d49970e0c00652b3d5f76f"}
Mar 20 16:04:32 crc kubenswrapper[4730]: I0320 16:04:32.764011    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg"]
Mar 20 16:04:32 crc kubenswrapper[4730]: I0320 16:04:32.767550    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg"
Mar 20 16:04:32 crc kubenswrapper[4730]: I0320 16:04:32.772728    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vvsxx"
Mar 20 16:04:32 crc kubenswrapper[4730]: I0320 16:04:32.772898    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 16:04:32 crc kubenswrapper[4730]: I0320 16:04:32.772961    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 16:04:32 crc kubenswrapper[4730]: I0320 16:04:32.774283    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 16:04:32 crc kubenswrapper[4730]: I0320 16:04:32.776507    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg"]
Mar 20 16:04:32 crc kubenswrapper[4730]: I0320 16:04:32.953224    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/16667e9d-1075-4c26-8002-61c737a8f76a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg\" (UID: \"16667e9d-1075-4c26-8002-61c737a8f76a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg"
Mar 20 16:04:32 crc kubenswrapper[4730]: I0320 16:04:32.953619    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16667e9d-1075-4c26-8002-61c737a8f76a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg\" (UID: \"16667e9d-1075-4c26-8002-61c737a8f76a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg"
Mar 20 16:04:32 crc kubenswrapper[4730]: I0320 16:04:32.953804    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16667e9d-1075-4c26-8002-61c737a8f76a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg\" (UID: \"16667e9d-1075-4c26-8002-61c737a8f76a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg"
Mar 20 16:04:32 crc kubenswrapper[4730]: I0320 16:04:32.953986    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5psf\" (UniqueName: \"kubernetes.io/projected/16667e9d-1075-4c26-8002-61c737a8f76a-kube-api-access-x5psf\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg\" (UID: \"16667e9d-1075-4c26-8002-61c737a8f76a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg"
Mar 20 16:04:33 crc kubenswrapper[4730]: I0320 16:04:33.056018    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16667e9d-1075-4c26-8002-61c737a8f76a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg\" (UID: \"16667e9d-1075-4c26-8002-61c737a8f76a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg"
Mar 20 16:04:33 crc kubenswrapper[4730]: I0320 16:04:33.057375    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5psf\" (UniqueName: \"kubernetes.io/projected/16667e9d-1075-4c26-8002-61c737a8f76a-kube-api-access-x5psf\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg\" (UID: \"16667e9d-1075-4c26-8002-61c737a8f76a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg"
Mar 20 16:04:33 crc kubenswrapper[4730]: I0320 16:04:33.057516    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/16667e9d-1075-4c26-8002-61c737a8f76a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg\" (UID: \"16667e9d-1075-4c26-8002-61c737a8f76a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg"
Mar 20 16:04:33 crc kubenswrapper[4730]: I0320 16:04:33.057576    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16667e9d-1075-4c26-8002-61c737a8f76a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg\" (UID: \"16667e9d-1075-4c26-8002-61c737a8f76a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg"
Mar 20 16:04:33 crc kubenswrapper[4730]: I0320 16:04:33.063127    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16667e9d-1075-4c26-8002-61c737a8f76a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg\" (UID: \"16667e9d-1075-4c26-8002-61c737a8f76a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg"
Mar 20 16:04:33 crc kubenswrapper[4730]: I0320 16:04:33.064997    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16667e9d-1075-4c26-8002-61c737a8f76a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg\" (UID: \"16667e9d-1075-4c26-8002-61c737a8f76a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg"
Mar 20 16:04:33 crc kubenswrapper[4730]: I0320 16:04:33.072550    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5psf\" (UniqueName: \"kubernetes.io/projected/16667e9d-1075-4c26-8002-61c737a8f76a-kube-api-access-x5psf\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg\" (UID: \"16667e9d-1075-4c26-8002-61c737a8f76a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg"
Mar 20 16:04:33 crc kubenswrapper[4730]: I0320 16:04:33.073532    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/16667e9d-1075-4c26-8002-61c737a8f76a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg\" (UID: \"16667e9d-1075-4c26-8002-61c737a8f76a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg"
Mar 20 16:04:33 crc kubenswrapper[4730]: I0320 16:04:33.090498    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg"
Mar 20 16:04:33 crc kubenswrapper[4730]: I0320 16:04:33.678708    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg"]
Mar 20 16:04:34 crc kubenswrapper[4730]: I0320 16:04:34.379642    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhvz9" event={"ID":"42c5867a-e6e4-43d8-8529-e75f856fb943","Type":"ContainerStarted","Data":"61b00a2f85c0359c92c4dd158661d158b1cc1247d0b76c3308b6285914985d68"}
Mar 20 16:04:34 crc kubenswrapper[4730]: I0320 16:04:34.382377    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg" event={"ID":"16667e9d-1075-4c26-8002-61c737a8f76a","Type":"ContainerStarted","Data":"494267ff651b1f0f1a44ff0b53d0791cf31d3778677557978f567145c3808dbc"}
Mar 20 16:04:36 crc kubenswrapper[4730]: I0320 16:04:36.415787    4730 generic.go:334] "Generic (PLEG): container finished" podID="42c5867a-e6e4-43d8-8529-e75f856fb943" containerID="61b00a2f85c0359c92c4dd158661d158b1cc1247d0b76c3308b6285914985d68" exitCode=0
Mar 20 16:04:36 crc kubenswrapper[4730]: I0320 16:04:36.415905    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhvz9" event={"ID":"42c5867a-e6e4-43d8-8529-e75f856fb943","Type":"ContainerDied","Data":"61b00a2f85c0359c92c4dd158661d158b1cc1247d0b76c3308b6285914985d68"}
Mar 20 16:04:40 crc kubenswrapper[4730]: I0320 16:04:40.199330    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="707f8f93-76f2-4472-a015-5dccae194c5e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.230:5671: connect: connection refused"
Mar 20 16:04:40 crc kubenswrapper[4730]: I0320 16:04:40.249508    4730 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="b92f799a-be4e-45a1-9e2e-c93c4992c9ce" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.231:5671: connect: connection refused"
Mar 20 16:04:42 crc kubenswrapper[4730]: I0320 16:04:42.880669    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:04:42 crc kubenswrapper[4730]: I0320 16:04:42.881240    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:04:43 crc kubenswrapper[4730]: I0320 16:04:43.501926    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhvz9" event={"ID":"42c5867a-e6e4-43d8-8529-e75f856fb943","Type":"ContainerStarted","Data":"07f978941b4653bb8671c40ee0f55a53c4f8e766a6251cc1507c3fb977516d33"}
Mar 20 16:04:43 crc kubenswrapper[4730]: I0320 16:04:43.504405    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg" event={"ID":"16667e9d-1075-4c26-8002-61c737a8f76a","Type":"ContainerStarted","Data":"605e5e943174f16d82be8a689b82d05d8b0a01532e03b17f3a5561bc1ac00775"}
Mar 20 16:04:43 crc kubenswrapper[4730]: I0320 16:04:43.520784    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xhvz9" podStartSLOduration=2.672452167 podStartE2EDuration="13.520766117s" podCreationTimestamp="2026-03-20 16:04:30 +0000 UTC" firstStartedPulling="2026-03-20 16:04:32.343629329 +0000 UTC m=+1531.557000698" lastFinishedPulling="2026-03-20 16:04:43.191943279 +0000 UTC m=+1542.405314648" observedRunningTime="2026-03-20 16:04:43.518965426 +0000 UTC m=+1542.732336805" watchObservedRunningTime="2026-03-20 16:04:43.520766117 +0000 UTC m=+1542.734137486"
Mar 20 16:04:43 crc kubenswrapper[4730]: I0320 16:04:43.540060    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg" podStartSLOduration=1.987685247 podStartE2EDuration="11.540030568s" podCreationTimestamp="2026-03-20 16:04:32 +0000 UTC" firstStartedPulling="2026-03-20 16:04:33.685558293 +0000 UTC m=+1532.898929662" lastFinishedPulling="2026-03-20 16:04:43.237903604 +0000 UTC m=+1542.451274983" observedRunningTime="2026-03-20 16:04:43.539091221 +0000 UTC m=+1542.752462610" watchObservedRunningTime="2026-03-20 16:04:43.540030568 +0000 UTC m=+1542.753401947"
Mar 20 16:04:46 crc kubenswrapper[4730]: I0320 16:04:46.192171    4730 scope.go:117] "RemoveContainer" containerID="caec51e4f1b5d91020f11b5970f403cd0356b8c6fa1f260cecf4ea6e449980f1"
Mar 20 16:04:46 crc kubenswrapper[4730]: I0320 16:04:46.275402    4730 scope.go:117] "RemoveContainer" containerID="13985a1e2e3d58d396be0af6437cdcdb0bbdea54308502442707c077b36e9713"
Mar 20 16:04:50 crc kubenswrapper[4730]: I0320 16:04:50.199437    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Mar 20 16:04:50 crc kubenswrapper[4730]: I0320 16:04:50.249516    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Mar 20 16:04:51 crc kubenswrapper[4730]: I0320 16:04:51.324371    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xhvz9"
Mar 20 16:04:51 crc kubenswrapper[4730]: I0320 16:04:51.324710    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xhvz9"
Mar 20 16:04:52 crc kubenswrapper[4730]: I0320 16:04:52.378600    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xhvz9" podUID="42c5867a-e6e4-43d8-8529-e75f856fb943" containerName="registry-server" probeResult="failure" output=<
Mar 20 16:04:52 crc kubenswrapper[4730]:         timeout: failed to connect service ":50051" within 1s
Mar 20 16:04:52 crc kubenswrapper[4730]:  >
Mar 20 16:04:54 crc kubenswrapper[4730]: I0320 16:04:54.611430    4730 generic.go:334] "Generic (PLEG): container finished" podID="16667e9d-1075-4c26-8002-61c737a8f76a" containerID="605e5e943174f16d82be8a689b82d05d8b0a01532e03b17f3a5561bc1ac00775" exitCode=0
Mar 20 16:04:54 crc kubenswrapper[4730]: I0320 16:04:54.611519    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg" event={"ID":"16667e9d-1075-4c26-8002-61c737a8f76a","Type":"ContainerDied","Data":"605e5e943174f16d82be8a689b82d05d8b0a01532e03b17f3a5561bc1ac00775"}
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.447909    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg"
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.545854    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5psf\" (UniqueName: \"kubernetes.io/projected/16667e9d-1075-4c26-8002-61c737a8f76a-kube-api-access-x5psf\") pod \"16667e9d-1075-4c26-8002-61c737a8f76a\" (UID: \"16667e9d-1075-4c26-8002-61c737a8f76a\") "
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.546035    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16667e9d-1075-4c26-8002-61c737a8f76a-repo-setup-combined-ca-bundle\") pod \"16667e9d-1075-4c26-8002-61c737a8f76a\" (UID: \"16667e9d-1075-4c26-8002-61c737a8f76a\") "
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.546078    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16667e9d-1075-4c26-8002-61c737a8f76a-inventory\") pod \"16667e9d-1075-4c26-8002-61c737a8f76a\" (UID: \"16667e9d-1075-4c26-8002-61c737a8f76a\") "
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.546184    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/16667e9d-1075-4c26-8002-61c737a8f76a-ssh-key-openstack-edpm-ipam\") pod \"16667e9d-1075-4c26-8002-61c737a8f76a\" (UID: \"16667e9d-1075-4c26-8002-61c737a8f76a\") "
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.552023    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16667e9d-1075-4c26-8002-61c737a8f76a-kube-api-access-x5psf" (OuterVolumeSpecName: "kube-api-access-x5psf") pod "16667e9d-1075-4c26-8002-61c737a8f76a" (UID: "16667e9d-1075-4c26-8002-61c737a8f76a"). InnerVolumeSpecName "kube-api-access-x5psf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.552747    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16667e9d-1075-4c26-8002-61c737a8f76a-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "16667e9d-1075-4c26-8002-61c737a8f76a" (UID: "16667e9d-1075-4c26-8002-61c737a8f76a"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.577893    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16667e9d-1075-4c26-8002-61c737a8f76a-inventory" (OuterVolumeSpecName: "inventory") pod "16667e9d-1075-4c26-8002-61c737a8f76a" (UID: "16667e9d-1075-4c26-8002-61c737a8f76a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.582997    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16667e9d-1075-4c26-8002-61c737a8f76a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "16667e9d-1075-4c26-8002-61c737a8f76a" (UID: "16667e9d-1075-4c26-8002-61c737a8f76a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.629504    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg" event={"ID":"16667e9d-1075-4c26-8002-61c737a8f76a","Type":"ContainerDied","Data":"494267ff651b1f0f1a44ff0b53d0791cf31d3778677557978f567145c3808dbc"}
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.629551    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg"
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.629562    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="494267ff651b1f0f1a44ff0b53d0791cf31d3778677557978f567145c3808dbc"
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.655994    4730 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/16667e9d-1075-4c26-8002-61c737a8f76a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.656037    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5psf\" (UniqueName: \"kubernetes.io/projected/16667e9d-1075-4c26-8002-61c737a8f76a-kube-api-access-x5psf\") on node \"crc\" DevicePath \"\""
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.656050    4730 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16667e9d-1075-4c26-8002-61c737a8f76a-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.656064    4730 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16667e9d-1075-4c26-8002-61c737a8f76a-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.745639    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-q8dm9"]
Mar 20 16:04:56 crc kubenswrapper[4730]: E0320 16:04:56.746100    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16667e9d-1075-4c26-8002-61c737a8f76a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.746123    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="16667e9d-1075-4c26-8002-61c737a8f76a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.746381    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="16667e9d-1075-4c26-8002-61c737a8f76a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.747089    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q8dm9"
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.749186    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.749402    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.749545    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vvsxx"
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.749709    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.758080    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-q8dm9"]
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.859998    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcgbk\" (UniqueName: \"kubernetes.io/projected/129ce6b6-b215-4ca0-9583-78aae3c2371c-kube-api-access-pcgbk\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-q8dm9\" (UID: \"129ce6b6-b215-4ca0-9583-78aae3c2371c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q8dm9"
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.860061    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/129ce6b6-b215-4ca0-9583-78aae3c2371c-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-q8dm9\" (UID: \"129ce6b6-b215-4ca0-9583-78aae3c2371c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q8dm9"
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.860188    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/129ce6b6-b215-4ca0-9583-78aae3c2371c-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-q8dm9\" (UID: \"129ce6b6-b215-4ca0-9583-78aae3c2371c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q8dm9"
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.961642    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/129ce6b6-b215-4ca0-9583-78aae3c2371c-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-q8dm9\" (UID: \"129ce6b6-b215-4ca0-9583-78aae3c2371c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q8dm9"
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.961745    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcgbk\" (UniqueName: \"kubernetes.io/projected/129ce6b6-b215-4ca0-9583-78aae3c2371c-kube-api-access-pcgbk\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-q8dm9\" (UID: \"129ce6b6-b215-4ca0-9583-78aae3c2371c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q8dm9"
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.961784    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/129ce6b6-b215-4ca0-9583-78aae3c2371c-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-q8dm9\" (UID: \"129ce6b6-b215-4ca0-9583-78aae3c2371c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q8dm9"
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.965402    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/129ce6b6-b215-4ca0-9583-78aae3c2371c-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-q8dm9\" (UID: \"129ce6b6-b215-4ca0-9583-78aae3c2371c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q8dm9"
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.965847    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/129ce6b6-b215-4ca0-9583-78aae3c2371c-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-q8dm9\" (UID: \"129ce6b6-b215-4ca0-9583-78aae3c2371c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q8dm9"
Mar 20 16:04:56 crc kubenswrapper[4730]: I0320 16:04:56.982234    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcgbk\" (UniqueName: \"kubernetes.io/projected/129ce6b6-b215-4ca0-9583-78aae3c2371c-kube-api-access-pcgbk\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-q8dm9\" (UID: \"129ce6b6-b215-4ca0-9583-78aae3c2371c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q8dm9"
Mar 20 16:04:57 crc kubenswrapper[4730]: I0320 16:04:57.063759    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q8dm9"
Mar 20 16:04:57 crc kubenswrapper[4730]: I0320 16:04:57.554977    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-q8dm9"]
Mar 20 16:04:57 crc kubenswrapper[4730]: W0320 16:04:57.560625    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod129ce6b6_b215_4ca0_9583_78aae3c2371c.slice/crio-262a083a2b3b3ee1d0eda393fa4359a321eb491526e40334edf47918d7caf764 WatchSource:0}: Error finding container 262a083a2b3b3ee1d0eda393fa4359a321eb491526e40334edf47918d7caf764: Status 404 returned error can't find the container with id 262a083a2b3b3ee1d0eda393fa4359a321eb491526e40334edf47918d7caf764
Mar 20 16:04:57 crc kubenswrapper[4730]: I0320 16:04:57.640763    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q8dm9" event={"ID":"129ce6b6-b215-4ca0-9583-78aae3c2371c","Type":"ContainerStarted","Data":"262a083a2b3b3ee1d0eda393fa4359a321eb491526e40334edf47918d7caf764"}
Mar 20 16:05:00 crc kubenswrapper[4730]: I0320 16:05:00.677320    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q8dm9" event={"ID":"129ce6b6-b215-4ca0-9583-78aae3c2371c","Type":"ContainerStarted","Data":"7d0ef78deb033d3651878fe486ebe9545f0d71329e31ee70e6f2dbfebae138c9"}
Mar 20 16:05:00 crc kubenswrapper[4730]: I0320 16:05:00.717197    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q8dm9" podStartSLOduration=2.692386946 podStartE2EDuration="4.71717121s" podCreationTimestamp="2026-03-20 16:04:56 +0000 UTC" firstStartedPulling="2026-03-20 16:04:57.564286509 +0000 UTC m=+1556.777657878" lastFinishedPulling="2026-03-20 16:04:59.589070773 +0000 UTC m=+1558.802442142" observedRunningTime="2026-03-20 16:05:00.707410301 +0000 UTC m=+1559.920781680" watchObservedRunningTime="2026-03-20 16:05:00.71717121 +0000 UTC m=+1559.930542599"
Mar 20 16:05:01 crc kubenswrapper[4730]: I0320 16:05:01.385869    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xhvz9"
Mar 20 16:05:01 crc kubenswrapper[4730]: I0320 16:05:01.443783    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xhvz9"
Mar 20 16:05:02 crc kubenswrapper[4730]: I0320 16:05:02.158677    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xhvz9"]
Mar 20 16:05:02 crc kubenswrapper[4730]: I0320 16:05:02.694212    4730 generic.go:334] "Generic (PLEG): container finished" podID="129ce6b6-b215-4ca0-9583-78aae3c2371c" containerID="7d0ef78deb033d3651878fe486ebe9545f0d71329e31ee70e6f2dbfebae138c9" exitCode=0
Mar 20 16:05:02 crc kubenswrapper[4730]: I0320 16:05:02.694303    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q8dm9" event={"ID":"129ce6b6-b215-4ca0-9583-78aae3c2371c","Type":"ContainerDied","Data":"7d0ef78deb033d3651878fe486ebe9545f0d71329e31ee70e6f2dbfebae138c9"}
Mar 20 16:05:02 crc kubenswrapper[4730]: I0320 16:05:02.694482    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xhvz9" podUID="42c5867a-e6e4-43d8-8529-e75f856fb943" containerName="registry-server" containerID="cri-o://07f978941b4653bb8671c40ee0f55a53c4f8e766a6251cc1507c3fb977516d33" gracePeriod=2
Mar 20 16:05:03 crc kubenswrapper[4730]: I0320 16:05:03.212336    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xhvz9"
Mar 20 16:05:03 crc kubenswrapper[4730]: I0320 16:05:03.287317    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42c5867a-e6e4-43d8-8529-e75f856fb943-utilities\") pod \"42c5867a-e6e4-43d8-8529-e75f856fb943\" (UID: \"42c5867a-e6e4-43d8-8529-e75f856fb943\") "
Mar 20 16:05:03 crc kubenswrapper[4730]: I0320 16:05:03.287459    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xbl4\" (UniqueName: \"kubernetes.io/projected/42c5867a-e6e4-43d8-8529-e75f856fb943-kube-api-access-6xbl4\") pod \"42c5867a-e6e4-43d8-8529-e75f856fb943\" (UID: \"42c5867a-e6e4-43d8-8529-e75f856fb943\") "
Mar 20 16:05:03 crc kubenswrapper[4730]: I0320 16:05:03.287752    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42c5867a-e6e4-43d8-8529-e75f856fb943-catalog-content\") pod \"42c5867a-e6e4-43d8-8529-e75f856fb943\" (UID: \"42c5867a-e6e4-43d8-8529-e75f856fb943\") "
Mar 20 16:05:03 crc kubenswrapper[4730]: I0320 16:05:03.288265    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42c5867a-e6e4-43d8-8529-e75f856fb943-utilities" (OuterVolumeSpecName: "utilities") pod "42c5867a-e6e4-43d8-8529-e75f856fb943" (UID: "42c5867a-e6e4-43d8-8529-e75f856fb943"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:05:03 crc kubenswrapper[4730]: I0320 16:05:03.299480    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42c5867a-e6e4-43d8-8529-e75f856fb943-kube-api-access-6xbl4" (OuterVolumeSpecName: "kube-api-access-6xbl4") pod "42c5867a-e6e4-43d8-8529-e75f856fb943" (UID: "42c5867a-e6e4-43d8-8529-e75f856fb943"). InnerVolumeSpecName "kube-api-access-6xbl4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:05:03 crc kubenswrapper[4730]: I0320 16:05:03.390142    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42c5867a-e6e4-43d8-8529-e75f856fb943-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:05:03 crc kubenswrapper[4730]: I0320 16:05:03.390191    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xbl4\" (UniqueName: \"kubernetes.io/projected/42c5867a-e6e4-43d8-8529-e75f856fb943-kube-api-access-6xbl4\") on node \"crc\" DevicePath \"\""
Mar 20 16:05:03 crc kubenswrapper[4730]: I0320 16:05:03.430591    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42c5867a-e6e4-43d8-8529-e75f856fb943-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42c5867a-e6e4-43d8-8529-e75f856fb943" (UID: "42c5867a-e6e4-43d8-8529-e75f856fb943"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:05:03 crc kubenswrapper[4730]: I0320 16:05:03.491653    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42c5867a-e6e4-43d8-8529-e75f856fb943-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 16:05:03 crc kubenswrapper[4730]: I0320 16:05:03.708599    4730 generic.go:334] "Generic (PLEG): container finished" podID="42c5867a-e6e4-43d8-8529-e75f856fb943" containerID="07f978941b4653bb8671c40ee0f55a53c4f8e766a6251cc1507c3fb977516d33" exitCode=0
Mar 20 16:05:03 crc kubenswrapper[4730]: I0320 16:05:03.708670    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhvz9" event={"ID":"42c5867a-e6e4-43d8-8529-e75f856fb943","Type":"ContainerDied","Data":"07f978941b4653bb8671c40ee0f55a53c4f8e766a6251cc1507c3fb977516d33"}
Mar 20 16:05:03 crc kubenswrapper[4730]: I0320 16:05:03.708741    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xhvz9"
Mar 20 16:05:03 crc kubenswrapper[4730]: I0320 16:05:03.709025    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhvz9" event={"ID":"42c5867a-e6e4-43d8-8529-e75f856fb943","Type":"ContainerDied","Data":"420ee018ee3dd1e22e20f8b86a79c9d53492851583d49970e0c00652b3d5f76f"}
Mar 20 16:05:03 crc kubenswrapper[4730]: I0320 16:05:03.709081    4730 scope.go:117] "RemoveContainer" containerID="07f978941b4653bb8671c40ee0f55a53c4f8e766a6251cc1507c3fb977516d33"
Mar 20 16:05:03 crc kubenswrapper[4730]: I0320 16:05:03.737036    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xhvz9"]
Mar 20 16:05:03 crc kubenswrapper[4730]: I0320 16:05:03.745873    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xhvz9"]
Mar 20 16:05:03 crc kubenswrapper[4730]: I0320 16:05:03.746767    4730 scope.go:117] "RemoveContainer" containerID="61b00a2f85c0359c92c4dd158661d158b1cc1247d0b76c3308b6285914985d68"
Mar 20 16:05:03 crc kubenswrapper[4730]: I0320 16:05:03.784229    4730 scope.go:117] "RemoveContainer" containerID="2e057f74260f665473cbacc2a40a935ef6c07128483ad035e7764c0000379270"
Mar 20 16:05:03 crc kubenswrapper[4730]: I0320 16:05:03.828704    4730 scope.go:117] "RemoveContainer" containerID="07f978941b4653bb8671c40ee0f55a53c4f8e766a6251cc1507c3fb977516d33"
Mar 20 16:05:03 crc kubenswrapper[4730]: E0320 16:05:03.829320    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07f978941b4653bb8671c40ee0f55a53c4f8e766a6251cc1507c3fb977516d33\": container with ID starting with 07f978941b4653bb8671c40ee0f55a53c4f8e766a6251cc1507c3fb977516d33 not found: ID does not exist" containerID="07f978941b4653bb8671c40ee0f55a53c4f8e766a6251cc1507c3fb977516d33"
Mar 20 16:05:03 crc kubenswrapper[4730]: I0320 16:05:03.829370    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07f978941b4653bb8671c40ee0f55a53c4f8e766a6251cc1507c3fb977516d33"} err="failed to get container status \"07f978941b4653bb8671c40ee0f55a53c4f8e766a6251cc1507c3fb977516d33\": rpc error: code = NotFound desc = could not find container \"07f978941b4653bb8671c40ee0f55a53c4f8e766a6251cc1507c3fb977516d33\": container with ID starting with 07f978941b4653bb8671c40ee0f55a53c4f8e766a6251cc1507c3fb977516d33 not found: ID does not exist"
Mar 20 16:05:03 crc kubenswrapper[4730]: I0320 16:05:03.829405    4730 scope.go:117] "RemoveContainer" containerID="61b00a2f85c0359c92c4dd158661d158b1cc1247d0b76c3308b6285914985d68"
Mar 20 16:05:03 crc kubenswrapper[4730]: E0320 16:05:03.829835    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61b00a2f85c0359c92c4dd158661d158b1cc1247d0b76c3308b6285914985d68\": container with ID starting with 61b00a2f85c0359c92c4dd158661d158b1cc1247d0b76c3308b6285914985d68 not found: ID does not exist" containerID="61b00a2f85c0359c92c4dd158661d158b1cc1247d0b76c3308b6285914985d68"
Mar 20 16:05:03 crc kubenswrapper[4730]: I0320 16:05:03.829867    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61b00a2f85c0359c92c4dd158661d158b1cc1247d0b76c3308b6285914985d68"} err="failed to get container status \"61b00a2f85c0359c92c4dd158661d158b1cc1247d0b76c3308b6285914985d68\": rpc error: code = NotFound desc = could not find container \"61b00a2f85c0359c92c4dd158661d158b1cc1247d0b76c3308b6285914985d68\": container with ID starting with 61b00a2f85c0359c92c4dd158661d158b1cc1247d0b76c3308b6285914985d68 not found: ID does not exist"
Mar 20 16:05:03 crc kubenswrapper[4730]: I0320 16:05:03.829885    4730 scope.go:117] "RemoveContainer" containerID="2e057f74260f665473cbacc2a40a935ef6c07128483ad035e7764c0000379270"
Mar 20 16:05:03 crc kubenswrapper[4730]: E0320 16:05:03.830304    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e057f74260f665473cbacc2a40a935ef6c07128483ad035e7764c0000379270\": container with ID starting with 2e057f74260f665473cbacc2a40a935ef6c07128483ad035e7764c0000379270 not found: ID does not exist" containerID="2e057f74260f665473cbacc2a40a935ef6c07128483ad035e7764c0000379270"
Mar 20 16:05:03 crc kubenswrapper[4730]: I0320 16:05:03.830346    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e057f74260f665473cbacc2a40a935ef6c07128483ad035e7764c0000379270"} err="failed to get container status \"2e057f74260f665473cbacc2a40a935ef6c07128483ad035e7764c0000379270\": rpc error: code = NotFound desc = could not find container \"2e057f74260f665473cbacc2a40a935ef6c07128483ad035e7764c0000379270\": container with ID starting with 2e057f74260f665473cbacc2a40a935ef6c07128483ad035e7764c0000379270 not found: ID does not exist"
Mar 20 16:05:04 crc kubenswrapper[4730]: I0320 16:05:04.156152    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q8dm9"
Mar 20 16:05:04 crc kubenswrapper[4730]: I0320 16:05:04.331009    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcgbk\" (UniqueName: \"kubernetes.io/projected/129ce6b6-b215-4ca0-9583-78aae3c2371c-kube-api-access-pcgbk\") pod \"129ce6b6-b215-4ca0-9583-78aae3c2371c\" (UID: \"129ce6b6-b215-4ca0-9583-78aae3c2371c\") "
Mar 20 16:05:04 crc kubenswrapper[4730]: I0320 16:05:04.331225    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/129ce6b6-b215-4ca0-9583-78aae3c2371c-inventory\") pod \"129ce6b6-b215-4ca0-9583-78aae3c2371c\" (UID: \"129ce6b6-b215-4ca0-9583-78aae3c2371c\") "
Mar 20 16:05:04 crc kubenswrapper[4730]: I0320 16:05:04.331282    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/129ce6b6-b215-4ca0-9583-78aae3c2371c-ssh-key-openstack-edpm-ipam\") pod \"129ce6b6-b215-4ca0-9583-78aae3c2371c\" (UID: \"129ce6b6-b215-4ca0-9583-78aae3c2371c\") "
Mar 20 16:05:04 crc kubenswrapper[4730]: I0320 16:05:04.337402    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/129ce6b6-b215-4ca0-9583-78aae3c2371c-kube-api-access-pcgbk" (OuterVolumeSpecName: "kube-api-access-pcgbk") pod "129ce6b6-b215-4ca0-9583-78aae3c2371c" (UID: "129ce6b6-b215-4ca0-9583-78aae3c2371c"). InnerVolumeSpecName "kube-api-access-pcgbk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:05:04 crc kubenswrapper[4730]: I0320 16:05:04.360313    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/129ce6b6-b215-4ca0-9583-78aae3c2371c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "129ce6b6-b215-4ca0-9583-78aae3c2371c" (UID: "129ce6b6-b215-4ca0-9583-78aae3c2371c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:05:04 crc kubenswrapper[4730]: I0320 16:05:04.368769    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/129ce6b6-b215-4ca0-9583-78aae3c2371c-inventory" (OuterVolumeSpecName: "inventory") pod "129ce6b6-b215-4ca0-9583-78aae3c2371c" (UID: "129ce6b6-b215-4ca0-9583-78aae3c2371c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:05:04 crc kubenswrapper[4730]: I0320 16:05:04.434267    4730 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/129ce6b6-b215-4ca0-9583-78aae3c2371c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 16:05:04 crc kubenswrapper[4730]: I0320 16:05:04.434920    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcgbk\" (UniqueName: \"kubernetes.io/projected/129ce6b6-b215-4ca0-9583-78aae3c2371c-kube-api-access-pcgbk\") on node \"crc\" DevicePath \"\""
Mar 20 16:05:04 crc kubenswrapper[4730]: I0320 16:05:04.434967    4730 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/129ce6b6-b215-4ca0-9583-78aae3c2371c-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 16:05:04 crc kubenswrapper[4730]: I0320 16:05:04.718805    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q8dm9" event={"ID":"129ce6b6-b215-4ca0-9583-78aae3c2371c","Type":"ContainerDied","Data":"262a083a2b3b3ee1d0eda393fa4359a321eb491526e40334edf47918d7caf764"}
Mar 20 16:05:04 crc kubenswrapper[4730]: I0320 16:05:04.718836    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q8dm9"
Mar 20 16:05:04 crc kubenswrapper[4730]: I0320 16:05:04.718850    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="262a083a2b3b3ee1d0eda393fa4359a321eb491526e40334edf47918d7caf764"
Mar 20 16:05:04 crc kubenswrapper[4730]: I0320 16:05:04.793878    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh"]
Mar 20 16:05:04 crc kubenswrapper[4730]: E0320 16:05:04.794365    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c5867a-e6e4-43d8-8529-e75f856fb943" containerName="registry-server"
Mar 20 16:05:04 crc kubenswrapper[4730]: I0320 16:05:04.794385    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c5867a-e6e4-43d8-8529-e75f856fb943" containerName="registry-server"
Mar 20 16:05:04 crc kubenswrapper[4730]: E0320 16:05:04.794409    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="129ce6b6-b215-4ca0-9583-78aae3c2371c" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:05:04 crc kubenswrapper[4730]: I0320 16:05:04.794418    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="129ce6b6-b215-4ca0-9583-78aae3c2371c" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:05:04 crc kubenswrapper[4730]: E0320 16:05:04.794432    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c5867a-e6e4-43d8-8529-e75f856fb943" containerName="extract-utilities"
Mar 20 16:05:04 crc kubenswrapper[4730]: I0320 16:05:04.794439    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c5867a-e6e4-43d8-8529-e75f856fb943" containerName="extract-utilities"
Mar 20 16:05:04 crc kubenswrapper[4730]: E0320 16:05:04.794466    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c5867a-e6e4-43d8-8529-e75f856fb943" containerName="extract-content"
Mar 20 16:05:04 crc kubenswrapper[4730]: I0320 16:05:04.794473    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c5867a-e6e4-43d8-8529-e75f856fb943" containerName="extract-content"
Mar 20 16:05:04 crc kubenswrapper[4730]: I0320 16:05:04.794720    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="42c5867a-e6e4-43d8-8529-e75f856fb943" containerName="registry-server"
Mar 20 16:05:04 crc kubenswrapper[4730]: I0320 16:05:04.794764    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="129ce6b6-b215-4ca0-9583-78aae3c2371c" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:05:04 crc kubenswrapper[4730]: I0320 16:05:04.795627    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh"
Mar 20 16:05:04 crc kubenswrapper[4730]: I0320 16:05:04.800782    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 16:05:04 crc kubenswrapper[4730]: I0320 16:05:04.801102    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vvsxx"
Mar 20 16:05:04 crc kubenswrapper[4730]: I0320 16:05:04.801881    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 16:05:04 crc kubenswrapper[4730]: I0320 16:05:04.802113    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 16:05:04 crc kubenswrapper[4730]: I0320 16:05:04.815703    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh"]
Mar 20 16:05:04 crc kubenswrapper[4730]: I0320 16:05:04.945988    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c1c649-4459-497e-ba5b-245a4eb5ad04-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh\" (UID: \"73c1c649-4459-497e-ba5b-245a4eb5ad04\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh"
Mar 20 16:05:04 crc kubenswrapper[4730]: I0320 16:05:04.946236    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/73c1c649-4459-497e-ba5b-245a4eb5ad04-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh\" (UID: \"73c1c649-4459-497e-ba5b-245a4eb5ad04\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh"
Mar 20 16:05:04 crc kubenswrapper[4730]: I0320 16:05:04.946543    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc4zf\" (UniqueName: \"kubernetes.io/projected/73c1c649-4459-497e-ba5b-245a4eb5ad04-kube-api-access-nc4zf\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh\" (UID: \"73c1c649-4459-497e-ba5b-245a4eb5ad04\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh"
Mar 20 16:05:04 crc kubenswrapper[4730]: I0320 16:05:04.946598    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73c1c649-4459-497e-ba5b-245a4eb5ad04-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh\" (UID: \"73c1c649-4459-497e-ba5b-245a4eb5ad04\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh"
Mar 20 16:05:05 crc kubenswrapper[4730]: I0320 16:05:05.049583    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c1c649-4459-497e-ba5b-245a4eb5ad04-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh\" (UID: \"73c1c649-4459-497e-ba5b-245a4eb5ad04\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh"
Mar 20 16:05:05 crc kubenswrapper[4730]: I0320 16:05:05.049680    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/73c1c649-4459-497e-ba5b-245a4eb5ad04-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh\" (UID: \"73c1c649-4459-497e-ba5b-245a4eb5ad04\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh"
Mar 20 16:05:05 crc kubenswrapper[4730]: I0320 16:05:05.049766    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc4zf\" (UniqueName: \"kubernetes.io/projected/73c1c649-4459-497e-ba5b-245a4eb5ad04-kube-api-access-nc4zf\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh\" (UID: \"73c1c649-4459-497e-ba5b-245a4eb5ad04\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh"
Mar 20 16:05:05 crc kubenswrapper[4730]: I0320 16:05:05.049792    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73c1c649-4459-497e-ba5b-245a4eb5ad04-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh\" (UID: \"73c1c649-4459-497e-ba5b-245a4eb5ad04\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh"
Mar 20 16:05:05 crc kubenswrapper[4730]: I0320 16:05:05.054276    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c1c649-4459-497e-ba5b-245a4eb5ad04-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh\" (UID: \"73c1c649-4459-497e-ba5b-245a4eb5ad04\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh"
Mar 20 16:05:05 crc kubenswrapper[4730]: I0320 16:05:05.054983    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73c1c649-4459-497e-ba5b-245a4eb5ad04-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh\" (UID: \"73c1c649-4459-497e-ba5b-245a4eb5ad04\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh"
Mar 20 16:05:05 crc kubenswrapper[4730]: I0320 16:05:05.056850    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/73c1c649-4459-497e-ba5b-245a4eb5ad04-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh\" (UID: \"73c1c649-4459-497e-ba5b-245a4eb5ad04\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh"
Mar 20 16:05:05 crc kubenswrapper[4730]: I0320 16:05:05.065554    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc4zf\" (UniqueName: \"kubernetes.io/projected/73c1c649-4459-497e-ba5b-245a4eb5ad04-kube-api-access-nc4zf\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh\" (UID: \"73c1c649-4459-497e-ba5b-245a4eb5ad04\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh"
Mar 20 16:05:05 crc kubenswrapper[4730]: I0320 16:05:05.121523    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh"
Mar 20 16:05:05 crc kubenswrapper[4730]: I0320 16:05:05.550746    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42c5867a-e6e4-43d8-8529-e75f856fb943" path="/var/lib/kubelet/pods/42c5867a-e6e4-43d8-8529-e75f856fb943/volumes"
Mar 20 16:05:05 crc kubenswrapper[4730]: I0320 16:05:05.665210    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh"]
Mar 20 16:05:05 crc kubenswrapper[4730]: I0320 16:05:05.730724    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh" event={"ID":"73c1c649-4459-497e-ba5b-245a4eb5ad04","Type":"ContainerStarted","Data":"cd8b0f41c6bc95bfee5c3d1ca2c479defe33b5a42deb27db28effb10edd49695"}
Mar 20 16:05:06 crc kubenswrapper[4730]: I0320 16:05:06.758646    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh" event={"ID":"73c1c649-4459-497e-ba5b-245a4eb5ad04","Type":"ContainerStarted","Data":"70f50b402151af161dd2f676f7cb4e3b396dcdbae945eab08ae85bd1aeb89352"}
Mar 20 16:05:06 crc kubenswrapper[4730]: I0320 16:05:06.774817    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh" podStartSLOduration=2.106650344 podStartE2EDuration="2.774801161s" podCreationTimestamp="2026-03-20 16:05:04 +0000 UTC" firstStartedPulling="2026-03-20 16:05:05.670412632 +0000 UTC m=+1564.883784001" lastFinishedPulling="2026-03-20 16:05:06.338563459 +0000 UTC m=+1565.551934818" observedRunningTime="2026-03-20 16:05:06.77373829 +0000 UTC m=+1565.987109669" watchObservedRunningTime="2026-03-20 16:05:06.774801161 +0000 UTC m=+1565.988172530"
Mar 20 16:05:12 crc kubenswrapper[4730]: I0320 16:05:12.879821    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:05:12 crc kubenswrapper[4730]: I0320 16:05:12.880734    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:05:12 crc kubenswrapper[4730]: I0320 16:05:12.880851    4730 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf"
Mar 20 16:05:12 crc kubenswrapper[4730]: I0320 16:05:12.882157    4730 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fb7cef3383bd559653e29a00e754a8c3366946a9ed7a655b7b70a7214aec8143"} pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 16:05:12 crc kubenswrapper[4730]: I0320 16:05:12.882230    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" containerID="cri-o://fb7cef3383bd559653e29a00e754a8c3366946a9ed7a655b7b70a7214aec8143" gracePeriod=600
Mar 20 16:05:13 crc kubenswrapper[4730]: I0320 16:05:13.853541    4730 generic.go:334] "Generic (PLEG): container finished" podID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerID="fb7cef3383bd559653e29a00e754a8c3366946a9ed7a655b7b70a7214aec8143" exitCode=0
Mar 20 16:05:13 crc kubenswrapper[4730]: I0320 16:05:13.853870    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerDied","Data":"fb7cef3383bd559653e29a00e754a8c3366946a9ed7a655b7b70a7214aec8143"}
Mar 20 16:05:13 crc kubenswrapper[4730]: I0320 16:05:13.853966    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerStarted","Data":"4a5122c368d04f6b91f59fb7f1f1be3d99580fb45609b2315a001469955ae120"}
Mar 20 16:05:13 crc kubenswrapper[4730]: I0320 16:05:13.853999    4730 scope.go:117] "RemoveContainer" containerID="2aab75ddb2e10e731a7d582f69fae06a40e7e5a6270ff47496bdac5fb9c6ebfd"
Mar 20 16:05:46 crc kubenswrapper[4730]: I0320 16:05:46.414909    4730 scope.go:117] "RemoveContainer" containerID="e7513ea86a1e88bc7e61a8263e52b8513c6fca0f458503f246a5451604a802da"
Mar 20 16:05:46 crc kubenswrapper[4730]: I0320 16:05:46.481610    4730 scope.go:117] "RemoveContainer" containerID="3a783d296547ab247634b62ed131b57fa9392453e5aadc95036d56c15ea1686f"
Mar 20 16:05:46 crc kubenswrapper[4730]: I0320 16:05:46.510110    4730 scope.go:117] "RemoveContainer" containerID="7f523e2f068601b64366f703836d659d597d40e8112309dd07122d42b1769869"
Mar 20 16:06:00 crc kubenswrapper[4730]: I0320 16:06:00.144766    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567046-8f6sv"]
Mar 20 16:06:00 crc kubenswrapper[4730]: I0320 16:06:00.147073    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567046-8f6sv"
Mar 20 16:06:00 crc kubenswrapper[4730]: I0320 16:06:00.149233    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:06:00 crc kubenswrapper[4730]: I0320 16:06:00.149436    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:06:00 crc kubenswrapper[4730]: I0320 16:06:00.150140    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 16:06:00 crc kubenswrapper[4730]: I0320 16:06:00.163416    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567046-8f6sv"]
Mar 20 16:06:00 crc kubenswrapper[4730]: I0320 16:06:00.301481    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2gp6\" (UniqueName: \"kubernetes.io/projected/8dacfdca-1b6e-4336-8089-722d36388128-kube-api-access-d2gp6\") pod \"auto-csr-approver-29567046-8f6sv\" (UID: \"8dacfdca-1b6e-4336-8089-722d36388128\") " pod="openshift-infra/auto-csr-approver-29567046-8f6sv"
Mar 20 16:06:00 crc kubenswrapper[4730]: I0320 16:06:00.403436    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2gp6\" (UniqueName: \"kubernetes.io/projected/8dacfdca-1b6e-4336-8089-722d36388128-kube-api-access-d2gp6\") pod \"auto-csr-approver-29567046-8f6sv\" (UID: \"8dacfdca-1b6e-4336-8089-722d36388128\") " pod="openshift-infra/auto-csr-approver-29567046-8f6sv"
Mar 20 16:06:00 crc kubenswrapper[4730]: I0320 16:06:00.424049    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2gp6\" (UniqueName: \"kubernetes.io/projected/8dacfdca-1b6e-4336-8089-722d36388128-kube-api-access-d2gp6\") pod \"auto-csr-approver-29567046-8f6sv\" (UID: \"8dacfdca-1b6e-4336-8089-722d36388128\") " pod="openshift-infra/auto-csr-approver-29567046-8f6sv"
Mar 20 16:06:00 crc kubenswrapper[4730]: I0320 16:06:00.466012    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567046-8f6sv"
Mar 20 16:06:00 crc kubenswrapper[4730]: I0320 16:06:00.905372    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567046-8f6sv"]
Mar 20 16:06:00 crc kubenswrapper[4730]: W0320 16:06:00.910150    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8dacfdca_1b6e_4336_8089_722d36388128.slice/crio-b73bb10ec72b32b9bfcb5be49e347491430ce3cef3e789d130da7b549ea69c8b WatchSource:0}: Error finding container b73bb10ec72b32b9bfcb5be49e347491430ce3cef3e789d130da7b549ea69c8b: Status 404 returned error can't find the container with id b73bb10ec72b32b9bfcb5be49e347491430ce3cef3e789d130da7b549ea69c8b
Mar 20 16:06:01 crc kubenswrapper[4730]: I0320 16:06:01.343123    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567046-8f6sv" event={"ID":"8dacfdca-1b6e-4336-8089-722d36388128","Type":"ContainerStarted","Data":"b73bb10ec72b32b9bfcb5be49e347491430ce3cef3e789d130da7b549ea69c8b"}
Mar 20 16:06:02 crc kubenswrapper[4730]: I0320 16:06:02.353379    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567046-8f6sv" event={"ID":"8dacfdca-1b6e-4336-8089-722d36388128","Type":"ContainerStarted","Data":"6e67353e8a39d519cf4269a9771ca30ae4c8d30443c293283645c96cf02f2776"}
Mar 20 16:06:02 crc kubenswrapper[4730]: I0320 16:06:02.370940    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567046-8f6sv" podStartSLOduration=1.406102738 podStartE2EDuration="2.370919023s" podCreationTimestamp="2026-03-20 16:06:00 +0000 UTC" firstStartedPulling="2026-03-20 16:06:00.912762323 +0000 UTC m=+1620.126133692" lastFinishedPulling="2026-03-20 16:06:01.877578588 +0000 UTC m=+1621.090949977" observedRunningTime="2026-03-20 16:06:02.367598638 +0000 UTC m=+1621.580970027" watchObservedRunningTime="2026-03-20 16:06:02.370919023 +0000 UTC m=+1621.584290402"
Mar 20 16:06:03 crc kubenswrapper[4730]: I0320 16:06:03.363909    4730 generic.go:334] "Generic (PLEG): container finished" podID="8dacfdca-1b6e-4336-8089-722d36388128" containerID="6e67353e8a39d519cf4269a9771ca30ae4c8d30443c293283645c96cf02f2776" exitCode=0
Mar 20 16:06:03 crc kubenswrapper[4730]: I0320 16:06:03.363974    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567046-8f6sv" event={"ID":"8dacfdca-1b6e-4336-8089-722d36388128","Type":"ContainerDied","Data":"6e67353e8a39d519cf4269a9771ca30ae4c8d30443c293283645c96cf02f2776"}
Mar 20 16:06:04 crc kubenswrapper[4730]: I0320 16:06:04.760588    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567046-8f6sv"
Mar 20 16:06:04 crc kubenswrapper[4730]: I0320 16:06:04.902826    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2gp6\" (UniqueName: \"kubernetes.io/projected/8dacfdca-1b6e-4336-8089-722d36388128-kube-api-access-d2gp6\") pod \"8dacfdca-1b6e-4336-8089-722d36388128\" (UID: \"8dacfdca-1b6e-4336-8089-722d36388128\") "
Mar 20 16:06:04 crc kubenswrapper[4730]: I0320 16:06:04.908215    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dacfdca-1b6e-4336-8089-722d36388128-kube-api-access-d2gp6" (OuterVolumeSpecName: "kube-api-access-d2gp6") pod "8dacfdca-1b6e-4336-8089-722d36388128" (UID: "8dacfdca-1b6e-4336-8089-722d36388128"). InnerVolumeSpecName "kube-api-access-d2gp6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:06:05 crc kubenswrapper[4730]: I0320 16:06:05.005919    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2gp6\" (UniqueName: \"kubernetes.io/projected/8dacfdca-1b6e-4336-8089-722d36388128-kube-api-access-d2gp6\") on node \"crc\" DevicePath \"\""
Mar 20 16:06:05 crc kubenswrapper[4730]: I0320 16:06:05.382590    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567046-8f6sv" event={"ID":"8dacfdca-1b6e-4336-8089-722d36388128","Type":"ContainerDied","Data":"b73bb10ec72b32b9bfcb5be49e347491430ce3cef3e789d130da7b549ea69c8b"}
Mar 20 16:06:05 crc kubenswrapper[4730]: I0320 16:06:05.382642    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b73bb10ec72b32b9bfcb5be49e347491430ce3cef3e789d130da7b549ea69c8b"
Mar 20 16:06:05 crc kubenswrapper[4730]: I0320 16:06:05.382654    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567046-8f6sv"
Mar 20 16:06:05 crc kubenswrapper[4730]: I0320 16:06:05.823930    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567040-2zl4f"]
Mar 20 16:06:05 crc kubenswrapper[4730]: I0320 16:06:05.833176    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567040-2zl4f"]
Mar 20 16:06:07 crc kubenswrapper[4730]: I0320 16:06:07.545171    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97b63a10-b572-4a37-a2a4-079852aa2d3d" path="/var/lib/kubelet/pods/97b63a10-b572-4a37-a2a4-079852aa2d3d/volumes"
Mar 20 16:06:46 crc kubenswrapper[4730]: I0320 16:06:46.644168    4730 scope.go:117] "RemoveContainer" containerID="347fe11ee7c05acba952c1a21fa83ca176c9f921071221e9dbdf6170682cd003"
Mar 20 16:06:46 crc kubenswrapper[4730]: I0320 16:06:46.702431    4730 scope.go:117] "RemoveContainer" containerID="89ef3de4f8d5002494a53f05fdcc4fa61cfc7cf388b35f48076aa3b98fc5e176"
Mar 20 16:06:46 crc kubenswrapper[4730]: I0320 16:06:46.743606    4730 scope.go:117] "RemoveContainer" containerID="859fbb7a55a48ffe4a6d03732d3cc6088c3d226367ded637f27bf24936c41dba"
Mar 20 16:07:42 crc kubenswrapper[4730]: I0320 16:07:42.881950    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:07:42 crc kubenswrapper[4730]: I0320 16:07:42.882455    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:08:00 crc kubenswrapper[4730]: I0320 16:08:00.163959    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567048-nhd7d"]
Mar 20 16:08:00 crc kubenswrapper[4730]: E0320 16:08:00.166763    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dacfdca-1b6e-4336-8089-722d36388128" containerName="oc"
Mar 20 16:08:00 crc kubenswrapper[4730]: I0320 16:08:00.166907    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dacfdca-1b6e-4336-8089-722d36388128" containerName="oc"
Mar 20 16:08:00 crc kubenswrapper[4730]: I0320 16:08:00.167325    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dacfdca-1b6e-4336-8089-722d36388128" containerName="oc"
Mar 20 16:08:00 crc kubenswrapper[4730]: I0320 16:08:00.168431    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567048-nhd7d"
Mar 20 16:08:00 crc kubenswrapper[4730]: I0320 16:08:00.173553    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:08:00 crc kubenswrapper[4730]: I0320 16:08:00.175685    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:08:00 crc kubenswrapper[4730]: I0320 16:08:00.176078    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 16:08:00 crc kubenswrapper[4730]: I0320 16:08:00.180567    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567048-nhd7d"]
Mar 20 16:08:00 crc kubenswrapper[4730]: I0320 16:08:00.202115    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49zn6\" (UniqueName: \"kubernetes.io/projected/28bea13e-dd2a-4ecf-9182-cc639a47c75f-kube-api-access-49zn6\") pod \"auto-csr-approver-29567048-nhd7d\" (UID: \"28bea13e-dd2a-4ecf-9182-cc639a47c75f\") " pod="openshift-infra/auto-csr-approver-29567048-nhd7d"
Mar 20 16:08:00 crc kubenswrapper[4730]: I0320 16:08:00.304266    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49zn6\" (UniqueName: \"kubernetes.io/projected/28bea13e-dd2a-4ecf-9182-cc639a47c75f-kube-api-access-49zn6\") pod \"auto-csr-approver-29567048-nhd7d\" (UID: \"28bea13e-dd2a-4ecf-9182-cc639a47c75f\") " pod="openshift-infra/auto-csr-approver-29567048-nhd7d"
Mar 20 16:08:00 crc kubenswrapper[4730]: I0320 16:08:00.322563    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49zn6\" (UniqueName: \"kubernetes.io/projected/28bea13e-dd2a-4ecf-9182-cc639a47c75f-kube-api-access-49zn6\") pod \"auto-csr-approver-29567048-nhd7d\" (UID: \"28bea13e-dd2a-4ecf-9182-cc639a47c75f\") " pod="openshift-infra/auto-csr-approver-29567048-nhd7d"
Mar 20 16:08:00 crc kubenswrapper[4730]: I0320 16:08:00.510228    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567048-nhd7d"
Mar 20 16:08:01 crc kubenswrapper[4730]: I0320 16:08:00.998319    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567048-nhd7d"]
Mar 20 16:08:01 crc kubenswrapper[4730]: I0320 16:08:01.640653    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567048-nhd7d" event={"ID":"28bea13e-dd2a-4ecf-9182-cc639a47c75f","Type":"ContainerStarted","Data":"17819f0755f55e41f14d87759253422fcd8742a0faa8a7cc476188b2acca2de7"}
Mar 20 16:08:02 crc kubenswrapper[4730]: I0320 16:08:02.652702    4730 generic.go:334] "Generic (PLEG): container finished" podID="28bea13e-dd2a-4ecf-9182-cc639a47c75f" containerID="3781eddb5c4e3f16097f248108ceebb43195728c87b3ad6512e75bc75dffb2bb" exitCode=0
Mar 20 16:08:02 crc kubenswrapper[4730]: I0320 16:08:02.652754    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567048-nhd7d" event={"ID":"28bea13e-dd2a-4ecf-9182-cc639a47c75f","Type":"ContainerDied","Data":"3781eddb5c4e3f16097f248108ceebb43195728c87b3ad6512e75bc75dffb2bb"}
Mar 20 16:08:04 crc kubenswrapper[4730]: I0320 16:08:04.098655    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567048-nhd7d"
Mar 20 16:08:04 crc kubenswrapper[4730]: I0320 16:08:04.193622    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49zn6\" (UniqueName: \"kubernetes.io/projected/28bea13e-dd2a-4ecf-9182-cc639a47c75f-kube-api-access-49zn6\") pod \"28bea13e-dd2a-4ecf-9182-cc639a47c75f\" (UID: \"28bea13e-dd2a-4ecf-9182-cc639a47c75f\") "
Mar 20 16:08:04 crc kubenswrapper[4730]: I0320 16:08:04.198550    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28bea13e-dd2a-4ecf-9182-cc639a47c75f-kube-api-access-49zn6" (OuterVolumeSpecName: "kube-api-access-49zn6") pod "28bea13e-dd2a-4ecf-9182-cc639a47c75f" (UID: "28bea13e-dd2a-4ecf-9182-cc639a47c75f"). InnerVolumeSpecName "kube-api-access-49zn6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:08:04 crc kubenswrapper[4730]: I0320 16:08:04.295993    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49zn6\" (UniqueName: \"kubernetes.io/projected/28bea13e-dd2a-4ecf-9182-cc639a47c75f-kube-api-access-49zn6\") on node \"crc\" DevicePath \"\""
Mar 20 16:08:04 crc kubenswrapper[4730]: I0320 16:08:04.680239    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567048-nhd7d" event={"ID":"28bea13e-dd2a-4ecf-9182-cc639a47c75f","Type":"ContainerDied","Data":"17819f0755f55e41f14d87759253422fcd8742a0faa8a7cc476188b2acca2de7"}
Mar 20 16:08:04 crc kubenswrapper[4730]: I0320 16:08:04.680317    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17819f0755f55e41f14d87759253422fcd8742a0faa8a7cc476188b2acca2de7"
Mar 20 16:08:04 crc kubenswrapper[4730]: I0320 16:08:04.680387    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567048-nhd7d"
Mar 20 16:08:05 crc kubenswrapper[4730]: I0320 16:08:05.179407    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567042-rngpc"]
Mar 20 16:08:05 crc kubenswrapper[4730]: I0320 16:08:05.189188    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567042-rngpc"]
Mar 20 16:08:05 crc kubenswrapper[4730]: I0320 16:08:05.543536    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f1785c3-01b9-48cd-bfc9-c0fdb1c18455" path="/var/lib/kubelet/pods/8f1785c3-01b9-48cd-bfc9-c0fdb1c18455/volumes"
Mar 20 16:08:10 crc kubenswrapper[4730]: I0320 16:08:10.759137    4730 generic.go:334] "Generic (PLEG): container finished" podID="73c1c649-4459-497e-ba5b-245a4eb5ad04" containerID="70f50b402151af161dd2f676f7cb4e3b396dcdbae945eab08ae85bd1aeb89352" exitCode=0
Mar 20 16:08:10 crc kubenswrapper[4730]: I0320 16:08:10.759277    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh" event={"ID":"73c1c649-4459-497e-ba5b-245a4eb5ad04","Type":"ContainerDied","Data":"70f50b402151af161dd2f676f7cb4e3b396dcdbae945eab08ae85bd1aeb89352"}
Mar 20 16:08:12 crc kubenswrapper[4730]: I0320 16:08:12.191517    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh"
Mar 20 16:08:12 crc kubenswrapper[4730]: I0320 16:08:12.380703    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/73c1c649-4459-497e-ba5b-245a4eb5ad04-ssh-key-openstack-edpm-ipam\") pod \"73c1c649-4459-497e-ba5b-245a4eb5ad04\" (UID: \"73c1c649-4459-497e-ba5b-245a4eb5ad04\") "
Mar 20 16:08:12 crc kubenswrapper[4730]: I0320 16:08:12.380755    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c1c649-4459-497e-ba5b-245a4eb5ad04-bootstrap-combined-ca-bundle\") pod \"73c1c649-4459-497e-ba5b-245a4eb5ad04\" (UID: \"73c1c649-4459-497e-ba5b-245a4eb5ad04\") "
Mar 20 16:08:12 crc kubenswrapper[4730]: I0320 16:08:12.380820    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73c1c649-4459-497e-ba5b-245a4eb5ad04-inventory\") pod \"73c1c649-4459-497e-ba5b-245a4eb5ad04\" (UID: \"73c1c649-4459-497e-ba5b-245a4eb5ad04\") "
Mar 20 16:08:12 crc kubenswrapper[4730]: I0320 16:08:12.380890    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nc4zf\" (UniqueName: \"kubernetes.io/projected/73c1c649-4459-497e-ba5b-245a4eb5ad04-kube-api-access-nc4zf\") pod \"73c1c649-4459-497e-ba5b-245a4eb5ad04\" (UID: \"73c1c649-4459-497e-ba5b-245a4eb5ad04\") "
Mar 20 16:08:12 crc kubenswrapper[4730]: I0320 16:08:12.394306    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73c1c649-4459-497e-ba5b-245a4eb5ad04-kube-api-access-nc4zf" (OuterVolumeSpecName: "kube-api-access-nc4zf") pod "73c1c649-4459-497e-ba5b-245a4eb5ad04" (UID: "73c1c649-4459-497e-ba5b-245a4eb5ad04"). InnerVolumeSpecName "kube-api-access-nc4zf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:08:12 crc kubenswrapper[4730]: I0320 16:08:12.394465    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c1c649-4459-497e-ba5b-245a4eb5ad04-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "73c1c649-4459-497e-ba5b-245a4eb5ad04" (UID: "73c1c649-4459-497e-ba5b-245a4eb5ad04"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:08:12 crc kubenswrapper[4730]: I0320 16:08:12.416124    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c1c649-4459-497e-ba5b-245a4eb5ad04-inventory" (OuterVolumeSpecName: "inventory") pod "73c1c649-4459-497e-ba5b-245a4eb5ad04" (UID: "73c1c649-4459-497e-ba5b-245a4eb5ad04"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:08:12 crc kubenswrapper[4730]: I0320 16:08:12.422021    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c1c649-4459-497e-ba5b-245a4eb5ad04-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "73c1c649-4459-497e-ba5b-245a4eb5ad04" (UID: "73c1c649-4459-497e-ba5b-245a4eb5ad04"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:08:12 crc kubenswrapper[4730]: I0320 16:08:12.483765    4730 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/73c1c649-4459-497e-ba5b-245a4eb5ad04-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 16:08:12 crc kubenswrapper[4730]: I0320 16:08:12.483806    4730 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c1c649-4459-497e-ba5b-245a4eb5ad04-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:08:12 crc kubenswrapper[4730]: I0320 16:08:12.483819    4730 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73c1c649-4459-497e-ba5b-245a4eb5ad04-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 16:08:12 crc kubenswrapper[4730]: I0320 16:08:12.483831    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nc4zf\" (UniqueName: \"kubernetes.io/projected/73c1c649-4459-497e-ba5b-245a4eb5ad04-kube-api-access-nc4zf\") on node \"crc\" DevicePath \"\""
Mar 20 16:08:12 crc kubenswrapper[4730]: I0320 16:08:12.783975    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh" event={"ID":"73c1c649-4459-497e-ba5b-245a4eb5ad04","Type":"ContainerDied","Data":"cd8b0f41c6bc95bfee5c3d1ca2c479defe33b5a42deb27db28effb10edd49695"}
Mar 20 16:08:12 crc kubenswrapper[4730]: I0320 16:08:12.784024    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd8b0f41c6bc95bfee5c3d1ca2c479defe33b5a42deb27db28effb10edd49695"
Mar 20 16:08:12 crc kubenswrapper[4730]: I0320 16:08:12.784022    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh"
Mar 20 16:08:12 crc kubenswrapper[4730]: I0320 16:08:12.879925    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:08:12 crc kubenswrapper[4730]: I0320 16:08:12.879997    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:08:12 crc kubenswrapper[4730]: I0320 16:08:12.888091    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9p7mz"]
Mar 20 16:08:12 crc kubenswrapper[4730]: E0320 16:08:12.888953    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73c1c649-4459-497e-ba5b-245a4eb5ad04" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:08:12 crc kubenswrapper[4730]: I0320 16:08:12.889097    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="73c1c649-4459-497e-ba5b-245a4eb5ad04" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:08:12 crc kubenswrapper[4730]: E0320 16:08:12.889239    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28bea13e-dd2a-4ecf-9182-cc639a47c75f" containerName="oc"
Mar 20 16:08:12 crc kubenswrapper[4730]: I0320 16:08:12.889331    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="28bea13e-dd2a-4ecf-9182-cc639a47c75f" containerName="oc"
Mar 20 16:08:12 crc kubenswrapper[4730]: I0320 16:08:12.889866    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="28bea13e-dd2a-4ecf-9182-cc639a47c75f" containerName="oc"
Mar 20 16:08:12 crc kubenswrapper[4730]: I0320 16:08:12.889979    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="73c1c649-4459-497e-ba5b-245a4eb5ad04" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:08:12 crc kubenswrapper[4730]: I0320 16:08:12.890943    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9p7mz"
Mar 20 16:08:12 crc kubenswrapper[4730]: I0320 16:08:12.894597    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 16:08:12 crc kubenswrapper[4730]: I0320 16:08:12.894701    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 16:08:12 crc kubenswrapper[4730]: I0320 16:08:12.894988    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vvsxx"
Mar 20 16:08:12 crc kubenswrapper[4730]: I0320 16:08:12.895310    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 16:08:12 crc kubenswrapper[4730]: I0320 16:08:12.899427    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9p7mz"]
Mar 20 16:08:12 crc kubenswrapper[4730]: I0320 16:08:12.991904    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/962231f7-41b6-4754-b63c-523277f7cf50-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-9p7mz\" (UID: \"962231f7-41b6-4754-b63c-523277f7cf50\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9p7mz"
Mar 20 16:08:12 crc kubenswrapper[4730]: I0320 16:08:12.992680    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/962231f7-41b6-4754-b63c-523277f7cf50-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-9p7mz\" (UID: \"962231f7-41b6-4754-b63c-523277f7cf50\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9p7mz"
Mar 20 16:08:12 crc kubenswrapper[4730]: I0320 16:08:12.993072    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw628\" (UniqueName: \"kubernetes.io/projected/962231f7-41b6-4754-b63c-523277f7cf50-kube-api-access-sw628\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-9p7mz\" (UID: \"962231f7-41b6-4754-b63c-523277f7cf50\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9p7mz"
Mar 20 16:08:13 crc kubenswrapper[4730]: I0320 16:08:13.097847    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/962231f7-41b6-4754-b63c-523277f7cf50-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-9p7mz\" (UID: \"962231f7-41b6-4754-b63c-523277f7cf50\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9p7mz"
Mar 20 16:08:13 crc kubenswrapper[4730]: I0320 16:08:13.098054    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/962231f7-41b6-4754-b63c-523277f7cf50-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-9p7mz\" (UID: \"962231f7-41b6-4754-b63c-523277f7cf50\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9p7mz"
Mar 20 16:08:13 crc kubenswrapper[4730]: I0320 16:08:13.098121    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw628\" (UniqueName: \"kubernetes.io/projected/962231f7-41b6-4754-b63c-523277f7cf50-kube-api-access-sw628\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-9p7mz\" (UID: \"962231f7-41b6-4754-b63c-523277f7cf50\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9p7mz"
Mar 20 16:08:13 crc kubenswrapper[4730]: I0320 16:08:13.114957    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/962231f7-41b6-4754-b63c-523277f7cf50-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-9p7mz\" (UID: \"962231f7-41b6-4754-b63c-523277f7cf50\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9p7mz"
Mar 20 16:08:13 crc kubenswrapper[4730]: I0320 16:08:13.120074    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/962231f7-41b6-4754-b63c-523277f7cf50-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-9p7mz\" (UID: \"962231f7-41b6-4754-b63c-523277f7cf50\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9p7mz"
Mar 20 16:08:13 crc kubenswrapper[4730]: I0320 16:08:13.125054    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw628\" (UniqueName: \"kubernetes.io/projected/962231f7-41b6-4754-b63c-523277f7cf50-kube-api-access-sw628\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-9p7mz\" (UID: \"962231f7-41b6-4754-b63c-523277f7cf50\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9p7mz"
Mar 20 16:08:13 crc kubenswrapper[4730]: I0320 16:08:13.212809    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9p7mz"
Mar 20 16:08:13 crc kubenswrapper[4730]: I0320 16:08:13.597662    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9p7mz"]
Mar 20 16:08:13 crc kubenswrapper[4730]: I0320 16:08:13.793751    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9p7mz" event={"ID":"962231f7-41b6-4754-b63c-523277f7cf50","Type":"ContainerStarted","Data":"65f684006305344a33bcb883887a7102383b145f70b18f1bff5162dff68a6183"}
Mar 20 16:08:14 crc kubenswrapper[4730]: I0320 16:08:14.804481    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9p7mz" event={"ID":"962231f7-41b6-4754-b63c-523277f7cf50","Type":"ContainerStarted","Data":"23ee7694e655d68f9d7ceabeb66817e16ee241dd678d47fcd8c97daf31ec82f5"}
Mar 20 16:08:14 crc kubenswrapper[4730]: I0320 16:08:14.835781    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9p7mz" podStartSLOduration=2.258339066 podStartE2EDuration="2.835755921s" podCreationTimestamp="2026-03-20 16:08:12 +0000 UTC" firstStartedPulling="2026-03-20 16:08:13.602707062 +0000 UTC m=+1752.816078441" lastFinishedPulling="2026-03-20 16:08:14.180123937 +0000 UTC m=+1753.393495296" observedRunningTime="2026-03-20 16:08:14.829140228 +0000 UTC m=+1754.042511607" watchObservedRunningTime="2026-03-20 16:08:14.835755921 +0000 UTC m=+1754.049127310"
Mar 20 16:08:42 crc kubenswrapper[4730]: I0320 16:08:42.879946    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:08:42 crc kubenswrapper[4730]: I0320 16:08:42.880566    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:08:42 crc kubenswrapper[4730]: I0320 16:08:42.880638    4730 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf"
Mar 20 16:08:42 crc kubenswrapper[4730]: I0320 16:08:42.881543    4730 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4a5122c368d04f6b91f59fb7f1f1be3d99580fb45609b2315a001469955ae120"} pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 16:08:42 crc kubenswrapper[4730]: I0320 16:08:42.881606    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" containerID="cri-o://4a5122c368d04f6b91f59fb7f1f1be3d99580fb45609b2315a001469955ae120" gracePeriod=600
Mar 20 16:08:43 crc kubenswrapper[4730]: E0320 16:08:43.012148    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:08:43 crc kubenswrapper[4730]: I0320 16:08:43.119714    4730 generic.go:334] "Generic (PLEG): container finished" podID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerID="4a5122c368d04f6b91f59fb7f1f1be3d99580fb45609b2315a001469955ae120" exitCode=0
Mar 20 16:08:43 crc kubenswrapper[4730]: I0320 16:08:43.119760    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerDied","Data":"4a5122c368d04f6b91f59fb7f1f1be3d99580fb45609b2315a001469955ae120"}
Mar 20 16:08:43 crc kubenswrapper[4730]: I0320 16:08:43.119800    4730 scope.go:117] "RemoveContainer" containerID="fb7cef3383bd559653e29a00e754a8c3366946a9ed7a655b7b70a7214aec8143"
Mar 20 16:08:43 crc kubenswrapper[4730]: I0320 16:08:43.120647    4730 scope.go:117] "RemoveContainer" containerID="4a5122c368d04f6b91f59fb7f1f1be3d99580fb45609b2315a001469955ae120"
Mar 20 16:08:43 crc kubenswrapper[4730]: E0320 16:08:43.120943    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:08:46 crc kubenswrapper[4730]: I0320 16:08:46.882672    4730 scope.go:117] "RemoveContainer" containerID="807d658a9e1c791073ad6dce59cf86eec477c7d4420c8a363f99c8986963ad00"
Mar 20 16:08:55 crc kubenswrapper[4730]: I0320 16:08:55.533856    4730 scope.go:117] "RemoveContainer" containerID="4a5122c368d04f6b91f59fb7f1f1be3d99580fb45609b2315a001469955ae120"
Mar 20 16:08:55 crc kubenswrapper[4730]: E0320 16:08:55.534637    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:08:57 crc kubenswrapper[4730]: I0320 16:08:57.079931    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-e285-account-create-update-6wk66"]
Mar 20 16:08:57 crc kubenswrapper[4730]: I0320 16:08:57.093789    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-bjqvh"]
Mar 20 16:08:57 crc kubenswrapper[4730]: I0320 16:08:57.101888    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-e285-account-create-update-6wk66"]
Mar 20 16:08:57 crc kubenswrapper[4730]: I0320 16:08:57.109463    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-bjqvh"]
Mar 20 16:08:57 crc kubenswrapper[4730]: I0320 16:08:57.546124    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16da1663-821b-4e05-95f6-df67e9fac962" path="/var/lib/kubelet/pods/16da1663-821b-4e05-95f6-df67e9fac962/volumes"
Mar 20 16:08:57 crc kubenswrapper[4730]: I0320 16:08:57.546882    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17c0870c-17e5-4bd4-91b1-a8df134a4fbd" path="/var/lib/kubelet/pods/17c0870c-17e5-4bd4-91b1-a8df134a4fbd/volumes"
Mar 20 16:08:59 crc kubenswrapper[4730]: I0320 16:08:59.046828    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-fvknw"]
Mar 20 16:08:59 crc kubenswrapper[4730]: I0320 16:08:59.063978    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-fvknw"]
Mar 20 16:08:59 crc kubenswrapper[4730]: I0320 16:08:59.072954    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-n9vdf"]
Mar 20 16:08:59 crc kubenswrapper[4730]: I0320 16:08:59.082891    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db41-account-create-update-x7l2w"]
Mar 20 16:08:59 crc kubenswrapper[4730]: I0320 16:08:59.091045    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-x4h5x"]
Mar 20 16:08:59 crc kubenswrapper[4730]: I0320 16:08:59.099199    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-c5a1-account-create-update-hfdmn"]
Mar 20 16:08:59 crc kubenswrapper[4730]: I0320 16:08:59.111575    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-n9vdf"]
Mar 20 16:08:59 crc kubenswrapper[4730]: I0320 16:08:59.122255    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-x4h5x"]
Mar 20 16:08:59 crc kubenswrapper[4730]: I0320 16:08:59.135456    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db41-account-create-update-x7l2w"]
Mar 20 16:08:59 crc kubenswrapper[4730]: I0320 16:08:59.145757    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-c5a1-account-create-update-hfdmn"]
Mar 20 16:08:59 crc kubenswrapper[4730]: I0320 16:08:59.548055    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3198c781-92f7-40f1-9b6e-ed5310febe0b" path="/var/lib/kubelet/pods/3198c781-92f7-40f1-9b6e-ed5310febe0b/volumes"
Mar 20 16:08:59 crc kubenswrapper[4730]: I0320 16:08:59.550802    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a532566c-ab86-4984-9212-1e48605d192b" path="/var/lib/kubelet/pods/a532566c-ab86-4984-9212-1e48605d192b/volumes"
Mar 20 16:08:59 crc kubenswrapper[4730]: I0320 16:08:59.552450    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c40f368e-f905-465b-9af0-b0ecb753de79" path="/var/lib/kubelet/pods/c40f368e-f905-465b-9af0-b0ecb753de79/volumes"
Mar 20 16:08:59 crc kubenswrapper[4730]: I0320 16:08:59.553796    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7b436cd-ff29-4a9f-9e58-4c8760b1e012" path="/var/lib/kubelet/pods/c7b436cd-ff29-4a9f-9e58-4c8760b1e012/volumes"
Mar 20 16:08:59 crc kubenswrapper[4730]: I0320 16:08:59.554531    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef40906b-a3dc-45b8-8bde-dd06eaaef85c" path="/var/lib/kubelet/pods/ef40906b-a3dc-45b8-8bde-dd06eaaef85c/volumes"
Mar 20 16:09:00 crc kubenswrapper[4730]: I0320 16:09:00.081390    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-87f2-account-create-update-lblc4"]
Mar 20 16:09:00 crc kubenswrapper[4730]: I0320 16:09:00.099058    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-87f2-account-create-update-lblc4"]
Mar 20 16:09:01 crc kubenswrapper[4730]: I0320 16:09:01.551237    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a132fe19-9294-49c6-9b1e-fe3eed7f4bae" path="/var/lib/kubelet/pods/a132fe19-9294-49c6-9b1e-fe3eed7f4bae/volumes"
Mar 20 16:09:07 crc kubenswrapper[4730]: I0320 16:09:07.533766    4730 scope.go:117] "RemoveContainer" containerID="4a5122c368d04f6b91f59fb7f1f1be3d99580fb45609b2315a001469955ae120"
Mar 20 16:09:07 crc kubenswrapper[4730]: E0320 16:09:07.534868    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:09:19 crc kubenswrapper[4730]: I0320 16:09:19.534678    4730 scope.go:117] "RemoveContainer" containerID="4a5122c368d04f6b91f59fb7f1f1be3d99580fb45609b2315a001469955ae120"
Mar 20 16:09:19 crc kubenswrapper[4730]: E0320 16:09:19.535917    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:09:20 crc kubenswrapper[4730]: I0320 16:09:20.042793    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-d92d8"]
Mar 20 16:09:20 crc kubenswrapper[4730]: I0320 16:09:20.064794    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-d92d8"]
Mar 20 16:09:21 crc kubenswrapper[4730]: I0320 16:09:21.559580    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed167127-4e44-4877-bf9b-dbb6a23a8b3f" path="/var/lib/kubelet/pods/ed167127-4e44-4877-bf9b-dbb6a23a8b3f/volumes"
Mar 20 16:09:24 crc kubenswrapper[4730]: I0320 16:09:24.044468    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-mkvv4"]
Mar 20 16:09:24 crc kubenswrapper[4730]: I0320 16:09:24.057569    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-mkvv4"]
Mar 20 16:09:25 crc kubenswrapper[4730]: I0320 16:09:25.549169    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37dd8777-c196-4db2-af7a-5560a939e02c" path="/var/lib/kubelet/pods/37dd8777-c196-4db2-af7a-5560a939e02c/volumes"
Mar 20 16:09:31 crc kubenswrapper[4730]: I0320 16:09:31.539186    4730 scope.go:117] "RemoveContainer" containerID="4a5122c368d04f6b91f59fb7f1f1be3d99580fb45609b2315a001469955ae120"
Mar 20 16:09:31 crc kubenswrapper[4730]: E0320 16:09:31.540157    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:09:43 crc kubenswrapper[4730]: I0320 16:09:43.052413    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-9q2kz"]
Mar 20 16:09:43 crc kubenswrapper[4730]: I0320 16:09:43.065297    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-9f59-account-create-update-vmg5j"]
Mar 20 16:09:43 crc kubenswrapper[4730]: I0320 16:09:43.073936    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-87csx"]
Mar 20 16:09:43 crc kubenswrapper[4730]: I0320 16:09:43.082875    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-9q2kz"]
Mar 20 16:09:43 crc kubenswrapper[4730]: I0320 16:09:43.091322    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-87csx"]
Mar 20 16:09:43 crc kubenswrapper[4730]: I0320 16:09:43.100280    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-9f59-account-create-update-vmg5j"]
Mar 20 16:09:43 crc kubenswrapper[4730]: I0320 16:09:43.549080    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44a72513-75fb-4b7e-912b-d28fa63d050a" path="/var/lib/kubelet/pods/44a72513-75fb-4b7e-912b-d28fa63d050a/volumes"
Mar 20 16:09:43 crc kubenswrapper[4730]: I0320 16:09:43.550711    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6118ed31-b8d7-4a7c-8769-69d996d26915" path="/var/lib/kubelet/pods/6118ed31-b8d7-4a7c-8769-69d996d26915/volumes"
Mar 20 16:09:43 crc kubenswrapper[4730]: I0320 16:09:43.552233    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad93c0a8-34d6-4fee-985c-7c7307f00c0c" path="/var/lib/kubelet/pods/ad93c0a8-34d6-4fee-985c-7c7307f00c0c/volumes"
Mar 20 16:09:44 crc kubenswrapper[4730]: I0320 16:09:44.035790    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-3959-account-create-update-qxd89"]
Mar 20 16:09:44 crc kubenswrapper[4730]: I0320 16:09:44.047890    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-3959-account-create-update-qxd89"]
Mar 20 16:09:44 crc kubenswrapper[4730]: I0320 16:09:44.534107    4730 scope.go:117] "RemoveContainer" containerID="4a5122c368d04f6b91f59fb7f1f1be3d99580fb45609b2315a001469955ae120"
Mar 20 16:09:44 crc kubenswrapper[4730]: E0320 16:09:44.535066    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:09:45 crc kubenswrapper[4730]: I0320 16:09:45.056380    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-qpb6s"]
Mar 20 16:09:45 crc kubenswrapper[4730]: I0320 16:09:45.066488    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-qpb6s"]
Mar 20 16:09:45 crc kubenswrapper[4730]: I0320 16:09:45.547501    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c94c6d8-4c40-455a-a536-7c64e3838986" path="/var/lib/kubelet/pods/1c94c6d8-4c40-455a-a536-7c64e3838986/volumes"
Mar 20 16:09:45 crc kubenswrapper[4730]: I0320 16:09:45.548740    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e01c2575-5301-494a-bf47-9a6053de9c64" path="/var/lib/kubelet/pods/e01c2575-5301-494a-bf47-9a6053de9c64/volumes"
Mar 20 16:09:46 crc kubenswrapper[4730]: I0320 16:09:46.957454    4730 scope.go:117] "RemoveContainer" containerID="67c549d0aa6a1c0db14f97c3aff414699de48b304aa4c5c416c420aae8bc31a7"
Mar 20 16:09:46 crc kubenswrapper[4730]: I0320 16:09:46.987412    4730 scope.go:117] "RemoveContainer" containerID="2857b9eca0093dd961d059808f0936df2d938583bfd861b80e003896a914c165"
Mar 20 16:09:47 crc kubenswrapper[4730]: I0320 16:09:47.063920    4730 scope.go:117] "RemoveContainer" containerID="a18801b5a50e28a1d043f07d02846b12496eaa787cc63d296052b7f86700e382"
Mar 20 16:09:47 crc kubenswrapper[4730]: I0320 16:09:47.098680    4730 scope.go:117] "RemoveContainer" containerID="1742d4d5f625e20265689269ef8d4a8b9f9546ddd3978d31dffe002e4353d662"
Mar 20 16:09:47 crc kubenswrapper[4730]: I0320 16:09:47.143704    4730 scope.go:117] "RemoveContainer" containerID="40e0babc7b2f63017ce242ba014b4798c26ae0c66070098c86ad2de5a7400e6c"
Mar 20 16:09:47 crc kubenswrapper[4730]: I0320 16:09:47.186162    4730 scope.go:117] "RemoveContainer" containerID="ae717d458b43c41d279b4f17419574a7ba6d139ccd8581e792b75559eb5cba0c"
Mar 20 16:09:47 crc kubenswrapper[4730]: I0320 16:09:47.228949    4730 scope.go:117] "RemoveContainer" containerID="a8ebba1aa3aefe2f2a84695ac23d42f8c9788cfc46f63bc2e9dead733c7274ec"
Mar 20 16:09:47 crc kubenswrapper[4730]: I0320 16:09:47.254842    4730 scope.go:117] "RemoveContainer" containerID="30a5fc8a5ea71396f4de5cb5ef85143858b4f15e175e2aea0d88617f137cddad"
Mar 20 16:09:47 crc kubenswrapper[4730]: I0320 16:09:47.278349    4730 scope.go:117] "RemoveContainer" containerID="58cffe0b249055c3d576f3ea017f6ee1185d299a1b49ba134b1ea8fcf81d53bd"
Mar 20 16:09:47 crc kubenswrapper[4730]: I0320 16:09:47.298813    4730 scope.go:117] "RemoveContainer" containerID="c35a1209cb7b3066725c4f8438840ab79db396745f3b69d5ee16580ca7ae88eb"
Mar 20 16:09:47 crc kubenswrapper[4730]: I0320 16:09:47.320257    4730 scope.go:117] "RemoveContainer" containerID="619c70ff24e78ebd6137bc20c79ee2dc5949bf1cca622b03e9fc4227379e48f4"
Mar 20 16:09:47 crc kubenswrapper[4730]: I0320 16:09:47.359008    4730 scope.go:117] "RemoveContainer" containerID="45a09c4f4bffe31b4f9cf83737f4a3331b9ba65b3e4bbf1a00d15070f2dd1fbb"
Mar 20 16:09:47 crc kubenswrapper[4730]: I0320 16:09:47.392939    4730 scope.go:117] "RemoveContainer" containerID="b503fc415bca6f276d3faa0fabe6ea4e17e93d2815b320d70f62ffa635dc90fc"
Mar 20 16:09:47 crc kubenswrapper[4730]: I0320 16:09:47.414511    4730 scope.go:117] "RemoveContainer" containerID="ecddce73fd871590be8e4104469454a63bf36c8d5e335fcb1236e7e17748fcf3"
Mar 20 16:09:47 crc kubenswrapper[4730]: I0320 16:09:47.435886    4730 scope.go:117] "RemoveContainer" containerID="6f4ac67e084527a1cb38bd3c525c24e61f9c884d533f00e2d72554c431fbb247"
Mar 20 16:09:48 crc kubenswrapper[4730]: I0320 16:09:48.034449    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-c423-account-create-update-dcjc2"]
Mar 20 16:09:48 crc kubenswrapper[4730]: I0320 16:09:48.048521    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-c423-account-create-update-dcjc2"]
Mar 20 16:09:49 crc kubenswrapper[4730]: I0320 16:09:49.544187    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06e5575b-c67a-46fe-8502-efc341523de2" path="/var/lib/kubelet/pods/06e5575b-c67a-46fe-8502-efc341523de2/volumes"
Mar 20 16:09:56 crc kubenswrapper[4730]: I0320 16:09:56.534689    4730 scope.go:117] "RemoveContainer" containerID="4a5122c368d04f6b91f59fb7f1f1be3d99580fb45609b2315a001469955ae120"
Mar 20 16:09:56 crc kubenswrapper[4730]: E0320 16:09:56.535661    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:09:58 crc kubenswrapper[4730]: I0320 16:09:58.030623    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-ns6b5"]
Mar 20 16:09:58 crc kubenswrapper[4730]: I0320 16:09:58.040911    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-ns6b5"]
Mar 20 16:09:59 crc kubenswrapper[4730]: I0320 16:09:59.038524    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-rb4pw"]
Mar 20 16:09:59 crc kubenswrapper[4730]: I0320 16:09:59.049318    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-rb4pw"]
Mar 20 16:09:59 crc kubenswrapper[4730]: I0320 16:09:59.543184    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92a7eed8-de7c-4816-8bd9-e922ace376ad" path="/var/lib/kubelet/pods/92a7eed8-de7c-4816-8bd9-e922ace376ad/volumes"
Mar 20 16:09:59 crc kubenswrapper[4730]: I0320 16:09:59.543814    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9577f66b-a45e-4d51-9d87-4ae757819182" path="/var/lib/kubelet/pods/9577f66b-a45e-4d51-9d87-4ae757819182/volumes"
Mar 20 16:10:00 crc kubenswrapper[4730]: I0320 16:10:00.165774    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567050-7gqln"]
Mar 20 16:10:00 crc kubenswrapper[4730]: I0320 16:10:00.167861    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567050-7gqln"
Mar 20 16:10:00 crc kubenswrapper[4730]: I0320 16:10:00.170377    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:10:00 crc kubenswrapper[4730]: I0320 16:10:00.170378    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:10:00 crc kubenswrapper[4730]: I0320 16:10:00.171015    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 16:10:00 crc kubenswrapper[4730]: I0320 16:10:00.178602    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567050-7gqln"]
Mar 20 16:10:00 crc kubenswrapper[4730]: I0320 16:10:00.211606    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ck85\" (UniqueName: \"kubernetes.io/projected/9d689cf2-4142-40fd-9af3-13b98b99296d-kube-api-access-4ck85\") pod \"auto-csr-approver-29567050-7gqln\" (UID: \"9d689cf2-4142-40fd-9af3-13b98b99296d\") " pod="openshift-infra/auto-csr-approver-29567050-7gqln"
Mar 20 16:10:00 crc kubenswrapper[4730]: I0320 16:10:00.314131    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ck85\" (UniqueName: \"kubernetes.io/projected/9d689cf2-4142-40fd-9af3-13b98b99296d-kube-api-access-4ck85\") pod \"auto-csr-approver-29567050-7gqln\" (UID: \"9d689cf2-4142-40fd-9af3-13b98b99296d\") " pod="openshift-infra/auto-csr-approver-29567050-7gqln"
Mar 20 16:10:00 crc kubenswrapper[4730]: I0320 16:10:00.334494    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ck85\" (UniqueName: \"kubernetes.io/projected/9d689cf2-4142-40fd-9af3-13b98b99296d-kube-api-access-4ck85\") pod \"auto-csr-approver-29567050-7gqln\" (UID: \"9d689cf2-4142-40fd-9af3-13b98b99296d\") " pod="openshift-infra/auto-csr-approver-29567050-7gqln"
Mar 20 16:10:00 crc kubenswrapper[4730]: I0320 16:10:00.512387    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567050-7gqln"
Mar 20 16:10:00 crc kubenswrapper[4730]: W0320 16:10:00.987036    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d689cf2_4142_40fd_9af3_13b98b99296d.slice/crio-38edbd01203ebf1ef4afe23c08185596756aa89c1057d140329631cc21b6dd74 WatchSource:0}: Error finding container 38edbd01203ebf1ef4afe23c08185596756aa89c1057d140329631cc21b6dd74: Status 404 returned error can't find the container with id 38edbd01203ebf1ef4afe23c08185596756aa89c1057d140329631cc21b6dd74
Mar 20 16:10:00 crc kubenswrapper[4730]: I0320 16:10:00.989302    4730 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 16:10:01 crc kubenswrapper[4730]: I0320 16:10:01.000321    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567050-7gqln"]
Mar 20 16:10:01 crc kubenswrapper[4730]: I0320 16:10:01.998910    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567050-7gqln" event={"ID":"9d689cf2-4142-40fd-9af3-13b98b99296d","Type":"ContainerStarted","Data":"38edbd01203ebf1ef4afe23c08185596756aa89c1057d140329631cc21b6dd74"}
Mar 20 16:10:03 crc kubenswrapper[4730]: I0320 16:10:03.009188    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567050-7gqln" event={"ID":"9d689cf2-4142-40fd-9af3-13b98b99296d","Type":"ContainerStarted","Data":"41dec27fbddb23dabfba3fbf070ca912d2703e72f2511ee9ef62aa8a4e09aa09"}
Mar 20 16:10:03 crc kubenswrapper[4730]: I0320 16:10:03.036042    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567050-7gqln" podStartSLOduration=1.451113883 podStartE2EDuration="3.036023731s" podCreationTimestamp="2026-03-20 16:10:00 +0000 UTC" firstStartedPulling="2026-03-20 16:10:00.989085362 +0000 UTC m=+1860.202456721" lastFinishedPulling="2026-03-20 16:10:02.57399519 +0000 UTC m=+1861.787366569" observedRunningTime="2026-03-20 16:10:03.028920546 +0000 UTC m=+1862.242291915" watchObservedRunningTime="2026-03-20 16:10:03.036023731 +0000 UTC m=+1862.249395100"
Mar 20 16:10:04 crc kubenswrapper[4730]: I0320 16:10:04.022202    4730 generic.go:334] "Generic (PLEG): container finished" podID="9d689cf2-4142-40fd-9af3-13b98b99296d" containerID="41dec27fbddb23dabfba3fbf070ca912d2703e72f2511ee9ef62aa8a4e09aa09" exitCode=0
Mar 20 16:10:04 crc kubenswrapper[4730]: I0320 16:10:04.022292    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567050-7gqln" event={"ID":"9d689cf2-4142-40fd-9af3-13b98b99296d","Type":"ContainerDied","Data":"41dec27fbddb23dabfba3fbf070ca912d2703e72f2511ee9ef62aa8a4e09aa09"}
Mar 20 16:10:05 crc kubenswrapper[4730]: I0320 16:10:05.405178    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567050-7gqln"
Mar 20 16:10:05 crc kubenswrapper[4730]: I0320 16:10:05.430195    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ck85\" (UniqueName: \"kubernetes.io/projected/9d689cf2-4142-40fd-9af3-13b98b99296d-kube-api-access-4ck85\") pod \"9d689cf2-4142-40fd-9af3-13b98b99296d\" (UID: \"9d689cf2-4142-40fd-9af3-13b98b99296d\") "
Mar 20 16:10:05 crc kubenswrapper[4730]: I0320 16:10:05.436699    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d689cf2-4142-40fd-9af3-13b98b99296d-kube-api-access-4ck85" (OuterVolumeSpecName: "kube-api-access-4ck85") pod "9d689cf2-4142-40fd-9af3-13b98b99296d" (UID: "9d689cf2-4142-40fd-9af3-13b98b99296d"). InnerVolumeSpecName "kube-api-access-4ck85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:10:05 crc kubenswrapper[4730]: I0320 16:10:05.532405    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ck85\" (UniqueName: \"kubernetes.io/projected/9d689cf2-4142-40fd-9af3-13b98b99296d-kube-api-access-4ck85\") on node \"crc\" DevicePath \"\""
Mar 20 16:10:06 crc kubenswrapper[4730]: I0320 16:10:06.075767    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567050-7gqln" event={"ID":"9d689cf2-4142-40fd-9af3-13b98b99296d","Type":"ContainerDied","Data":"38edbd01203ebf1ef4afe23c08185596756aa89c1057d140329631cc21b6dd74"}
Mar 20 16:10:06 crc kubenswrapper[4730]: I0320 16:10:06.075860    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38edbd01203ebf1ef4afe23c08185596756aa89c1057d140329631cc21b6dd74"
Mar 20 16:10:06 crc kubenswrapper[4730]: I0320 16:10:06.075976    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567050-7gqln"
Mar 20 16:10:06 crc kubenswrapper[4730]: I0320 16:10:06.106818    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567044-nw9nk"]
Mar 20 16:10:06 crc kubenswrapper[4730]: I0320 16:10:06.118203    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567044-nw9nk"]
Mar 20 16:10:07 crc kubenswrapper[4730]: I0320 16:10:07.550807    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44fa3d60-826d-4b59-b44a-0102f155b586" path="/var/lib/kubelet/pods/44fa3d60-826d-4b59-b44a-0102f155b586/volumes"
Mar 20 16:10:09 crc kubenswrapper[4730]: I0320 16:10:09.533886    4730 scope.go:117] "RemoveContainer" containerID="4a5122c368d04f6b91f59fb7f1f1be3d99580fb45609b2315a001469955ae120"
Mar 20 16:10:09 crc kubenswrapper[4730]: E0320 16:10:09.534591    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:10:20 crc kubenswrapper[4730]: I0320 16:10:20.533327    4730 scope.go:117] "RemoveContainer" containerID="4a5122c368d04f6b91f59fb7f1f1be3d99580fb45609b2315a001469955ae120"
Mar 20 16:10:20 crc kubenswrapper[4730]: E0320 16:10:20.534330    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:10:32 crc kubenswrapper[4730]: I0320 16:10:32.064637    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-4kfmn"]
Mar 20 16:10:32 crc kubenswrapper[4730]: I0320 16:10:32.077452    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-4kfmn"]
Mar 20 16:10:33 crc kubenswrapper[4730]: I0320 16:10:33.534383    4730 scope.go:117] "RemoveContainer" containerID="4a5122c368d04f6b91f59fb7f1f1be3d99580fb45609b2315a001469955ae120"
Mar 20 16:10:33 crc kubenswrapper[4730]: E0320 16:10:33.535070    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:10:33 crc kubenswrapper[4730]: I0320 16:10:33.549045    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fedef548-ce31-47a2-92fc-911f167635f9" path="/var/lib/kubelet/pods/fedef548-ce31-47a2-92fc-911f167635f9/volumes"
Mar 20 16:10:34 crc kubenswrapper[4730]: I0320 16:10:34.032838    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-x2t9r"]
Mar 20 16:10:34 crc kubenswrapper[4730]: I0320 16:10:34.066021    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-x2t9r"]
Mar 20 16:10:35 crc kubenswrapper[4730]: I0320 16:10:35.548649    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a05675d7-cd2f-4810-862b-cb0d2d13cbdd" path="/var/lib/kubelet/pods/a05675d7-cd2f-4810-862b-cb0d2d13cbdd/volumes"
Mar 20 16:10:37 crc kubenswrapper[4730]: I0320 16:10:37.030828    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-tz6x7"]
Mar 20 16:10:37 crc kubenswrapper[4730]: I0320 16:10:37.041234    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-tz6x7"]
Mar 20 16:10:37 crc kubenswrapper[4730]: I0320 16:10:37.551799    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48fc8af0-e30f-4f3f-88d3-8b054c6359ef" path="/var/lib/kubelet/pods/48fc8af0-e30f-4f3f-88d3-8b054c6359ef/volumes"
Mar 20 16:10:44 crc kubenswrapper[4730]: I0320 16:10:44.030689    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-z9mtx"]
Mar 20 16:10:44 crc kubenswrapper[4730]: I0320 16:10:44.040443    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-hbplf"]
Mar 20 16:10:44 crc kubenswrapper[4730]: I0320 16:10:44.048863    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-z9mtx"]
Mar 20 16:10:44 crc kubenswrapper[4730]: I0320 16:10:44.059582    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-hbplf"]
Mar 20 16:10:44 crc kubenswrapper[4730]: I0320 16:10:44.533839    4730 scope.go:117] "RemoveContainer" containerID="4a5122c368d04f6b91f59fb7f1f1be3d99580fb45609b2315a001469955ae120"
Mar 20 16:10:44 crc kubenswrapper[4730]: E0320 16:10:44.534300    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:10:45 crc kubenswrapper[4730]: I0320 16:10:45.554241    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09f27249-61fb-4e13-9eb9-9b804f256d81" path="/var/lib/kubelet/pods/09f27249-61fb-4e13-9eb9-9b804f256d81/volumes"
Mar 20 16:10:45 crc kubenswrapper[4730]: I0320 16:10:45.555148    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fb4d42d-6cd9-480c-8ee0-1e168504a4cd" path="/var/lib/kubelet/pods/6fb4d42d-6cd9-480c-8ee0-1e168504a4cd/volumes"
Mar 20 16:10:47 crc kubenswrapper[4730]: I0320 16:10:47.726575    4730 scope.go:117] "RemoveContainer" containerID="1763f714611816ce76b822616e2726ee2af2ec1d061896faecc0edc07186595f"
Mar 20 16:10:47 crc kubenswrapper[4730]: I0320 16:10:47.766368    4730 scope.go:117] "RemoveContainer" containerID="3f4c141955a3579b06be021435ce1c3642e2a9b4483a932d05648a4559764229"
Mar 20 16:10:47 crc kubenswrapper[4730]: I0320 16:10:47.844055    4730 scope.go:117] "RemoveContainer" containerID="29ed2b28b91aee9b1496fc9ae566fd345c663655b5eba7831621c42547aa8e83"
Mar 20 16:10:47 crc kubenswrapper[4730]: I0320 16:10:47.888546    4730 scope.go:117] "RemoveContainer" containerID="32fe76fbff47bfdd3ed0a42b1fb587052917346b2dd9af6a6803fc8251d250e7"
Mar 20 16:10:47 crc kubenswrapper[4730]: I0320 16:10:47.941961    4730 scope.go:117] "RemoveContainer" containerID="59a0ed1595de1b0849599bb5a7c10e7cfbb46ad061c13c2ab2d12fc1bc355373"
Mar 20 16:10:47 crc kubenswrapper[4730]: I0320 16:10:47.974908    4730 scope.go:117] "RemoveContainer" containerID="99fba5e2cadd379521ca79369b155ec13b031c591917c4f1be4fc608956b6dda"
Mar 20 16:10:48 crc kubenswrapper[4730]: I0320 16:10:48.020623    4730 scope.go:117] "RemoveContainer" containerID="2cbf92580c54611c192a57c093a66c2f77a3a73726fc9a21c3aef24b4e922f95"
Mar 20 16:10:48 crc kubenswrapper[4730]: I0320 16:10:48.061197    4730 scope.go:117] "RemoveContainer" containerID="6f35041c9925accfe452d038ab9d3c1753f640407e6e4a51f0b4d6916cb04e6f"
Mar 20 16:10:48 crc kubenswrapper[4730]: I0320 16:10:48.082515    4730 scope.go:117] "RemoveContainer" containerID="3ad79a5f57b1a4c7b377fb15d13f7708e0e00b53bbc48929b06820bc137a571e"
Mar 20 16:10:59 crc kubenswrapper[4730]: I0320 16:10:59.533265    4730 scope.go:117] "RemoveContainer" containerID="4a5122c368d04f6b91f59fb7f1f1be3d99580fb45609b2315a001469955ae120"
Mar 20 16:10:59 crc kubenswrapper[4730]: E0320 16:10:59.534015    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:11:02 crc kubenswrapper[4730]: I0320 16:11:02.717633    4730 generic.go:334] "Generic (PLEG): container finished" podID="962231f7-41b6-4754-b63c-523277f7cf50" containerID="23ee7694e655d68f9d7ceabeb66817e16ee241dd678d47fcd8c97daf31ec82f5" exitCode=0
Mar 20 16:11:02 crc kubenswrapper[4730]: I0320 16:11:02.717730    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9p7mz" event={"ID":"962231f7-41b6-4754-b63c-523277f7cf50","Type":"ContainerDied","Data":"23ee7694e655d68f9d7ceabeb66817e16ee241dd678d47fcd8c97daf31ec82f5"}
Mar 20 16:11:04 crc kubenswrapper[4730]: I0320 16:11:04.211826    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9p7mz"
Mar 20 16:11:04 crc kubenswrapper[4730]: I0320 16:11:04.370203    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sw628\" (UniqueName: \"kubernetes.io/projected/962231f7-41b6-4754-b63c-523277f7cf50-kube-api-access-sw628\") pod \"962231f7-41b6-4754-b63c-523277f7cf50\" (UID: \"962231f7-41b6-4754-b63c-523277f7cf50\") "
Mar 20 16:11:04 crc kubenswrapper[4730]: I0320 16:11:04.370310    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/962231f7-41b6-4754-b63c-523277f7cf50-ssh-key-openstack-edpm-ipam\") pod \"962231f7-41b6-4754-b63c-523277f7cf50\" (UID: \"962231f7-41b6-4754-b63c-523277f7cf50\") "
Mar 20 16:11:04 crc kubenswrapper[4730]: I0320 16:11:04.370348    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/962231f7-41b6-4754-b63c-523277f7cf50-inventory\") pod \"962231f7-41b6-4754-b63c-523277f7cf50\" (UID: \"962231f7-41b6-4754-b63c-523277f7cf50\") "
Mar 20 16:11:04 crc kubenswrapper[4730]: I0320 16:11:04.381866    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/962231f7-41b6-4754-b63c-523277f7cf50-kube-api-access-sw628" (OuterVolumeSpecName: "kube-api-access-sw628") pod "962231f7-41b6-4754-b63c-523277f7cf50" (UID: "962231f7-41b6-4754-b63c-523277f7cf50"). InnerVolumeSpecName "kube-api-access-sw628". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:11:04 crc kubenswrapper[4730]: I0320 16:11:04.396065    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/962231f7-41b6-4754-b63c-523277f7cf50-inventory" (OuterVolumeSpecName: "inventory") pod "962231f7-41b6-4754-b63c-523277f7cf50" (UID: "962231f7-41b6-4754-b63c-523277f7cf50"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:11:04 crc kubenswrapper[4730]: I0320 16:11:04.397072    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/962231f7-41b6-4754-b63c-523277f7cf50-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "962231f7-41b6-4754-b63c-523277f7cf50" (UID: "962231f7-41b6-4754-b63c-523277f7cf50"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:11:04 crc kubenswrapper[4730]: I0320 16:11:04.472416    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sw628\" (UniqueName: \"kubernetes.io/projected/962231f7-41b6-4754-b63c-523277f7cf50-kube-api-access-sw628\") on node \"crc\" DevicePath \"\""
Mar 20 16:11:04 crc kubenswrapper[4730]: I0320 16:11:04.472655    4730 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/962231f7-41b6-4754-b63c-523277f7cf50-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 16:11:04 crc kubenswrapper[4730]: I0320 16:11:04.472743    4730 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/962231f7-41b6-4754-b63c-523277f7cf50-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 16:11:04 crc kubenswrapper[4730]: I0320 16:11:04.739314    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9p7mz" event={"ID":"962231f7-41b6-4754-b63c-523277f7cf50","Type":"ContainerDied","Data":"65f684006305344a33bcb883887a7102383b145f70b18f1bff5162dff68a6183"}
Mar 20 16:11:04 crc kubenswrapper[4730]: I0320 16:11:04.739361    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65f684006305344a33bcb883887a7102383b145f70b18f1bff5162dff68a6183"
Mar 20 16:11:04 crc kubenswrapper[4730]: I0320 16:11:04.739356    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9p7mz"
Mar 20 16:11:04 crc kubenswrapper[4730]: I0320 16:11:04.843017    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dff2j"]
Mar 20 16:11:04 crc kubenswrapper[4730]: E0320 16:11:04.843571    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="962231f7-41b6-4754-b63c-523277f7cf50" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:11:04 crc kubenswrapper[4730]: I0320 16:11:04.843595    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="962231f7-41b6-4754-b63c-523277f7cf50" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:11:04 crc kubenswrapper[4730]: E0320 16:11:04.843633    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d689cf2-4142-40fd-9af3-13b98b99296d" containerName="oc"
Mar 20 16:11:04 crc kubenswrapper[4730]: I0320 16:11:04.843643    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d689cf2-4142-40fd-9af3-13b98b99296d" containerName="oc"
Mar 20 16:11:04 crc kubenswrapper[4730]: I0320 16:11:04.843877    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d689cf2-4142-40fd-9af3-13b98b99296d" containerName="oc"
Mar 20 16:11:04 crc kubenswrapper[4730]: I0320 16:11:04.843920    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="962231f7-41b6-4754-b63c-523277f7cf50" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:11:04 crc kubenswrapper[4730]: I0320 16:11:04.844783    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dff2j"
Mar 20 16:11:04 crc kubenswrapper[4730]: I0320 16:11:04.846623    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vvsxx"
Mar 20 16:11:04 crc kubenswrapper[4730]: I0320 16:11:04.851704    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 16:11:04 crc kubenswrapper[4730]: I0320 16:11:04.852660    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 16:11:04 crc kubenswrapper[4730]: I0320 16:11:04.857095    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 16:11:04 crc kubenswrapper[4730]: I0320 16:11:04.874226    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dff2j"]
Mar 20 16:11:04 crc kubenswrapper[4730]: E0320 16:11:04.879434    4730 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod962231f7_41b6_4754_b63c_523277f7cf50.slice\": RecentStats: unable to find data in memory cache]"
Mar 20 16:11:04 crc kubenswrapper[4730]: I0320 16:11:04.982508    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca62ee94-4983-4acc-856a-3faf59cae3e1-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dff2j\" (UID: \"ca62ee94-4983-4acc-856a-3faf59cae3e1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dff2j"
Mar 20 16:11:04 crc kubenswrapper[4730]: I0320 16:11:04.982840    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkgt4\" (UniqueName: \"kubernetes.io/projected/ca62ee94-4983-4acc-856a-3faf59cae3e1-kube-api-access-lkgt4\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dff2j\" (UID: \"ca62ee94-4983-4acc-856a-3faf59cae3e1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dff2j"
Mar 20 16:11:04 crc kubenswrapper[4730]: I0320 16:11:04.983013    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca62ee94-4983-4acc-856a-3faf59cae3e1-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dff2j\" (UID: \"ca62ee94-4983-4acc-856a-3faf59cae3e1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dff2j"
Mar 20 16:11:05 crc kubenswrapper[4730]: I0320 16:11:05.085536    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca62ee94-4983-4acc-856a-3faf59cae3e1-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dff2j\" (UID: \"ca62ee94-4983-4acc-856a-3faf59cae3e1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dff2j"
Mar 20 16:11:05 crc kubenswrapper[4730]: I0320 16:11:05.085594    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkgt4\" (UniqueName: \"kubernetes.io/projected/ca62ee94-4983-4acc-856a-3faf59cae3e1-kube-api-access-lkgt4\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dff2j\" (UID: \"ca62ee94-4983-4acc-856a-3faf59cae3e1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dff2j"
Mar 20 16:11:05 crc kubenswrapper[4730]: I0320 16:11:05.085686    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca62ee94-4983-4acc-856a-3faf59cae3e1-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dff2j\" (UID: \"ca62ee94-4983-4acc-856a-3faf59cae3e1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dff2j"
Mar 20 16:11:05 crc kubenswrapper[4730]: I0320 16:11:05.097842    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca62ee94-4983-4acc-856a-3faf59cae3e1-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dff2j\" (UID: \"ca62ee94-4983-4acc-856a-3faf59cae3e1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dff2j"
Mar 20 16:11:05 crc kubenswrapper[4730]: I0320 16:11:05.098283    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca62ee94-4983-4acc-856a-3faf59cae3e1-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dff2j\" (UID: \"ca62ee94-4983-4acc-856a-3faf59cae3e1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dff2j"
Mar 20 16:11:05 crc kubenswrapper[4730]: I0320 16:11:05.102873    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkgt4\" (UniqueName: \"kubernetes.io/projected/ca62ee94-4983-4acc-856a-3faf59cae3e1-kube-api-access-lkgt4\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dff2j\" (UID: \"ca62ee94-4983-4acc-856a-3faf59cae3e1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dff2j"
Mar 20 16:11:05 crc kubenswrapper[4730]: I0320 16:11:05.170949    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dff2j"
Mar 20 16:11:05 crc kubenswrapper[4730]: W0320 16:11:05.743563    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca62ee94_4983_4acc_856a_3faf59cae3e1.slice/crio-538df20e0c75ebd57cd2833241a13b43747c6f23a8c6a93077516d777fb91e76 WatchSource:0}: Error finding container 538df20e0c75ebd57cd2833241a13b43747c6f23a8c6a93077516d777fb91e76: Status 404 returned error can't find the container with id 538df20e0c75ebd57cd2833241a13b43747c6f23a8c6a93077516d777fb91e76
Mar 20 16:11:05 crc kubenswrapper[4730]: I0320 16:11:05.745020    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dff2j"]
Mar 20 16:11:06 crc kubenswrapper[4730]: I0320 16:11:06.759365    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dff2j" event={"ID":"ca62ee94-4983-4acc-856a-3faf59cae3e1","Type":"ContainerStarted","Data":"ba64daba5abdb4346cb601ada5074367a7382d23792549e6e38bb2f01ba55227"}
Mar 20 16:11:06 crc kubenswrapper[4730]: I0320 16:11:06.759884    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dff2j" event={"ID":"ca62ee94-4983-4acc-856a-3faf59cae3e1","Type":"ContainerStarted","Data":"538df20e0c75ebd57cd2833241a13b43747c6f23a8c6a93077516d777fb91e76"}
Mar 20 16:11:06 crc kubenswrapper[4730]: I0320 16:11:06.780546    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dff2j" podStartSLOduration=2.123453273 podStartE2EDuration="2.780517414s" podCreationTimestamp="2026-03-20 16:11:04 +0000 UTC" firstStartedPulling="2026-03-20 16:11:05.745793108 +0000 UTC m=+1924.959164477" lastFinishedPulling="2026-03-20 16:11:06.402857259 +0000 UTC m=+1925.616228618" observedRunningTime="2026-03-20 16:11:06.77721068 +0000 UTC m=+1925.990582049" watchObservedRunningTime="2026-03-20 16:11:06.780517414 +0000 UTC m=+1925.993888833"
Mar 20 16:11:10 crc kubenswrapper[4730]: I0320 16:11:10.534113    4730 scope.go:117] "RemoveContainer" containerID="4a5122c368d04f6b91f59fb7f1f1be3d99580fb45609b2315a001469955ae120"
Mar 20 16:11:10 crc kubenswrapper[4730]: E0320 16:11:10.534614    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:11:24 crc kubenswrapper[4730]: I0320 16:11:24.533066    4730 scope.go:117] "RemoveContainer" containerID="4a5122c368d04f6b91f59fb7f1f1be3d99580fb45609b2315a001469955ae120"
Mar 20 16:11:24 crc kubenswrapper[4730]: E0320 16:11:24.534219    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:11:26 crc kubenswrapper[4730]: I0320 16:11:26.059566    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-4a43-account-create-update-cj4kg"]
Mar 20 16:11:26 crc kubenswrapper[4730]: I0320 16:11:26.069126    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-4a43-account-create-update-cj4kg"]
Mar 20 16:11:27 crc kubenswrapper[4730]: I0320 16:11:27.032026    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-d0d2-account-create-update-z6v46"]
Mar 20 16:11:27 crc kubenswrapper[4730]: I0320 16:11:27.040747    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-d0d2-account-create-update-z6v46"]
Mar 20 16:11:27 crc kubenswrapper[4730]: I0320 16:11:27.544895    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="383cf79a-0636-4175-bcf8-7e369f101901" path="/var/lib/kubelet/pods/383cf79a-0636-4175-bcf8-7e369f101901/volumes"
Mar 20 16:11:27 crc kubenswrapper[4730]: I0320 16:11:27.545603    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7de61c5d-53ba-4d26-9a79-b82c2bc3b779" path="/var/lib/kubelet/pods/7de61c5d-53ba-4d26-9a79-b82c2bc3b779/volumes"
Mar 20 16:11:28 crc kubenswrapper[4730]: I0320 16:11:28.052682    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-tv4tn"]
Mar 20 16:11:28 crc kubenswrapper[4730]: I0320 16:11:28.063367    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-rlp9c"]
Mar 20 16:11:28 crc kubenswrapper[4730]: I0320 16:11:28.074013    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-tv4tn"]
Mar 20 16:11:28 crc kubenswrapper[4730]: I0320 16:11:28.084446    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-rlp9c"]
Mar 20 16:11:28 crc kubenswrapper[4730]: I0320 16:11:28.093887    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-qt4mz"]
Mar 20 16:11:28 crc kubenswrapper[4730]: I0320 16:11:28.104304    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-qt4mz"]
Mar 20 16:11:28 crc kubenswrapper[4730]: I0320 16:11:28.114112    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-2850-account-create-update-4lrrq"]
Mar 20 16:11:28 crc kubenswrapper[4730]: I0320 16:11:28.122426    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-2850-account-create-update-4lrrq"]
Mar 20 16:11:29 crc kubenswrapper[4730]: I0320 16:11:29.551915    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f625a9e-a940-476b-85b2-ff54c5e87785" path="/var/lib/kubelet/pods/3f625a9e-a940-476b-85b2-ff54c5e87785/volumes"
Mar 20 16:11:29 crc kubenswrapper[4730]: I0320 16:11:29.553095    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="475a52ba-bc8d-4c7b-ae99-330d6ec2b358" path="/var/lib/kubelet/pods/475a52ba-bc8d-4c7b-ae99-330d6ec2b358/volumes"
Mar 20 16:11:29 crc kubenswrapper[4730]: I0320 16:11:29.553961    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1" path="/var/lib/kubelet/pods/6b9cde2b-3d05-44ba-898b-6fe8cfab4fe1/volumes"
Mar 20 16:11:29 crc kubenswrapper[4730]: I0320 16:11:29.554772    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dac41622-7c80-4fce-a5ac-8a04d301669d" path="/var/lib/kubelet/pods/dac41622-7c80-4fce-a5ac-8a04d301669d/volumes"
Mar 20 16:11:37 crc kubenswrapper[4730]: I0320 16:11:37.533521    4730 scope.go:117] "RemoveContainer" containerID="4a5122c368d04f6b91f59fb7f1f1be3d99580fb45609b2315a001469955ae120"
Mar 20 16:11:37 crc kubenswrapper[4730]: E0320 16:11:37.534294    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:11:48 crc kubenswrapper[4730]: I0320 16:11:48.289482    4730 scope.go:117] "RemoveContainer" containerID="3a26f3f5793abd65e69907fa90ac71e2abaa4ea13397a929de033f2bbf59a251"
Mar 20 16:11:48 crc kubenswrapper[4730]: I0320 16:11:48.315023    4730 scope.go:117] "RemoveContainer" containerID="fa7fa1c1a12965d7d645639e89c04d923a3d343c7729667128d205eaaba9942e"
Mar 20 16:11:48 crc kubenswrapper[4730]: I0320 16:11:48.365492    4730 scope.go:117] "RemoveContainer" containerID="b8f3077acd6da12cfe1f43474ad395781d9175a1c666f3763b8d16340af465ed"
Mar 20 16:11:48 crc kubenswrapper[4730]: I0320 16:11:48.412156    4730 scope.go:117] "RemoveContainer" containerID="f2c396b5999dcbacb34f0cb38c776c483344cb3d6a6925954ac69d2fbac35de7"
Mar 20 16:11:48 crc kubenswrapper[4730]: I0320 16:11:48.470519    4730 scope.go:117] "RemoveContainer" containerID="be1153307c9e28a344ac73169445af77d8ee3c7d9c2256c03916bd83fc0e8437"
Mar 20 16:11:48 crc kubenswrapper[4730]: I0320 16:11:48.514019    4730 scope.go:117] "RemoveContainer" containerID="0d50b068c846deeebd08139bc4d49513ed414820309b36a4108d6ffa43871b84"
Mar 20 16:11:49 crc kubenswrapper[4730]: I0320 16:11:49.533582    4730 scope.go:117] "RemoveContainer" containerID="4a5122c368d04f6b91f59fb7f1f1be3d99580fb45609b2315a001469955ae120"
Mar 20 16:11:49 crc kubenswrapper[4730]: E0320 16:11:49.534215    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:12:00 crc kubenswrapper[4730]: I0320 16:12:00.149668    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567052-fb4zl"]
Mar 20 16:12:00 crc kubenswrapper[4730]: I0320 16:12:00.151646    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567052-fb4zl"
Mar 20 16:12:00 crc kubenswrapper[4730]: I0320 16:12:00.154528    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:12:00 crc kubenswrapper[4730]: I0320 16:12:00.154531    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:12:00 crc kubenswrapper[4730]: I0320 16:12:00.154899    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 16:12:00 crc kubenswrapper[4730]: I0320 16:12:00.160713    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567052-fb4zl"]
Mar 20 16:12:00 crc kubenswrapper[4730]: I0320 16:12:00.206702    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wstx\" (UniqueName: \"kubernetes.io/projected/28beb66f-2a64-4bcf-94eb-676ef7f1236a-kube-api-access-4wstx\") pod \"auto-csr-approver-29567052-fb4zl\" (UID: \"28beb66f-2a64-4bcf-94eb-676ef7f1236a\") " pod="openshift-infra/auto-csr-approver-29567052-fb4zl"
Mar 20 16:12:00 crc kubenswrapper[4730]: I0320 16:12:00.308693    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wstx\" (UniqueName: \"kubernetes.io/projected/28beb66f-2a64-4bcf-94eb-676ef7f1236a-kube-api-access-4wstx\") pod \"auto-csr-approver-29567052-fb4zl\" (UID: \"28beb66f-2a64-4bcf-94eb-676ef7f1236a\") " pod="openshift-infra/auto-csr-approver-29567052-fb4zl"
Mar 20 16:12:00 crc kubenswrapper[4730]: I0320 16:12:00.328164    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wstx\" (UniqueName: \"kubernetes.io/projected/28beb66f-2a64-4bcf-94eb-676ef7f1236a-kube-api-access-4wstx\") pod \"auto-csr-approver-29567052-fb4zl\" (UID: \"28beb66f-2a64-4bcf-94eb-676ef7f1236a\") " pod="openshift-infra/auto-csr-approver-29567052-fb4zl"
Mar 20 16:12:00 crc kubenswrapper[4730]: I0320 16:12:00.470738    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567052-fb4zl"
Mar 20 16:12:01 crc kubenswrapper[4730]: I0320 16:12:01.053625    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qsrbd"]
Mar 20 16:12:01 crc kubenswrapper[4730]: I0320 16:12:01.066173    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qsrbd"]
Mar 20 16:12:01 crc kubenswrapper[4730]: I0320 16:12:01.077141    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567052-fb4zl"]
Mar 20 16:12:01 crc kubenswrapper[4730]: I0320 16:12:01.294933    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567052-fb4zl" event={"ID":"28beb66f-2a64-4bcf-94eb-676ef7f1236a","Type":"ContainerStarted","Data":"54935f3ad3700c2130e20a5d274c6d8117588eac3d0758ef70339f1dc6c158e7"}
Mar 20 16:12:01 crc kubenswrapper[4730]: I0320 16:12:01.532941    4730 scope.go:117] "RemoveContainer" containerID="4a5122c368d04f6b91f59fb7f1f1be3d99580fb45609b2315a001469955ae120"
Mar 20 16:12:01 crc kubenswrapper[4730]: E0320 16:12:01.533177    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:12:01 crc kubenswrapper[4730]: I0320 16:12:01.544709    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="279d2368-abe1-465a-9007-68542e5dbfc4" path="/var/lib/kubelet/pods/279d2368-abe1-465a-9007-68542e5dbfc4/volumes"
Mar 20 16:12:04 crc kubenswrapper[4730]: I0320 16:12:04.336448    4730 generic.go:334] "Generic (PLEG): container finished" podID="28beb66f-2a64-4bcf-94eb-676ef7f1236a" containerID="09bf1a5b6b98230c97ec660d74eb6fc018c3a7b8b7105355e719108bd3861003" exitCode=0
Mar 20 16:12:04 crc kubenswrapper[4730]: I0320 16:12:04.336938    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567052-fb4zl" event={"ID":"28beb66f-2a64-4bcf-94eb-676ef7f1236a","Type":"ContainerDied","Data":"09bf1a5b6b98230c97ec660d74eb6fc018c3a7b8b7105355e719108bd3861003"}
Mar 20 16:12:05 crc kubenswrapper[4730]: I0320 16:12:05.689396    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567052-fb4zl"
Mar 20 16:12:05 crc kubenswrapper[4730]: I0320 16:12:05.830181    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wstx\" (UniqueName: \"kubernetes.io/projected/28beb66f-2a64-4bcf-94eb-676ef7f1236a-kube-api-access-4wstx\") pod \"28beb66f-2a64-4bcf-94eb-676ef7f1236a\" (UID: \"28beb66f-2a64-4bcf-94eb-676ef7f1236a\") "
Mar 20 16:12:05 crc kubenswrapper[4730]: I0320 16:12:05.837390    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28beb66f-2a64-4bcf-94eb-676ef7f1236a-kube-api-access-4wstx" (OuterVolumeSpecName: "kube-api-access-4wstx") pod "28beb66f-2a64-4bcf-94eb-676ef7f1236a" (UID: "28beb66f-2a64-4bcf-94eb-676ef7f1236a"). InnerVolumeSpecName "kube-api-access-4wstx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:12:05 crc kubenswrapper[4730]: I0320 16:12:05.932690    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wstx\" (UniqueName: \"kubernetes.io/projected/28beb66f-2a64-4bcf-94eb-676ef7f1236a-kube-api-access-4wstx\") on node \"crc\" DevicePath \"\""
Mar 20 16:12:06 crc kubenswrapper[4730]: I0320 16:12:06.356083    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567052-fb4zl" event={"ID":"28beb66f-2a64-4bcf-94eb-676ef7f1236a","Type":"ContainerDied","Data":"54935f3ad3700c2130e20a5d274c6d8117588eac3d0758ef70339f1dc6c158e7"}
Mar 20 16:12:06 crc kubenswrapper[4730]: I0320 16:12:06.356163    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54935f3ad3700c2130e20a5d274c6d8117588eac3d0758ef70339f1dc6c158e7"
Mar 20 16:12:06 crc kubenswrapper[4730]: I0320 16:12:06.356127    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567052-fb4zl"
Mar 20 16:12:06 crc kubenswrapper[4730]: I0320 16:12:06.751894    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567046-8f6sv"]
Mar 20 16:12:06 crc kubenswrapper[4730]: I0320 16:12:06.762277    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567046-8f6sv"]
Mar 20 16:12:07 crc kubenswrapper[4730]: I0320 16:12:07.545706    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dacfdca-1b6e-4336-8089-722d36388128" path="/var/lib/kubelet/pods/8dacfdca-1b6e-4336-8089-722d36388128/volumes"
Mar 20 16:12:13 crc kubenswrapper[4730]: I0320 16:12:13.535240    4730 scope.go:117] "RemoveContainer" containerID="4a5122c368d04f6b91f59fb7f1f1be3d99580fb45609b2315a001469955ae120"
Mar 20 16:12:13 crc kubenswrapper[4730]: E0320 16:12:13.535991    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:12:15 crc kubenswrapper[4730]: I0320 16:12:15.464110    4730 generic.go:334] "Generic (PLEG): container finished" podID="ca62ee94-4983-4acc-856a-3faf59cae3e1" containerID="ba64daba5abdb4346cb601ada5074367a7382d23792549e6e38bb2f01ba55227" exitCode=0
Mar 20 16:12:15 crc kubenswrapper[4730]: I0320 16:12:15.464172    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dff2j" event={"ID":"ca62ee94-4983-4acc-856a-3faf59cae3e1","Type":"ContainerDied","Data":"ba64daba5abdb4346cb601ada5074367a7382d23792549e6e38bb2f01ba55227"}
Mar 20 16:12:16 crc kubenswrapper[4730]: I0320 16:12:16.896327    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dff2j"
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.047902    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkgt4\" (UniqueName: \"kubernetes.io/projected/ca62ee94-4983-4acc-856a-3faf59cae3e1-kube-api-access-lkgt4\") pod \"ca62ee94-4983-4acc-856a-3faf59cae3e1\" (UID: \"ca62ee94-4983-4acc-856a-3faf59cae3e1\") "
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.049683    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca62ee94-4983-4acc-856a-3faf59cae3e1-ssh-key-openstack-edpm-ipam\") pod \"ca62ee94-4983-4acc-856a-3faf59cae3e1\" (UID: \"ca62ee94-4983-4acc-856a-3faf59cae3e1\") "
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.049949    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca62ee94-4983-4acc-856a-3faf59cae3e1-inventory\") pod \"ca62ee94-4983-4acc-856a-3faf59cae3e1\" (UID: \"ca62ee94-4983-4acc-856a-3faf59cae3e1\") "
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.054623    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca62ee94-4983-4acc-856a-3faf59cae3e1-kube-api-access-lkgt4" (OuterVolumeSpecName: "kube-api-access-lkgt4") pod "ca62ee94-4983-4acc-856a-3faf59cae3e1" (UID: "ca62ee94-4983-4acc-856a-3faf59cae3e1"). InnerVolumeSpecName "kube-api-access-lkgt4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.082463    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca62ee94-4983-4acc-856a-3faf59cae3e1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ca62ee94-4983-4acc-856a-3faf59cae3e1" (UID: "ca62ee94-4983-4acc-856a-3faf59cae3e1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.086574    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca62ee94-4983-4acc-856a-3faf59cae3e1-inventory" (OuterVolumeSpecName: "inventory") pod "ca62ee94-4983-4acc-856a-3faf59cae3e1" (UID: "ca62ee94-4983-4acc-856a-3faf59cae3e1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.152293    4730 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca62ee94-4983-4acc-856a-3faf59cae3e1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.152330    4730 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca62ee94-4983-4acc-856a-3faf59cae3e1-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.152343    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkgt4\" (UniqueName: \"kubernetes.io/projected/ca62ee94-4983-4acc-856a-3faf59cae3e1-kube-api-access-lkgt4\") on node \"crc\" DevicePath \"\""
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.490925    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dff2j" event={"ID":"ca62ee94-4983-4acc-856a-3faf59cae3e1","Type":"ContainerDied","Data":"538df20e0c75ebd57cd2833241a13b43747c6f23a8c6a93077516d777fb91e76"}
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.490963    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="538df20e0c75ebd57cd2833241a13b43747c6f23a8c6a93077516d777fb91e76"
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.491009    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dff2j"
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.572950    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7pnjt"]
Mar 20 16:12:17 crc kubenswrapper[4730]: E0320 16:12:17.573457    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28beb66f-2a64-4bcf-94eb-676ef7f1236a" containerName="oc"
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.573478    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="28beb66f-2a64-4bcf-94eb-676ef7f1236a" containerName="oc"
Mar 20 16:12:17 crc kubenswrapper[4730]: E0320 16:12:17.573520    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca62ee94-4983-4acc-856a-3faf59cae3e1" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.573532    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca62ee94-4983-4acc-856a-3faf59cae3e1" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.573766    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca62ee94-4983-4acc-856a-3faf59cae3e1" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.573795    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="28beb66f-2a64-4bcf-94eb-676ef7f1236a" containerName="oc"
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.574641    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7pnjt"]
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.574740    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7pnjt"
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.577312    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vvsxx"
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.577663    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.577906    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.578632    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.664333    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7pnjt\" (UID: \"cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7pnjt"
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.664732    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7pnjt\" (UID: \"cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7pnjt"
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.664758    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jghq5\" (UniqueName: \"kubernetes.io/projected/cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122-kube-api-access-jghq5\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7pnjt\" (UID: \"cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7pnjt"
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.766625    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7pnjt\" (UID: \"cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7pnjt"
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.766696    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jghq5\" (UniqueName: \"kubernetes.io/projected/cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122-kube-api-access-jghq5\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7pnjt\" (UID: \"cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7pnjt"
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.766839    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7pnjt\" (UID: \"cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7pnjt"
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.772572    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7pnjt\" (UID: \"cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7pnjt"
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.783028    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7pnjt\" (UID: \"cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7pnjt"
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.785848    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jghq5\" (UniqueName: \"kubernetes.io/projected/cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122-kube-api-access-jghq5\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7pnjt\" (UID: \"cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7pnjt"
Mar 20 16:12:17 crc kubenswrapper[4730]: I0320 16:12:17.917943    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7pnjt"
Mar 20 16:12:18 crc kubenswrapper[4730]: I0320 16:12:18.432036    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7pnjt"]
Mar 20 16:12:18 crc kubenswrapper[4730]: I0320 16:12:18.501053    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7pnjt" event={"ID":"cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122","Type":"ContainerStarted","Data":"ba38e4aed467787dbc07075fa373dd244be411bd464608954c22c323b546253a"}
Mar 20 16:12:20 crc kubenswrapper[4730]: I0320 16:12:20.523229    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7pnjt" event={"ID":"cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122","Type":"ContainerStarted","Data":"ea1db0887cceb3db382e0f4b2a1444525bccb60f8caf78058e5d02523c9362c1"}
Mar 20 16:12:20 crc kubenswrapper[4730]: I0320 16:12:20.547563    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7pnjt" podStartSLOduration=2.705316993 podStartE2EDuration="3.547538477s" podCreationTimestamp="2026-03-20 16:12:17 +0000 UTC" firstStartedPulling="2026-03-20 16:12:18.442732538 +0000 UTC m=+1997.656103907" lastFinishedPulling="2026-03-20 16:12:19.284954022 +0000 UTC m=+1998.498325391" observedRunningTime="2026-03-20 16:12:20.54555456 +0000 UTC m=+1999.758925929" watchObservedRunningTime="2026-03-20 16:12:20.547538477 +0000 UTC m=+1999.760909876"
Mar 20 16:12:22 crc kubenswrapper[4730]: I0320 16:12:22.029706    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-f7rjc"]
Mar 20 16:12:22 crc kubenswrapper[4730]: I0320 16:12:22.041069    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-f7rjc"]
Mar 20 16:12:23 crc kubenswrapper[4730]: I0320 16:12:23.544053    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f144e50-8d18-49a5-a3ef-84b72e6e119f" path="/var/lib/kubelet/pods/8f144e50-8d18-49a5-a3ef-84b72e6e119f/volumes"
Mar 20 16:12:24 crc kubenswrapper[4730]: I0320 16:12:24.533966    4730 scope.go:117] "RemoveContainer" containerID="4a5122c368d04f6b91f59fb7f1f1be3d99580fb45609b2315a001469955ae120"
Mar 20 16:12:24 crc kubenswrapper[4730]: E0320 16:12:24.534748    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:12:24 crc kubenswrapper[4730]: I0320 16:12:24.566665    4730 generic.go:334] "Generic (PLEG): container finished" podID="cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122" containerID="ea1db0887cceb3db382e0f4b2a1444525bccb60f8caf78058e5d02523c9362c1" exitCode=0
Mar 20 16:12:24 crc kubenswrapper[4730]: I0320 16:12:24.566721    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7pnjt" event={"ID":"cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122","Type":"ContainerDied","Data":"ea1db0887cceb3db382e0f4b2a1444525bccb60f8caf78058e5d02523c9362c1"}
Mar 20 16:12:26 crc kubenswrapper[4730]: I0320 16:12:26.034985    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7pnjt"
Mar 20 16:12:26 crc kubenswrapper[4730]: I0320 16:12:26.155561    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122-inventory\") pod \"cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122\" (UID: \"cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122\") "
Mar 20 16:12:26 crc kubenswrapper[4730]: I0320 16:12:26.155670    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jghq5\" (UniqueName: \"kubernetes.io/projected/cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122-kube-api-access-jghq5\") pod \"cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122\" (UID: \"cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122\") "
Mar 20 16:12:26 crc kubenswrapper[4730]: I0320 16:12:26.155749    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122-ssh-key-openstack-edpm-ipam\") pod \"cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122\" (UID: \"cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122\") "
Mar 20 16:12:26 crc kubenswrapper[4730]: I0320 16:12:26.169102    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122-kube-api-access-jghq5" (OuterVolumeSpecName: "kube-api-access-jghq5") pod "cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122" (UID: "cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122"). InnerVolumeSpecName "kube-api-access-jghq5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:12:26 crc kubenswrapper[4730]: I0320 16:12:26.183302    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122-inventory" (OuterVolumeSpecName: "inventory") pod "cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122" (UID: "cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:12:26 crc kubenswrapper[4730]: I0320 16:12:26.186229    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122" (UID: "cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:12:26 crc kubenswrapper[4730]: I0320 16:12:26.258049    4730 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 16:12:26 crc kubenswrapper[4730]: I0320 16:12:26.258077    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jghq5\" (UniqueName: \"kubernetes.io/projected/cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122-kube-api-access-jghq5\") on node \"crc\" DevicePath \"\""
Mar 20 16:12:26 crc kubenswrapper[4730]: I0320 16:12:26.258087    4730 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 16:12:26 crc kubenswrapper[4730]: I0320 16:12:26.591907    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7pnjt" event={"ID":"cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122","Type":"ContainerDied","Data":"ba38e4aed467787dbc07075fa373dd244be411bd464608954c22c323b546253a"}
Mar 20 16:12:26 crc kubenswrapper[4730]: I0320 16:12:26.591948    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba38e4aed467787dbc07075fa373dd244be411bd464608954c22c323b546253a"
Mar 20 16:12:26 crc kubenswrapper[4730]: I0320 16:12:26.591991    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7pnjt"
Mar 20 16:12:26 crc kubenswrapper[4730]: I0320 16:12:26.690889    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-dkksx"]
Mar 20 16:12:26 crc kubenswrapper[4730]: E0320 16:12:26.691430    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:12:26 crc kubenswrapper[4730]: I0320 16:12:26.691451    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:12:26 crc kubenswrapper[4730]: I0320 16:12:26.691718    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:12:26 crc kubenswrapper[4730]: I0320 16:12:26.692601    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dkksx"
Mar 20 16:12:26 crc kubenswrapper[4730]: I0320 16:12:26.695341    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 16:12:26 crc kubenswrapper[4730]: I0320 16:12:26.695874    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 16:12:26 crc kubenswrapper[4730]: I0320 16:12:26.695951    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vvsxx"
Mar 20 16:12:26 crc kubenswrapper[4730]: I0320 16:12:26.696176    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 16:12:26 crc kubenswrapper[4730]: I0320 16:12:26.703613    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-dkksx"]
Mar 20 16:12:26 crc kubenswrapper[4730]: I0320 16:12:26.867568    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/133f1969-bed7-44cd-9dac-b9dfaa376515-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dkksx\" (UID: \"133f1969-bed7-44cd-9dac-b9dfaa376515\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dkksx"
Mar 20 16:12:26 crc kubenswrapper[4730]: I0320 16:12:26.867840    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/133f1969-bed7-44cd-9dac-b9dfaa376515-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dkksx\" (UID: \"133f1969-bed7-44cd-9dac-b9dfaa376515\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dkksx"
Mar 20 16:12:26 crc kubenswrapper[4730]: I0320 16:12:26.867911    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txlld\" (UniqueName: \"kubernetes.io/projected/133f1969-bed7-44cd-9dac-b9dfaa376515-kube-api-access-txlld\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dkksx\" (UID: \"133f1969-bed7-44cd-9dac-b9dfaa376515\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dkksx"
Mar 20 16:12:26 crc kubenswrapper[4730]: I0320 16:12:26.970288    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/133f1969-bed7-44cd-9dac-b9dfaa376515-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dkksx\" (UID: \"133f1969-bed7-44cd-9dac-b9dfaa376515\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dkksx"
Mar 20 16:12:26 crc kubenswrapper[4730]: I0320 16:12:26.970719    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/133f1969-bed7-44cd-9dac-b9dfaa376515-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dkksx\" (UID: \"133f1969-bed7-44cd-9dac-b9dfaa376515\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dkksx"
Mar 20 16:12:26 crc kubenswrapper[4730]: I0320 16:12:26.970843    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txlld\" (UniqueName: \"kubernetes.io/projected/133f1969-bed7-44cd-9dac-b9dfaa376515-kube-api-access-txlld\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dkksx\" (UID: \"133f1969-bed7-44cd-9dac-b9dfaa376515\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dkksx"
Mar 20 16:12:26 crc kubenswrapper[4730]: I0320 16:12:26.975521    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/133f1969-bed7-44cd-9dac-b9dfaa376515-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dkksx\" (UID: \"133f1969-bed7-44cd-9dac-b9dfaa376515\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dkksx"
Mar 20 16:12:26 crc kubenswrapper[4730]: I0320 16:12:26.975741    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/133f1969-bed7-44cd-9dac-b9dfaa376515-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dkksx\" (UID: \"133f1969-bed7-44cd-9dac-b9dfaa376515\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dkksx"
Mar 20 16:12:26 crc kubenswrapper[4730]: I0320 16:12:26.987749    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txlld\" (UniqueName: \"kubernetes.io/projected/133f1969-bed7-44cd-9dac-b9dfaa376515-kube-api-access-txlld\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dkksx\" (UID: \"133f1969-bed7-44cd-9dac-b9dfaa376515\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dkksx"
Mar 20 16:12:27 crc kubenswrapper[4730]: I0320 16:12:27.018055    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dkksx"
Mar 20 16:12:27 crc kubenswrapper[4730]: I0320 16:12:27.545791    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-dkksx"]
Mar 20 16:12:27 crc kubenswrapper[4730]: I0320 16:12:27.601138    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dkksx" event={"ID":"133f1969-bed7-44cd-9dac-b9dfaa376515","Type":"ContainerStarted","Data":"e961a9b21d5ba22c9457891c45fc6f0b988237cbd4055dc60985a974e06c6a15"}
Mar 20 16:12:28 crc kubenswrapper[4730]: I0320 16:12:28.067586    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cdhdz"]
Mar 20 16:12:28 crc kubenswrapper[4730]: I0320 16:12:28.082160    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cdhdz"]
Mar 20 16:12:29 crc kubenswrapper[4730]: I0320 16:12:29.547587    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647" path="/var/lib/kubelet/pods/ddb7b5ed-16a0-4fb8-96d3-df3c9c2cd647/volumes"
Mar 20 16:12:29 crc kubenswrapper[4730]: I0320 16:12:29.620736    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dkksx" event={"ID":"133f1969-bed7-44cd-9dac-b9dfaa376515","Type":"ContainerStarted","Data":"c4cdbcf8144688f6a26daaef331c10d18244b2a1a28d3b1d4832f69e23275537"}
Mar 20 16:12:38 crc kubenswrapper[4730]: I0320 16:12:38.533424    4730 scope.go:117] "RemoveContainer" containerID="4a5122c368d04f6b91f59fb7f1f1be3d99580fb45609b2315a001469955ae120"
Mar 20 16:12:38 crc kubenswrapper[4730]: E0320 16:12:38.534389    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:12:48 crc kubenswrapper[4730]: I0320 16:12:48.671433    4730 scope.go:117] "RemoveContainer" containerID="78840c5174380df1ec37853cf5867820b6c6e093e27294e16eb3795a80e9c2a8"
Mar 20 16:12:48 crc kubenswrapper[4730]: I0320 16:12:48.719411    4730 scope.go:117] "RemoveContainer" containerID="6e67353e8a39d519cf4269a9771ca30ae4c8d30443c293283645c96cf02f2776"
Mar 20 16:12:48 crc kubenswrapper[4730]: I0320 16:12:48.768400    4730 scope.go:117] "RemoveContainer" containerID="79f3caf37f32c7308d415a20940a3f1cbb774116c3657a269e41ea28bde4ad32"
Mar 20 16:12:48 crc kubenswrapper[4730]: I0320 16:12:48.817886    4730 scope.go:117] "RemoveContainer" containerID="712419554ee7980049c60af4c6c43298daddafb72a74d5efc70ba46df50bba0e"
Mar 20 16:12:50 crc kubenswrapper[4730]: I0320 16:12:50.533166    4730 scope.go:117] "RemoveContainer" containerID="4a5122c368d04f6b91f59fb7f1f1be3d99580fb45609b2315a001469955ae120"
Mar 20 16:12:50 crc kubenswrapper[4730]: E0320 16:12:50.533795    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:13:04 crc kubenswrapper[4730]: I0320 16:13:04.533163    4730 scope.go:117] "RemoveContainer" containerID="4a5122c368d04f6b91f59fb7f1f1be3d99580fb45609b2315a001469955ae120"
Mar 20 16:13:04 crc kubenswrapper[4730]: E0320 16:13:04.533983    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:13:04 crc kubenswrapper[4730]: I0320 16:13:04.943967    4730 generic.go:334] "Generic (PLEG): container finished" podID="133f1969-bed7-44cd-9dac-b9dfaa376515" containerID="c4cdbcf8144688f6a26daaef331c10d18244b2a1a28d3b1d4832f69e23275537" exitCode=0
Mar 20 16:13:04 crc kubenswrapper[4730]: I0320 16:13:04.944033    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dkksx" event={"ID":"133f1969-bed7-44cd-9dac-b9dfaa376515","Type":"ContainerDied","Data":"c4cdbcf8144688f6a26daaef331c10d18244b2a1a28d3b1d4832f69e23275537"}
Mar 20 16:13:06 crc kubenswrapper[4730]: I0320 16:13:06.050377    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-5j2w4"]
Mar 20 16:13:06 crc kubenswrapper[4730]: I0320 16:13:06.057757    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-5j2w4"]
Mar 20 16:13:06 crc kubenswrapper[4730]: I0320 16:13:06.396559    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dkksx"
Mar 20 16:13:06 crc kubenswrapper[4730]: I0320 16:13:06.590995    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/133f1969-bed7-44cd-9dac-b9dfaa376515-inventory\") pod \"133f1969-bed7-44cd-9dac-b9dfaa376515\" (UID: \"133f1969-bed7-44cd-9dac-b9dfaa376515\") "
Mar 20 16:13:06 crc kubenswrapper[4730]: I0320 16:13:06.591543    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txlld\" (UniqueName: \"kubernetes.io/projected/133f1969-bed7-44cd-9dac-b9dfaa376515-kube-api-access-txlld\") pod \"133f1969-bed7-44cd-9dac-b9dfaa376515\" (UID: \"133f1969-bed7-44cd-9dac-b9dfaa376515\") "
Mar 20 16:13:06 crc kubenswrapper[4730]: I0320 16:13:06.591586    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/133f1969-bed7-44cd-9dac-b9dfaa376515-ssh-key-openstack-edpm-ipam\") pod \"133f1969-bed7-44cd-9dac-b9dfaa376515\" (UID: \"133f1969-bed7-44cd-9dac-b9dfaa376515\") "
Mar 20 16:13:06 crc kubenswrapper[4730]: I0320 16:13:06.605214    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/133f1969-bed7-44cd-9dac-b9dfaa376515-kube-api-access-txlld" (OuterVolumeSpecName: "kube-api-access-txlld") pod "133f1969-bed7-44cd-9dac-b9dfaa376515" (UID: "133f1969-bed7-44cd-9dac-b9dfaa376515"). InnerVolumeSpecName "kube-api-access-txlld". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:13:06 crc kubenswrapper[4730]: I0320 16:13:06.617730    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/133f1969-bed7-44cd-9dac-b9dfaa376515-inventory" (OuterVolumeSpecName: "inventory") pod "133f1969-bed7-44cd-9dac-b9dfaa376515" (UID: "133f1969-bed7-44cd-9dac-b9dfaa376515"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:13:06 crc kubenswrapper[4730]: I0320 16:13:06.624230    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/133f1969-bed7-44cd-9dac-b9dfaa376515-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "133f1969-bed7-44cd-9dac-b9dfaa376515" (UID: "133f1969-bed7-44cd-9dac-b9dfaa376515"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:13:06 crc kubenswrapper[4730]: I0320 16:13:06.693989    4730 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/133f1969-bed7-44cd-9dac-b9dfaa376515-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 16:13:06 crc kubenswrapper[4730]: I0320 16:13:06.694019    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txlld\" (UniqueName: \"kubernetes.io/projected/133f1969-bed7-44cd-9dac-b9dfaa376515-kube-api-access-txlld\") on node \"crc\" DevicePath \"\""
Mar 20 16:13:06 crc kubenswrapper[4730]: I0320 16:13:06.694029    4730 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/133f1969-bed7-44cd-9dac-b9dfaa376515-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 16:13:06 crc kubenswrapper[4730]: I0320 16:13:06.970631    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dkksx" event={"ID":"133f1969-bed7-44cd-9dac-b9dfaa376515","Type":"ContainerDied","Data":"e961a9b21d5ba22c9457891c45fc6f0b988237cbd4055dc60985a974e06c6a15"}
Mar 20 16:13:06 crc kubenswrapper[4730]: I0320 16:13:06.970671    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e961a9b21d5ba22c9457891c45fc6f0b988237cbd4055dc60985a974e06c6a15"
Mar 20 16:13:06 crc kubenswrapper[4730]: I0320 16:13:06.970714    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dkksx"
Mar 20 16:13:07 crc kubenswrapper[4730]: I0320 16:13:07.079234    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m89bj"]
Mar 20 16:13:07 crc kubenswrapper[4730]: E0320 16:13:07.079787    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="133f1969-bed7-44cd-9dac-b9dfaa376515" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:13:07 crc kubenswrapper[4730]: I0320 16:13:07.079805    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="133f1969-bed7-44cd-9dac-b9dfaa376515" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:13:07 crc kubenswrapper[4730]: I0320 16:13:07.080040    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="133f1969-bed7-44cd-9dac-b9dfaa376515" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:13:07 crc kubenswrapper[4730]: I0320 16:13:07.080829    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m89bj"
Mar 20 16:13:07 crc kubenswrapper[4730]: I0320 16:13:07.087994    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 16:13:07 crc kubenswrapper[4730]: I0320 16:13:07.087935    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 16:13:07 crc kubenswrapper[4730]: I0320 16:13:07.088228    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vvsxx"
Mar 20 16:13:07 crc kubenswrapper[4730]: I0320 16:13:07.088346    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 16:13:07 crc kubenswrapper[4730]: I0320 16:13:07.100578    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8c27e63-ebf9-45ff-87b2-4782b20e19e3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m89bj\" (UID: \"a8c27e63-ebf9-45ff-87b2-4782b20e19e3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m89bj"
Mar 20 16:13:07 crc kubenswrapper[4730]: I0320 16:13:07.100636    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5q7k\" (UniqueName: \"kubernetes.io/projected/a8c27e63-ebf9-45ff-87b2-4782b20e19e3-kube-api-access-k5q7k\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m89bj\" (UID: \"a8c27e63-ebf9-45ff-87b2-4782b20e19e3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m89bj"
Mar 20 16:13:07 crc kubenswrapper[4730]: I0320 16:13:07.100692    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a8c27e63-ebf9-45ff-87b2-4782b20e19e3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m89bj\" (UID: \"a8c27e63-ebf9-45ff-87b2-4782b20e19e3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m89bj"
Mar 20 16:13:07 crc kubenswrapper[4730]: I0320 16:13:07.106004    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m89bj"]
Mar 20 16:13:07 crc kubenswrapper[4730]: I0320 16:13:07.203011    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8c27e63-ebf9-45ff-87b2-4782b20e19e3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m89bj\" (UID: \"a8c27e63-ebf9-45ff-87b2-4782b20e19e3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m89bj"
Mar 20 16:13:07 crc kubenswrapper[4730]: I0320 16:13:07.203083    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5q7k\" (UniqueName: \"kubernetes.io/projected/a8c27e63-ebf9-45ff-87b2-4782b20e19e3-kube-api-access-k5q7k\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m89bj\" (UID: \"a8c27e63-ebf9-45ff-87b2-4782b20e19e3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m89bj"
Mar 20 16:13:07 crc kubenswrapper[4730]: I0320 16:13:07.203169    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a8c27e63-ebf9-45ff-87b2-4782b20e19e3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m89bj\" (UID: \"a8c27e63-ebf9-45ff-87b2-4782b20e19e3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m89bj"
Mar 20 16:13:07 crc kubenswrapper[4730]: I0320 16:13:07.206929    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8c27e63-ebf9-45ff-87b2-4782b20e19e3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m89bj\" (UID: \"a8c27e63-ebf9-45ff-87b2-4782b20e19e3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m89bj"
Mar 20 16:13:07 crc kubenswrapper[4730]: I0320 16:13:07.207080    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a8c27e63-ebf9-45ff-87b2-4782b20e19e3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m89bj\" (UID: \"a8c27e63-ebf9-45ff-87b2-4782b20e19e3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m89bj"
Mar 20 16:13:07 crc kubenswrapper[4730]: I0320 16:13:07.222480    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5q7k\" (UniqueName: \"kubernetes.io/projected/a8c27e63-ebf9-45ff-87b2-4782b20e19e3-kube-api-access-k5q7k\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m89bj\" (UID: \"a8c27e63-ebf9-45ff-87b2-4782b20e19e3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m89bj"
Mar 20 16:13:07 crc kubenswrapper[4730]: I0320 16:13:07.402134    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m89bj"
Mar 20 16:13:07 crc kubenswrapper[4730]: I0320 16:13:07.546818    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6941d556-3020-4344-b185-5d79cf68187c" path="/var/lib/kubelet/pods/6941d556-3020-4344-b185-5d79cf68187c/volumes"
Mar 20 16:13:07 crc kubenswrapper[4730]: I0320 16:13:07.718642    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m89bj"]
Mar 20 16:13:07 crc kubenswrapper[4730]: I0320 16:13:07.980615    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m89bj" event={"ID":"a8c27e63-ebf9-45ff-87b2-4782b20e19e3","Type":"ContainerStarted","Data":"01d39be61543c09265f967d5b7ada582d06aa7ac17db1c635eb6ef1c127d2c54"}
Mar 20 16:13:08 crc kubenswrapper[4730]: I0320 16:13:08.991742    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m89bj" event={"ID":"a8c27e63-ebf9-45ff-87b2-4782b20e19e3","Type":"ContainerStarted","Data":"f5e9f1ce126d736eed319722f72b288bfa7b28dcea13acceb2dec38e7da569bc"}
Mar 20 16:13:09 crc kubenswrapper[4730]: I0320 16:13:09.016146    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m89bj" podStartSLOduration=1.492085203 podStartE2EDuration="2.016131757s" podCreationTimestamp="2026-03-20 16:13:07 +0000 UTC" firstStartedPulling="2026-03-20 16:13:07.721547067 +0000 UTC m=+2046.934918436" lastFinishedPulling="2026-03-20 16:13:08.245593591 +0000 UTC m=+2047.458964990" observedRunningTime="2026-03-20 16:13:09.01442872 +0000 UTC m=+2048.227800089" watchObservedRunningTime="2026-03-20 16:13:09.016131757 +0000 UTC m=+2048.229503126"
Mar 20 16:13:18 crc kubenswrapper[4730]: I0320 16:13:18.533411    4730 scope.go:117] "RemoveContainer" containerID="4a5122c368d04f6b91f59fb7f1f1be3d99580fb45609b2315a001469955ae120"
Mar 20 16:13:18 crc kubenswrapper[4730]: E0320 16:13:18.534355    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:13:32 crc kubenswrapper[4730]: I0320 16:13:32.346053    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cf7jm"]
Mar 20 16:13:32 crc kubenswrapper[4730]: I0320 16:13:32.348669    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cf7jm"
Mar 20 16:13:32 crc kubenswrapper[4730]: I0320 16:13:32.360827    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cf7jm"]
Mar 20 16:13:32 crc kubenswrapper[4730]: I0320 16:13:32.425835    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04ed0a2f-539e-4f9e-b106-6012931ca0f2-catalog-content\") pod \"redhat-marketplace-cf7jm\" (UID: \"04ed0a2f-539e-4f9e-b106-6012931ca0f2\") " pod="openshift-marketplace/redhat-marketplace-cf7jm"
Mar 20 16:13:32 crc kubenswrapper[4730]: I0320 16:13:32.425955    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04ed0a2f-539e-4f9e-b106-6012931ca0f2-utilities\") pod \"redhat-marketplace-cf7jm\" (UID: \"04ed0a2f-539e-4f9e-b106-6012931ca0f2\") " pod="openshift-marketplace/redhat-marketplace-cf7jm"
Mar 20 16:13:32 crc kubenswrapper[4730]: I0320 16:13:32.426178    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6t25\" (UniqueName: \"kubernetes.io/projected/04ed0a2f-539e-4f9e-b106-6012931ca0f2-kube-api-access-r6t25\") pod \"redhat-marketplace-cf7jm\" (UID: \"04ed0a2f-539e-4f9e-b106-6012931ca0f2\") " pod="openshift-marketplace/redhat-marketplace-cf7jm"
Mar 20 16:13:32 crc kubenswrapper[4730]: I0320 16:13:32.527552    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04ed0a2f-539e-4f9e-b106-6012931ca0f2-utilities\") pod \"redhat-marketplace-cf7jm\" (UID: \"04ed0a2f-539e-4f9e-b106-6012931ca0f2\") " pod="openshift-marketplace/redhat-marketplace-cf7jm"
Mar 20 16:13:32 crc kubenswrapper[4730]: I0320 16:13:32.527681    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6t25\" (UniqueName: \"kubernetes.io/projected/04ed0a2f-539e-4f9e-b106-6012931ca0f2-kube-api-access-r6t25\") pod \"redhat-marketplace-cf7jm\" (UID: \"04ed0a2f-539e-4f9e-b106-6012931ca0f2\") " pod="openshift-marketplace/redhat-marketplace-cf7jm"
Mar 20 16:13:32 crc kubenswrapper[4730]: I0320 16:13:32.527778    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04ed0a2f-539e-4f9e-b106-6012931ca0f2-catalog-content\") pod \"redhat-marketplace-cf7jm\" (UID: \"04ed0a2f-539e-4f9e-b106-6012931ca0f2\") " pod="openshift-marketplace/redhat-marketplace-cf7jm"
Mar 20 16:13:32 crc kubenswrapper[4730]: I0320 16:13:32.528135    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04ed0a2f-539e-4f9e-b106-6012931ca0f2-utilities\") pod \"redhat-marketplace-cf7jm\" (UID: \"04ed0a2f-539e-4f9e-b106-6012931ca0f2\") " pod="openshift-marketplace/redhat-marketplace-cf7jm"
Mar 20 16:13:32 crc kubenswrapper[4730]: I0320 16:13:32.528151    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04ed0a2f-539e-4f9e-b106-6012931ca0f2-catalog-content\") pod \"redhat-marketplace-cf7jm\" (UID: \"04ed0a2f-539e-4f9e-b106-6012931ca0f2\") " pod="openshift-marketplace/redhat-marketplace-cf7jm"
Mar 20 16:13:32 crc kubenswrapper[4730]: I0320 16:13:32.548044    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6t25\" (UniqueName: \"kubernetes.io/projected/04ed0a2f-539e-4f9e-b106-6012931ca0f2-kube-api-access-r6t25\") pod \"redhat-marketplace-cf7jm\" (UID: \"04ed0a2f-539e-4f9e-b106-6012931ca0f2\") " pod="openshift-marketplace/redhat-marketplace-cf7jm"
Mar 20 16:13:32 crc kubenswrapper[4730]: I0320 16:13:32.671531    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cf7jm"
Mar 20 16:13:32 crc kubenswrapper[4730]: I0320 16:13:32.941611    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kknb6"]
Mar 20 16:13:32 crc kubenswrapper[4730]: I0320 16:13:32.945584    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kknb6"
Mar 20 16:13:32 crc kubenswrapper[4730]: I0320 16:13:32.976734    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kknb6"]
Mar 20 16:13:33 crc kubenswrapper[4730]: I0320 16:13:33.043639    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c64hh\" (UniqueName: \"kubernetes.io/projected/b134de24-b8f7-4727-b32f-82c48b28787c-kube-api-access-c64hh\") pod \"community-operators-kknb6\" (UID: \"b134de24-b8f7-4727-b32f-82c48b28787c\") " pod="openshift-marketplace/community-operators-kknb6"
Mar 20 16:13:33 crc kubenswrapper[4730]: I0320 16:13:33.043836    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b134de24-b8f7-4727-b32f-82c48b28787c-catalog-content\") pod \"community-operators-kknb6\" (UID: \"b134de24-b8f7-4727-b32f-82c48b28787c\") " pod="openshift-marketplace/community-operators-kknb6"
Mar 20 16:13:33 crc kubenswrapper[4730]: I0320 16:13:33.043889    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b134de24-b8f7-4727-b32f-82c48b28787c-utilities\") pod \"community-operators-kknb6\" (UID: \"b134de24-b8f7-4727-b32f-82c48b28787c\") " pod="openshift-marketplace/community-operators-kknb6"
Mar 20 16:13:33 crc kubenswrapper[4730]: I0320 16:13:33.130110    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cf7jm"]
Mar 20 16:13:33 crc kubenswrapper[4730]: I0320 16:13:33.145778    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c64hh\" (UniqueName: \"kubernetes.io/projected/b134de24-b8f7-4727-b32f-82c48b28787c-kube-api-access-c64hh\") pod \"community-operators-kknb6\" (UID: \"b134de24-b8f7-4727-b32f-82c48b28787c\") " pod="openshift-marketplace/community-operators-kknb6"
Mar 20 16:13:33 crc kubenswrapper[4730]: I0320 16:13:33.145964    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b134de24-b8f7-4727-b32f-82c48b28787c-catalog-content\") pod \"community-operators-kknb6\" (UID: \"b134de24-b8f7-4727-b32f-82c48b28787c\") " pod="openshift-marketplace/community-operators-kknb6"
Mar 20 16:13:33 crc kubenswrapper[4730]: I0320 16:13:33.146013    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b134de24-b8f7-4727-b32f-82c48b28787c-utilities\") pod \"community-operators-kknb6\" (UID: \"b134de24-b8f7-4727-b32f-82c48b28787c\") " pod="openshift-marketplace/community-operators-kknb6"
Mar 20 16:13:33 crc kubenswrapper[4730]: I0320 16:13:33.146601    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b134de24-b8f7-4727-b32f-82c48b28787c-catalog-content\") pod \"community-operators-kknb6\" (UID: \"b134de24-b8f7-4727-b32f-82c48b28787c\") " pod="openshift-marketplace/community-operators-kknb6"
Mar 20 16:13:33 crc kubenswrapper[4730]: I0320 16:13:33.146685    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b134de24-b8f7-4727-b32f-82c48b28787c-utilities\") pod \"community-operators-kknb6\" (UID: \"b134de24-b8f7-4727-b32f-82c48b28787c\") " pod="openshift-marketplace/community-operators-kknb6"
Mar 20 16:13:33 crc kubenswrapper[4730]: I0320 16:13:33.172303    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c64hh\" (UniqueName: \"kubernetes.io/projected/b134de24-b8f7-4727-b32f-82c48b28787c-kube-api-access-c64hh\") pod \"community-operators-kknb6\" (UID: \"b134de24-b8f7-4727-b32f-82c48b28787c\") " pod="openshift-marketplace/community-operators-kknb6"
Mar 20 16:13:33 crc kubenswrapper[4730]: I0320 16:13:33.237739    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cf7jm" event={"ID":"04ed0a2f-539e-4f9e-b106-6012931ca0f2","Type":"ContainerStarted","Data":"b4039747777cef975b706863c807a20fc0909a15358be96231a91bbe7aaf26bc"}
Mar 20 16:13:33 crc kubenswrapper[4730]: I0320 16:13:33.274282    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kknb6"
Mar 20 16:13:33 crc kubenswrapper[4730]: I0320 16:13:33.536529    4730 scope.go:117] "RemoveContainer" containerID="4a5122c368d04f6b91f59fb7f1f1be3d99580fb45609b2315a001469955ae120"
Mar 20 16:13:33 crc kubenswrapper[4730]: E0320 16:13:33.537681    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:13:33 crc kubenswrapper[4730]: W0320 16:13:33.783271    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb134de24_b8f7_4727_b32f_82c48b28787c.slice/crio-a676f4331dd426611b34d9f9540e81297862d80665c28c5215dc7f1301e36ad2 WatchSource:0}: Error finding container a676f4331dd426611b34d9f9540e81297862d80665c28c5215dc7f1301e36ad2: Status 404 returned error can't find the container with id a676f4331dd426611b34d9f9540e81297862d80665c28c5215dc7f1301e36ad2
Mar 20 16:13:33 crc kubenswrapper[4730]: I0320 16:13:33.783404    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kknb6"]
Mar 20 16:13:34 crc kubenswrapper[4730]: I0320 16:13:34.247634    4730 generic.go:334] "Generic (PLEG): container finished" podID="04ed0a2f-539e-4f9e-b106-6012931ca0f2" containerID="329624f3588bc30d04808da83dac1197f8b8d3102b182217d737f7794a07afae" exitCode=0
Mar 20 16:13:34 crc kubenswrapper[4730]: I0320 16:13:34.247709    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cf7jm" event={"ID":"04ed0a2f-539e-4f9e-b106-6012931ca0f2","Type":"ContainerDied","Data":"329624f3588bc30d04808da83dac1197f8b8d3102b182217d737f7794a07afae"}
Mar 20 16:13:34 crc kubenswrapper[4730]: I0320 16:13:34.252010    4730 generic.go:334] "Generic (PLEG): container finished" podID="b134de24-b8f7-4727-b32f-82c48b28787c" containerID="9ea9db45b55ad8d6732f20392de0be9c4973c0210527a27e6f448c63676275c8" exitCode=0
Mar 20 16:13:34 crc kubenswrapper[4730]: I0320 16:13:34.252055    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kknb6" event={"ID":"b134de24-b8f7-4727-b32f-82c48b28787c","Type":"ContainerDied","Data":"9ea9db45b55ad8d6732f20392de0be9c4973c0210527a27e6f448c63676275c8"}
Mar 20 16:13:34 crc kubenswrapper[4730]: I0320 16:13:34.252088    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kknb6" event={"ID":"b134de24-b8f7-4727-b32f-82c48b28787c","Type":"ContainerStarted","Data":"a676f4331dd426611b34d9f9540e81297862d80665c28c5215dc7f1301e36ad2"}
Mar 20 16:13:35 crc kubenswrapper[4730]: I0320 16:13:35.267362    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kknb6" event={"ID":"b134de24-b8f7-4727-b32f-82c48b28787c","Type":"ContainerStarted","Data":"6c36cedbc8afbe16b472f5fbd77a0dd22f8e529944cc3bfadfb6feb49119485f"}
Mar 20 16:13:35 crc kubenswrapper[4730]: I0320 16:13:35.277847    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cf7jm" event={"ID":"04ed0a2f-539e-4f9e-b106-6012931ca0f2","Type":"ContainerStarted","Data":"478d1155c51193816a900b0e29ac07329d923136e4d365fde22bb334a413a294"}
Mar 20 16:13:35 crc kubenswrapper[4730]: I0320 16:13:35.347602    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-smz8t"]
Mar 20 16:13:35 crc kubenswrapper[4730]: I0320 16:13:35.350333    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-smz8t"
Mar 20 16:13:35 crc kubenswrapper[4730]: I0320 16:13:35.357304    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-smz8t"]
Mar 20 16:13:35 crc kubenswrapper[4730]: I0320 16:13:35.392069    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a30ccad-6262-4840-bab0-7c70cce5c54e-utilities\") pod \"certified-operators-smz8t\" (UID: \"8a30ccad-6262-4840-bab0-7c70cce5c54e\") " pod="openshift-marketplace/certified-operators-smz8t"
Mar 20 16:13:35 crc kubenswrapper[4730]: I0320 16:13:35.392159    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcg8l\" (UniqueName: \"kubernetes.io/projected/8a30ccad-6262-4840-bab0-7c70cce5c54e-kube-api-access-tcg8l\") pod \"certified-operators-smz8t\" (UID: \"8a30ccad-6262-4840-bab0-7c70cce5c54e\") " pod="openshift-marketplace/certified-operators-smz8t"
Mar 20 16:13:35 crc kubenswrapper[4730]: I0320 16:13:35.392258    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a30ccad-6262-4840-bab0-7c70cce5c54e-catalog-content\") pod \"certified-operators-smz8t\" (UID: \"8a30ccad-6262-4840-bab0-7c70cce5c54e\") " pod="openshift-marketplace/certified-operators-smz8t"
Mar 20 16:13:35 crc kubenswrapper[4730]: I0320 16:13:35.493882    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a30ccad-6262-4840-bab0-7c70cce5c54e-catalog-content\") pod \"certified-operators-smz8t\" (UID: \"8a30ccad-6262-4840-bab0-7c70cce5c54e\") " pod="openshift-marketplace/certified-operators-smz8t"
Mar 20 16:13:35 crc kubenswrapper[4730]: I0320 16:13:35.494370    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a30ccad-6262-4840-bab0-7c70cce5c54e-utilities\") pod \"certified-operators-smz8t\" (UID: \"8a30ccad-6262-4840-bab0-7c70cce5c54e\") " pod="openshift-marketplace/certified-operators-smz8t"
Mar 20 16:13:35 crc kubenswrapper[4730]: I0320 16:13:35.494388    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a30ccad-6262-4840-bab0-7c70cce5c54e-catalog-content\") pod \"certified-operators-smz8t\" (UID: \"8a30ccad-6262-4840-bab0-7c70cce5c54e\") " pod="openshift-marketplace/certified-operators-smz8t"
Mar 20 16:13:35 crc kubenswrapper[4730]: I0320 16:13:35.494492    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcg8l\" (UniqueName: \"kubernetes.io/projected/8a30ccad-6262-4840-bab0-7c70cce5c54e-kube-api-access-tcg8l\") pod \"certified-operators-smz8t\" (UID: \"8a30ccad-6262-4840-bab0-7c70cce5c54e\") " pod="openshift-marketplace/certified-operators-smz8t"
Mar 20 16:13:35 crc kubenswrapper[4730]: I0320 16:13:35.494641    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a30ccad-6262-4840-bab0-7c70cce5c54e-utilities\") pod \"certified-operators-smz8t\" (UID: \"8a30ccad-6262-4840-bab0-7c70cce5c54e\") " pod="openshift-marketplace/certified-operators-smz8t"
Mar 20 16:13:35 crc kubenswrapper[4730]: I0320 16:13:35.514206    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcg8l\" (UniqueName: \"kubernetes.io/projected/8a30ccad-6262-4840-bab0-7c70cce5c54e-kube-api-access-tcg8l\") pod \"certified-operators-smz8t\" (UID: \"8a30ccad-6262-4840-bab0-7c70cce5c54e\") " pod="openshift-marketplace/certified-operators-smz8t"
Mar 20 16:13:35 crc kubenswrapper[4730]: I0320 16:13:35.716735    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-smz8t"
Mar 20 16:13:36 crc kubenswrapper[4730]: I0320 16:13:36.043126    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-smz8t"]
Mar 20 16:13:36 crc kubenswrapper[4730]: I0320 16:13:36.287803    4730 generic.go:334] "Generic (PLEG): container finished" podID="b134de24-b8f7-4727-b32f-82c48b28787c" containerID="6c36cedbc8afbe16b472f5fbd77a0dd22f8e529944cc3bfadfb6feb49119485f" exitCode=0
Mar 20 16:13:36 crc kubenswrapper[4730]: I0320 16:13:36.287891    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kknb6" event={"ID":"b134de24-b8f7-4727-b32f-82c48b28787c","Type":"ContainerDied","Data":"6c36cedbc8afbe16b472f5fbd77a0dd22f8e529944cc3bfadfb6feb49119485f"}
Mar 20 16:13:36 crc kubenswrapper[4730]: I0320 16:13:36.289648    4730 generic.go:334] "Generic (PLEG): container finished" podID="8a30ccad-6262-4840-bab0-7c70cce5c54e" containerID="695ec40a8d4727ad133ecc0b1c4d07769a4ff2a01a7d067cb8b42c4ff5501acd" exitCode=0
Mar 20 16:13:36 crc kubenswrapper[4730]: I0320 16:13:36.289709    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-smz8t" event={"ID":"8a30ccad-6262-4840-bab0-7c70cce5c54e","Type":"ContainerDied","Data":"695ec40a8d4727ad133ecc0b1c4d07769a4ff2a01a7d067cb8b42c4ff5501acd"}
Mar 20 16:13:36 crc kubenswrapper[4730]: I0320 16:13:36.289731    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-smz8t" event={"ID":"8a30ccad-6262-4840-bab0-7c70cce5c54e","Type":"ContainerStarted","Data":"000e4fc2d8a82235e0aca159d20f3965efb17246205b4d4c27e700483ac7437b"}
Mar 20 16:13:36 crc kubenswrapper[4730]: I0320 16:13:36.292143    4730 generic.go:334] "Generic (PLEG): container finished" podID="04ed0a2f-539e-4f9e-b106-6012931ca0f2" containerID="478d1155c51193816a900b0e29ac07329d923136e4d365fde22bb334a413a294" exitCode=0
Mar 20 16:13:36 crc kubenswrapper[4730]: I0320 16:13:36.292170    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cf7jm" event={"ID":"04ed0a2f-539e-4f9e-b106-6012931ca0f2","Type":"ContainerDied","Data":"478d1155c51193816a900b0e29ac07329d923136e4d365fde22bb334a413a294"}
Mar 20 16:13:37 crc kubenswrapper[4730]: I0320 16:13:37.302999    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-smz8t" event={"ID":"8a30ccad-6262-4840-bab0-7c70cce5c54e","Type":"ContainerStarted","Data":"299f8ee9143efa4f898b30f6780be69975c00a31d3fe75022db1b4bdfde1ce2e"}
Mar 20 16:13:37 crc kubenswrapper[4730]: I0320 16:13:37.307402    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cf7jm" event={"ID":"04ed0a2f-539e-4f9e-b106-6012931ca0f2","Type":"ContainerStarted","Data":"8be767447f54495978c4a0af427a61695bb8c8d39e01bf3e3cf8a755068b39ad"}
Mar 20 16:13:37 crc kubenswrapper[4730]: I0320 16:13:37.352224    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cf7jm" podStartSLOduration=2.6395005830000002 podStartE2EDuration="5.352204427s" podCreationTimestamp="2026-03-20 16:13:32 +0000 UTC" firstStartedPulling="2026-03-20 16:13:34.250223694 +0000 UTC m=+2073.463595093" lastFinishedPulling="2026-03-20 16:13:36.962927568 +0000 UTC m=+2076.176298937" observedRunningTime="2026-03-20 16:13:37.348365431 +0000 UTC m=+2076.561736820" watchObservedRunningTime="2026-03-20 16:13:37.352204427 +0000 UTC m=+2076.565575806"
Mar 20 16:13:38 crc kubenswrapper[4730]: I0320 16:13:38.317171    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kknb6" event={"ID":"b134de24-b8f7-4727-b32f-82c48b28787c","Type":"ContainerStarted","Data":"df6f84b7aedb1f807fdbfc4478ea27c0ba29ede6e07cc494eb8ff2f4f0f2becc"}
Mar 20 16:13:38 crc kubenswrapper[4730]: I0320 16:13:38.319888    4730 generic.go:334] "Generic (PLEG): container finished" podID="8a30ccad-6262-4840-bab0-7c70cce5c54e" containerID="299f8ee9143efa4f898b30f6780be69975c00a31d3fe75022db1b4bdfde1ce2e" exitCode=0
Mar 20 16:13:38 crc kubenswrapper[4730]: I0320 16:13:38.320965    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-smz8t" event={"ID":"8a30ccad-6262-4840-bab0-7c70cce5c54e","Type":"ContainerDied","Data":"299f8ee9143efa4f898b30f6780be69975c00a31d3fe75022db1b4bdfde1ce2e"}
Mar 20 16:13:38 crc kubenswrapper[4730]: I0320 16:13:38.341528    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kknb6" podStartSLOduration=3.377736151 podStartE2EDuration="6.341509934s" podCreationTimestamp="2026-03-20 16:13:32 +0000 UTC" firstStartedPulling="2026-03-20 16:13:34.253804103 +0000 UTC m=+2073.467175472" lastFinishedPulling="2026-03-20 16:13:37.217577896 +0000 UTC m=+2076.430949255" observedRunningTime="2026-03-20 16:13:38.33416931 +0000 UTC m=+2077.547540679" watchObservedRunningTime="2026-03-20 16:13:38.341509934 +0000 UTC m=+2077.554881303"
Mar 20 16:13:39 crc kubenswrapper[4730]: I0320 16:13:39.329225    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-smz8t" event={"ID":"8a30ccad-6262-4840-bab0-7c70cce5c54e","Type":"ContainerStarted","Data":"4c8596c65726e22fe600b5967470f2bf5449f3246bf3d726509883465d6ebbde"}
Mar 20 16:13:39 crc kubenswrapper[4730]: I0320 16:13:39.349448    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-smz8t" podStartSLOduration=1.6895445310000001 podStartE2EDuration="4.349428527s" podCreationTimestamp="2026-03-20 16:13:35 +0000 UTC" firstStartedPulling="2026-03-20 16:13:36.291175739 +0000 UTC m=+2075.504547108" lastFinishedPulling="2026-03-20 16:13:38.951059735 +0000 UTC m=+2078.164431104" observedRunningTime="2026-03-20 16:13:39.346089334 +0000 UTC m=+2078.559460693" watchObservedRunningTime="2026-03-20 16:13:39.349428527 +0000 UTC m=+2078.562799896"
Mar 20 16:13:42 crc kubenswrapper[4730]: I0320 16:13:42.672040    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cf7jm"
Mar 20 16:13:42 crc kubenswrapper[4730]: I0320 16:13:42.673496    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cf7jm"
Mar 20 16:13:42 crc kubenswrapper[4730]: I0320 16:13:42.741791    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cf7jm"
Mar 20 16:13:43 crc kubenswrapper[4730]: I0320 16:13:43.274830    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kknb6"
Mar 20 16:13:43 crc kubenswrapper[4730]: I0320 16:13:43.274937    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kknb6"
Mar 20 16:13:43 crc kubenswrapper[4730]: I0320 16:13:43.332172    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kknb6"
Mar 20 16:13:43 crc kubenswrapper[4730]: I0320 16:13:43.407694    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kknb6"
Mar 20 16:13:43 crc kubenswrapper[4730]: I0320 16:13:43.421033    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cf7jm"
Mar 20 16:13:44 crc kubenswrapper[4730]: I0320 16:13:44.533373    4730 scope.go:117] "RemoveContainer" containerID="4a5122c368d04f6b91f59fb7f1f1be3d99580fb45609b2315a001469955ae120"
Mar 20 16:13:45 crc kubenswrapper[4730]: I0320 16:13:45.389426    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerStarted","Data":"b7dfce64faa161154e6afb28ae3e7685cc4caead60043a4bb51030ee7d13fb31"}
Mar 20 16:13:45 crc kubenswrapper[4730]: I0320 16:13:45.531101    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kknb6"]
Mar 20 16:13:45 crc kubenswrapper[4730]: I0320 16:13:45.531676    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kknb6" podUID="b134de24-b8f7-4727-b32f-82c48b28787c" containerName="registry-server" containerID="cri-o://df6f84b7aedb1f807fdbfc4478ea27c0ba29ede6e07cc494eb8ff2f4f0f2becc" gracePeriod=2
Mar 20 16:13:45 crc kubenswrapper[4730]: I0320 16:13:45.716934    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-smz8t"
Mar 20 16:13:45 crc kubenswrapper[4730]: I0320 16:13:45.718419    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-smz8t"
Mar 20 16:13:45 crc kubenswrapper[4730]: I0320 16:13:45.732142    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cf7jm"]
Mar 20 16:13:45 crc kubenswrapper[4730]: I0320 16:13:45.766369    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-smz8t"
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.007441    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kknb6"
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.100099    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b134de24-b8f7-4727-b32f-82c48b28787c-utilities\") pod \"b134de24-b8f7-4727-b32f-82c48b28787c\" (UID: \"b134de24-b8f7-4727-b32f-82c48b28787c\") "
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.100305    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c64hh\" (UniqueName: \"kubernetes.io/projected/b134de24-b8f7-4727-b32f-82c48b28787c-kube-api-access-c64hh\") pod \"b134de24-b8f7-4727-b32f-82c48b28787c\" (UID: \"b134de24-b8f7-4727-b32f-82c48b28787c\") "
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.100354    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b134de24-b8f7-4727-b32f-82c48b28787c-catalog-content\") pod \"b134de24-b8f7-4727-b32f-82c48b28787c\" (UID: \"b134de24-b8f7-4727-b32f-82c48b28787c\") "
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.101346    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b134de24-b8f7-4727-b32f-82c48b28787c-utilities" (OuterVolumeSpecName: "utilities") pod "b134de24-b8f7-4727-b32f-82c48b28787c" (UID: "b134de24-b8f7-4727-b32f-82c48b28787c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.106228    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b134de24-b8f7-4727-b32f-82c48b28787c-kube-api-access-c64hh" (OuterVolumeSpecName: "kube-api-access-c64hh") pod "b134de24-b8f7-4727-b32f-82c48b28787c" (UID: "b134de24-b8f7-4727-b32f-82c48b28787c"). InnerVolumeSpecName "kube-api-access-c64hh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.148439    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b134de24-b8f7-4727-b32f-82c48b28787c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b134de24-b8f7-4727-b32f-82c48b28787c" (UID: "b134de24-b8f7-4727-b32f-82c48b28787c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.202324    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b134de24-b8f7-4727-b32f-82c48b28787c-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.202351    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c64hh\" (UniqueName: \"kubernetes.io/projected/b134de24-b8f7-4727-b32f-82c48b28787c-kube-api-access-c64hh\") on node \"crc\" DevicePath \"\""
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.202362    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b134de24-b8f7-4727-b32f-82c48b28787c-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.400116    4730 generic.go:334] "Generic (PLEG): container finished" podID="b134de24-b8f7-4727-b32f-82c48b28787c" containerID="df6f84b7aedb1f807fdbfc4478ea27c0ba29ede6e07cc494eb8ff2f4f0f2becc" exitCode=0
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.400174    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kknb6"
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.400191    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kknb6" event={"ID":"b134de24-b8f7-4727-b32f-82c48b28787c","Type":"ContainerDied","Data":"df6f84b7aedb1f807fdbfc4478ea27c0ba29ede6e07cc494eb8ff2f4f0f2becc"}
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.400903    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kknb6" event={"ID":"b134de24-b8f7-4727-b32f-82c48b28787c","Type":"ContainerDied","Data":"a676f4331dd426611b34d9f9540e81297862d80665c28c5215dc7f1301e36ad2"}
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.400985    4730 scope.go:117] "RemoveContainer" containerID="df6f84b7aedb1f807fdbfc4478ea27c0ba29ede6e07cc494eb8ff2f4f0f2becc"
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.401654    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cf7jm" podUID="04ed0a2f-539e-4f9e-b106-6012931ca0f2" containerName="registry-server" containerID="cri-o://8be767447f54495978c4a0af427a61695bb8c8d39e01bf3e3cf8a755068b39ad" gracePeriod=2
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.426508    4730 scope.go:117] "RemoveContainer" containerID="6c36cedbc8afbe16b472f5fbd77a0dd22f8e529944cc3bfadfb6feb49119485f"
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.434558    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kknb6"]
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.443603    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kknb6"]
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.462001    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-smz8t"
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.509483    4730 scope.go:117] "RemoveContainer" containerID="9ea9db45b55ad8d6732f20392de0be9c4973c0210527a27e6f448c63676275c8"
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.551868    4730 scope.go:117] "RemoveContainer" containerID="df6f84b7aedb1f807fdbfc4478ea27c0ba29ede6e07cc494eb8ff2f4f0f2becc"
Mar 20 16:13:46 crc kubenswrapper[4730]: E0320 16:13:46.552516    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df6f84b7aedb1f807fdbfc4478ea27c0ba29ede6e07cc494eb8ff2f4f0f2becc\": container with ID starting with df6f84b7aedb1f807fdbfc4478ea27c0ba29ede6e07cc494eb8ff2f4f0f2becc not found: ID does not exist" containerID="df6f84b7aedb1f807fdbfc4478ea27c0ba29ede6e07cc494eb8ff2f4f0f2becc"
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.552549    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df6f84b7aedb1f807fdbfc4478ea27c0ba29ede6e07cc494eb8ff2f4f0f2becc"} err="failed to get container status \"df6f84b7aedb1f807fdbfc4478ea27c0ba29ede6e07cc494eb8ff2f4f0f2becc\": rpc error: code = NotFound desc = could not find container \"df6f84b7aedb1f807fdbfc4478ea27c0ba29ede6e07cc494eb8ff2f4f0f2becc\": container with ID starting with df6f84b7aedb1f807fdbfc4478ea27c0ba29ede6e07cc494eb8ff2f4f0f2becc not found: ID does not exist"
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.552569    4730 scope.go:117] "RemoveContainer" containerID="6c36cedbc8afbe16b472f5fbd77a0dd22f8e529944cc3bfadfb6feb49119485f"
Mar 20 16:13:46 crc kubenswrapper[4730]: E0320 16:13:46.552943    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c36cedbc8afbe16b472f5fbd77a0dd22f8e529944cc3bfadfb6feb49119485f\": container with ID starting with 6c36cedbc8afbe16b472f5fbd77a0dd22f8e529944cc3bfadfb6feb49119485f not found: ID does not exist" containerID="6c36cedbc8afbe16b472f5fbd77a0dd22f8e529944cc3bfadfb6feb49119485f"
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.552976    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c36cedbc8afbe16b472f5fbd77a0dd22f8e529944cc3bfadfb6feb49119485f"} err="failed to get container status \"6c36cedbc8afbe16b472f5fbd77a0dd22f8e529944cc3bfadfb6feb49119485f\": rpc error: code = NotFound desc = could not find container \"6c36cedbc8afbe16b472f5fbd77a0dd22f8e529944cc3bfadfb6feb49119485f\": container with ID starting with 6c36cedbc8afbe16b472f5fbd77a0dd22f8e529944cc3bfadfb6feb49119485f not found: ID does not exist"
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.553001    4730 scope.go:117] "RemoveContainer" containerID="9ea9db45b55ad8d6732f20392de0be9c4973c0210527a27e6f448c63676275c8"
Mar 20 16:13:46 crc kubenswrapper[4730]: E0320 16:13:46.553261    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ea9db45b55ad8d6732f20392de0be9c4973c0210527a27e6f448c63676275c8\": container with ID starting with 9ea9db45b55ad8d6732f20392de0be9c4973c0210527a27e6f448c63676275c8 not found: ID does not exist" containerID="9ea9db45b55ad8d6732f20392de0be9c4973c0210527a27e6f448c63676275c8"
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.553287    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ea9db45b55ad8d6732f20392de0be9c4973c0210527a27e6f448c63676275c8"} err="failed to get container status \"9ea9db45b55ad8d6732f20392de0be9c4973c0210527a27e6f448c63676275c8\": rpc error: code = NotFound desc = could not find container \"9ea9db45b55ad8d6732f20392de0be9c4973c0210527a27e6f448c63676275c8\": container with ID starting with 9ea9db45b55ad8d6732f20392de0be9c4973c0210527a27e6f448c63676275c8 not found: ID does not exist"
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.823526    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cf7jm"
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.915938    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04ed0a2f-539e-4f9e-b106-6012931ca0f2-catalog-content\") pod \"04ed0a2f-539e-4f9e-b106-6012931ca0f2\" (UID: \"04ed0a2f-539e-4f9e-b106-6012931ca0f2\") "
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.916109    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6t25\" (UniqueName: \"kubernetes.io/projected/04ed0a2f-539e-4f9e-b106-6012931ca0f2-kube-api-access-r6t25\") pod \"04ed0a2f-539e-4f9e-b106-6012931ca0f2\" (UID: \"04ed0a2f-539e-4f9e-b106-6012931ca0f2\") "
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.916156    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04ed0a2f-539e-4f9e-b106-6012931ca0f2-utilities\") pod \"04ed0a2f-539e-4f9e-b106-6012931ca0f2\" (UID: \"04ed0a2f-539e-4f9e-b106-6012931ca0f2\") "
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.917084    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04ed0a2f-539e-4f9e-b106-6012931ca0f2-utilities" (OuterVolumeSpecName: "utilities") pod "04ed0a2f-539e-4f9e-b106-6012931ca0f2" (UID: "04ed0a2f-539e-4f9e-b106-6012931ca0f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.923484    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04ed0a2f-539e-4f9e-b106-6012931ca0f2-kube-api-access-r6t25" (OuterVolumeSpecName: "kube-api-access-r6t25") pod "04ed0a2f-539e-4f9e-b106-6012931ca0f2" (UID: "04ed0a2f-539e-4f9e-b106-6012931ca0f2"). InnerVolumeSpecName "kube-api-access-r6t25". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:13:46 crc kubenswrapper[4730]: I0320 16:13:46.950841    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04ed0a2f-539e-4f9e-b106-6012931ca0f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04ed0a2f-539e-4f9e-b106-6012931ca0f2" (UID: "04ed0a2f-539e-4f9e-b106-6012931ca0f2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:13:47 crc kubenswrapper[4730]: I0320 16:13:47.018887    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04ed0a2f-539e-4f9e-b106-6012931ca0f2-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 16:13:47 crc kubenswrapper[4730]: I0320 16:13:47.018925    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6t25\" (UniqueName: \"kubernetes.io/projected/04ed0a2f-539e-4f9e-b106-6012931ca0f2-kube-api-access-r6t25\") on node \"crc\" DevicePath \"\""
Mar 20 16:13:47 crc kubenswrapper[4730]: I0320 16:13:47.018936    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04ed0a2f-539e-4f9e-b106-6012931ca0f2-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:13:47 crc kubenswrapper[4730]: I0320 16:13:47.409659    4730 generic.go:334] "Generic (PLEG): container finished" podID="04ed0a2f-539e-4f9e-b106-6012931ca0f2" containerID="8be767447f54495978c4a0af427a61695bb8c8d39e01bf3e3cf8a755068b39ad" exitCode=0
Mar 20 16:13:47 crc kubenswrapper[4730]: I0320 16:13:47.409729    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cf7jm" event={"ID":"04ed0a2f-539e-4f9e-b106-6012931ca0f2","Type":"ContainerDied","Data":"8be767447f54495978c4a0af427a61695bb8c8d39e01bf3e3cf8a755068b39ad"}
Mar 20 16:13:47 crc kubenswrapper[4730]: I0320 16:13:47.410076    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cf7jm" event={"ID":"04ed0a2f-539e-4f9e-b106-6012931ca0f2","Type":"ContainerDied","Data":"b4039747777cef975b706863c807a20fc0909a15358be96231a91bbe7aaf26bc"}
Mar 20 16:13:47 crc kubenswrapper[4730]: I0320 16:13:47.409763    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cf7jm"
Mar 20 16:13:47 crc kubenswrapper[4730]: I0320 16:13:47.410096    4730 scope.go:117] "RemoveContainer" containerID="8be767447f54495978c4a0af427a61695bb8c8d39e01bf3e3cf8a755068b39ad"
Mar 20 16:13:47 crc kubenswrapper[4730]: I0320 16:13:47.428550    4730 scope.go:117] "RemoveContainer" containerID="478d1155c51193816a900b0e29ac07329d923136e4d365fde22bb334a413a294"
Mar 20 16:13:47 crc kubenswrapper[4730]: I0320 16:13:47.444102    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cf7jm"]
Mar 20 16:13:47 crc kubenswrapper[4730]: I0320 16:13:47.451373    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cf7jm"]
Mar 20 16:13:47 crc kubenswrapper[4730]: I0320 16:13:47.466435    4730 scope.go:117] "RemoveContainer" containerID="329624f3588bc30d04808da83dac1197f8b8d3102b182217d737f7794a07afae"
Mar 20 16:13:47 crc kubenswrapper[4730]: I0320 16:13:47.501269    4730 scope.go:117] "RemoveContainer" containerID="8be767447f54495978c4a0af427a61695bb8c8d39e01bf3e3cf8a755068b39ad"
Mar 20 16:13:47 crc kubenswrapper[4730]: E0320 16:13:47.501848    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8be767447f54495978c4a0af427a61695bb8c8d39e01bf3e3cf8a755068b39ad\": container with ID starting with 8be767447f54495978c4a0af427a61695bb8c8d39e01bf3e3cf8a755068b39ad not found: ID does not exist" containerID="8be767447f54495978c4a0af427a61695bb8c8d39e01bf3e3cf8a755068b39ad"
Mar 20 16:13:47 crc kubenswrapper[4730]: I0320 16:13:47.501893    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8be767447f54495978c4a0af427a61695bb8c8d39e01bf3e3cf8a755068b39ad"} err="failed to get container status \"8be767447f54495978c4a0af427a61695bb8c8d39e01bf3e3cf8a755068b39ad\": rpc error: code = NotFound desc = could not find container \"8be767447f54495978c4a0af427a61695bb8c8d39e01bf3e3cf8a755068b39ad\": container with ID starting with 8be767447f54495978c4a0af427a61695bb8c8d39e01bf3e3cf8a755068b39ad not found: ID does not exist"
Mar 20 16:13:47 crc kubenswrapper[4730]: I0320 16:13:47.501921    4730 scope.go:117] "RemoveContainer" containerID="478d1155c51193816a900b0e29ac07329d923136e4d365fde22bb334a413a294"
Mar 20 16:13:47 crc kubenswrapper[4730]: E0320 16:13:47.502181    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"478d1155c51193816a900b0e29ac07329d923136e4d365fde22bb334a413a294\": container with ID starting with 478d1155c51193816a900b0e29ac07329d923136e4d365fde22bb334a413a294 not found: ID does not exist" containerID="478d1155c51193816a900b0e29ac07329d923136e4d365fde22bb334a413a294"
Mar 20 16:13:47 crc kubenswrapper[4730]: I0320 16:13:47.502206    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"478d1155c51193816a900b0e29ac07329d923136e4d365fde22bb334a413a294"} err="failed to get container status \"478d1155c51193816a900b0e29ac07329d923136e4d365fde22bb334a413a294\": rpc error: code = NotFound desc = could not find container \"478d1155c51193816a900b0e29ac07329d923136e4d365fde22bb334a413a294\": container with ID starting with 478d1155c51193816a900b0e29ac07329d923136e4d365fde22bb334a413a294 not found: ID does not exist"
Mar 20 16:13:47 crc kubenswrapper[4730]: I0320 16:13:47.502220    4730 scope.go:117] "RemoveContainer" containerID="329624f3588bc30d04808da83dac1197f8b8d3102b182217d737f7794a07afae"
Mar 20 16:13:47 crc kubenswrapper[4730]: E0320 16:13:47.502464    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"329624f3588bc30d04808da83dac1197f8b8d3102b182217d737f7794a07afae\": container with ID starting with 329624f3588bc30d04808da83dac1197f8b8d3102b182217d737f7794a07afae not found: ID does not exist" containerID="329624f3588bc30d04808da83dac1197f8b8d3102b182217d737f7794a07afae"
Mar 20 16:13:47 crc kubenswrapper[4730]: I0320 16:13:47.502498    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"329624f3588bc30d04808da83dac1197f8b8d3102b182217d737f7794a07afae"} err="failed to get container status \"329624f3588bc30d04808da83dac1197f8b8d3102b182217d737f7794a07afae\": rpc error: code = NotFound desc = could not find container \"329624f3588bc30d04808da83dac1197f8b8d3102b182217d737f7794a07afae\": container with ID starting with 329624f3588bc30d04808da83dac1197f8b8d3102b182217d737f7794a07afae not found: ID does not exist"
Mar 20 16:13:47 crc kubenswrapper[4730]: I0320 16:13:47.545303    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04ed0a2f-539e-4f9e-b106-6012931ca0f2" path="/var/lib/kubelet/pods/04ed0a2f-539e-4f9e-b106-6012931ca0f2/volumes"
Mar 20 16:13:47 crc kubenswrapper[4730]: I0320 16:13:47.546213    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b134de24-b8f7-4727-b32f-82c48b28787c" path="/var/lib/kubelet/pods/b134de24-b8f7-4727-b32f-82c48b28787c/volumes"
Mar 20 16:13:48 crc kubenswrapper[4730]: I0320 16:13:48.937538    4730 scope.go:117] "RemoveContainer" containerID="c002961b60e7f958b6eac722566b65c8c9c5ccb02bb7acae14d9879bae50e4f2"
Mar 20 16:13:49 crc kubenswrapper[4730]: I0320 16:13:49.929371    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-smz8t"]
Mar 20 16:13:49 crc kubenswrapper[4730]: I0320 16:13:49.929651    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-smz8t" podUID="8a30ccad-6262-4840-bab0-7c70cce5c54e" containerName="registry-server" containerID="cri-o://4c8596c65726e22fe600b5967470f2bf5449f3246bf3d726509883465d6ebbde" gracePeriod=2
Mar 20 16:13:50 crc kubenswrapper[4730]: I0320 16:13:50.350580    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-smz8t"
Mar 20 16:13:50 crc kubenswrapper[4730]: I0320 16:13:50.438780    4730 generic.go:334] "Generic (PLEG): container finished" podID="8a30ccad-6262-4840-bab0-7c70cce5c54e" containerID="4c8596c65726e22fe600b5967470f2bf5449f3246bf3d726509883465d6ebbde" exitCode=0
Mar 20 16:13:50 crc kubenswrapper[4730]: I0320 16:13:50.438818    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-smz8t" event={"ID":"8a30ccad-6262-4840-bab0-7c70cce5c54e","Type":"ContainerDied","Data":"4c8596c65726e22fe600b5967470f2bf5449f3246bf3d726509883465d6ebbde"}
Mar 20 16:13:50 crc kubenswrapper[4730]: I0320 16:13:50.438841    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-smz8t" event={"ID":"8a30ccad-6262-4840-bab0-7c70cce5c54e","Type":"ContainerDied","Data":"000e4fc2d8a82235e0aca159d20f3965efb17246205b4d4c27e700483ac7437b"}
Mar 20 16:13:50 crc kubenswrapper[4730]: I0320 16:13:50.438857    4730 scope.go:117] "RemoveContainer" containerID="4c8596c65726e22fe600b5967470f2bf5449f3246bf3d726509883465d6ebbde"
Mar 20 16:13:50 crc kubenswrapper[4730]: I0320 16:13:50.438952    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-smz8t"
Mar 20 16:13:50 crc kubenswrapper[4730]: I0320 16:13:50.456525    4730 scope.go:117] "RemoveContainer" containerID="299f8ee9143efa4f898b30f6780be69975c00a31d3fe75022db1b4bdfde1ce2e"
Mar 20 16:13:50 crc kubenswrapper[4730]: I0320 16:13:50.474430    4730 scope.go:117] "RemoveContainer" containerID="695ec40a8d4727ad133ecc0b1c4d07769a4ff2a01a7d067cb8b42c4ff5501acd"
Mar 20 16:13:50 crc kubenswrapper[4730]: I0320 16:13:50.479773    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcg8l\" (UniqueName: \"kubernetes.io/projected/8a30ccad-6262-4840-bab0-7c70cce5c54e-kube-api-access-tcg8l\") pod \"8a30ccad-6262-4840-bab0-7c70cce5c54e\" (UID: \"8a30ccad-6262-4840-bab0-7c70cce5c54e\") "
Mar 20 16:13:50 crc kubenswrapper[4730]: I0320 16:13:50.479829    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a30ccad-6262-4840-bab0-7c70cce5c54e-catalog-content\") pod \"8a30ccad-6262-4840-bab0-7c70cce5c54e\" (UID: \"8a30ccad-6262-4840-bab0-7c70cce5c54e\") "
Mar 20 16:13:50 crc kubenswrapper[4730]: I0320 16:13:50.479896    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a30ccad-6262-4840-bab0-7c70cce5c54e-utilities\") pod \"8a30ccad-6262-4840-bab0-7c70cce5c54e\" (UID: \"8a30ccad-6262-4840-bab0-7c70cce5c54e\") "
Mar 20 16:13:50 crc kubenswrapper[4730]: I0320 16:13:50.481649    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a30ccad-6262-4840-bab0-7c70cce5c54e-utilities" (OuterVolumeSpecName: "utilities") pod "8a30ccad-6262-4840-bab0-7c70cce5c54e" (UID: "8a30ccad-6262-4840-bab0-7c70cce5c54e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:13:50 crc kubenswrapper[4730]: I0320 16:13:50.485984    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a30ccad-6262-4840-bab0-7c70cce5c54e-kube-api-access-tcg8l" (OuterVolumeSpecName: "kube-api-access-tcg8l") pod "8a30ccad-6262-4840-bab0-7c70cce5c54e" (UID: "8a30ccad-6262-4840-bab0-7c70cce5c54e"). InnerVolumeSpecName "kube-api-access-tcg8l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:13:50 crc kubenswrapper[4730]: I0320 16:13:50.530568    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a30ccad-6262-4840-bab0-7c70cce5c54e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a30ccad-6262-4840-bab0-7c70cce5c54e" (UID: "8a30ccad-6262-4840-bab0-7c70cce5c54e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:13:50 crc kubenswrapper[4730]: I0320 16:13:50.568287    4730 scope.go:117] "RemoveContainer" containerID="4c8596c65726e22fe600b5967470f2bf5449f3246bf3d726509883465d6ebbde"
Mar 20 16:13:50 crc kubenswrapper[4730]: E0320 16:13:50.568772    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c8596c65726e22fe600b5967470f2bf5449f3246bf3d726509883465d6ebbde\": container with ID starting with 4c8596c65726e22fe600b5967470f2bf5449f3246bf3d726509883465d6ebbde not found: ID does not exist" containerID="4c8596c65726e22fe600b5967470f2bf5449f3246bf3d726509883465d6ebbde"
Mar 20 16:13:50 crc kubenswrapper[4730]: I0320 16:13:50.568824    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c8596c65726e22fe600b5967470f2bf5449f3246bf3d726509883465d6ebbde"} err="failed to get container status \"4c8596c65726e22fe600b5967470f2bf5449f3246bf3d726509883465d6ebbde\": rpc error: code = NotFound desc = could not find container \"4c8596c65726e22fe600b5967470f2bf5449f3246bf3d726509883465d6ebbde\": container with ID starting with 4c8596c65726e22fe600b5967470f2bf5449f3246bf3d726509883465d6ebbde not found: ID does not exist"
Mar 20 16:13:50 crc kubenswrapper[4730]: I0320 16:13:50.568852    4730 scope.go:117] "RemoveContainer" containerID="299f8ee9143efa4f898b30f6780be69975c00a31d3fe75022db1b4bdfde1ce2e"
Mar 20 16:13:50 crc kubenswrapper[4730]: E0320 16:13:50.569590    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"299f8ee9143efa4f898b30f6780be69975c00a31d3fe75022db1b4bdfde1ce2e\": container with ID starting with 299f8ee9143efa4f898b30f6780be69975c00a31d3fe75022db1b4bdfde1ce2e not found: ID does not exist" containerID="299f8ee9143efa4f898b30f6780be69975c00a31d3fe75022db1b4bdfde1ce2e"
Mar 20 16:13:50 crc kubenswrapper[4730]: I0320 16:13:50.569637    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"299f8ee9143efa4f898b30f6780be69975c00a31d3fe75022db1b4bdfde1ce2e"} err="failed to get container status \"299f8ee9143efa4f898b30f6780be69975c00a31d3fe75022db1b4bdfde1ce2e\": rpc error: code = NotFound desc = could not find container \"299f8ee9143efa4f898b30f6780be69975c00a31d3fe75022db1b4bdfde1ce2e\": container with ID starting with 299f8ee9143efa4f898b30f6780be69975c00a31d3fe75022db1b4bdfde1ce2e not found: ID does not exist"
Mar 20 16:13:50 crc kubenswrapper[4730]: I0320 16:13:50.569666    4730 scope.go:117] "RemoveContainer" containerID="695ec40a8d4727ad133ecc0b1c4d07769a4ff2a01a7d067cb8b42c4ff5501acd"
Mar 20 16:13:50 crc kubenswrapper[4730]: E0320 16:13:50.569911    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"695ec40a8d4727ad133ecc0b1c4d07769a4ff2a01a7d067cb8b42c4ff5501acd\": container with ID starting with 695ec40a8d4727ad133ecc0b1c4d07769a4ff2a01a7d067cb8b42c4ff5501acd not found: ID does not exist" containerID="695ec40a8d4727ad133ecc0b1c4d07769a4ff2a01a7d067cb8b42c4ff5501acd"
Mar 20 16:13:50 crc kubenswrapper[4730]: I0320 16:13:50.569938    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"695ec40a8d4727ad133ecc0b1c4d07769a4ff2a01a7d067cb8b42c4ff5501acd"} err="failed to get container status \"695ec40a8d4727ad133ecc0b1c4d07769a4ff2a01a7d067cb8b42c4ff5501acd\": rpc error: code = NotFound desc = could not find container \"695ec40a8d4727ad133ecc0b1c4d07769a4ff2a01a7d067cb8b42c4ff5501acd\": container with ID starting with 695ec40a8d4727ad133ecc0b1c4d07769a4ff2a01a7d067cb8b42c4ff5501acd not found: ID does not exist"
Mar 20 16:13:50 crc kubenswrapper[4730]: I0320 16:13:50.582889    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcg8l\" (UniqueName: \"kubernetes.io/projected/8a30ccad-6262-4840-bab0-7c70cce5c54e-kube-api-access-tcg8l\") on node \"crc\" DevicePath \"\""
Mar 20 16:13:50 crc kubenswrapper[4730]: I0320 16:13:50.582922    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a30ccad-6262-4840-bab0-7c70cce5c54e-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 16:13:50 crc kubenswrapper[4730]: I0320 16:13:50.582934    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a30ccad-6262-4840-bab0-7c70cce5c54e-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:13:50 crc kubenswrapper[4730]: I0320 16:13:50.775901    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-smz8t"]
Mar 20 16:13:50 crc kubenswrapper[4730]: I0320 16:13:50.785076    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-smz8t"]
Mar 20 16:13:51 crc kubenswrapper[4730]: I0320 16:13:51.546044    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a30ccad-6262-4840-bab0-7c70cce5c54e" path="/var/lib/kubelet/pods/8a30ccad-6262-4840-bab0-7c70cce5c54e/volumes"
Mar 20 16:13:56 crc kubenswrapper[4730]: I0320 16:13:56.518680    4730 generic.go:334] "Generic (PLEG): container finished" podID="a8c27e63-ebf9-45ff-87b2-4782b20e19e3" containerID="f5e9f1ce126d736eed319722f72b288bfa7b28dcea13acceb2dec38e7da569bc" exitCode=0
Mar 20 16:13:56 crc kubenswrapper[4730]: I0320 16:13:56.518824    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m89bj" event={"ID":"a8c27e63-ebf9-45ff-87b2-4782b20e19e3","Type":"ContainerDied","Data":"f5e9f1ce126d736eed319722f72b288bfa7b28dcea13acceb2dec38e7da569bc"}
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.051120    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m89bj"
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.132129    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5q7k\" (UniqueName: \"kubernetes.io/projected/a8c27e63-ebf9-45ff-87b2-4782b20e19e3-kube-api-access-k5q7k\") pod \"a8c27e63-ebf9-45ff-87b2-4782b20e19e3\" (UID: \"a8c27e63-ebf9-45ff-87b2-4782b20e19e3\") "
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.132285    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8c27e63-ebf9-45ff-87b2-4782b20e19e3-inventory\") pod \"a8c27e63-ebf9-45ff-87b2-4782b20e19e3\" (UID: \"a8c27e63-ebf9-45ff-87b2-4782b20e19e3\") "
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.132375    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a8c27e63-ebf9-45ff-87b2-4782b20e19e3-ssh-key-openstack-edpm-ipam\") pod \"a8c27e63-ebf9-45ff-87b2-4782b20e19e3\" (UID: \"a8c27e63-ebf9-45ff-87b2-4782b20e19e3\") "
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.138569    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8c27e63-ebf9-45ff-87b2-4782b20e19e3-kube-api-access-k5q7k" (OuterVolumeSpecName: "kube-api-access-k5q7k") pod "a8c27e63-ebf9-45ff-87b2-4782b20e19e3" (UID: "a8c27e63-ebf9-45ff-87b2-4782b20e19e3"). InnerVolumeSpecName "kube-api-access-k5q7k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.169345    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8c27e63-ebf9-45ff-87b2-4782b20e19e3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a8c27e63-ebf9-45ff-87b2-4782b20e19e3" (UID: "a8c27e63-ebf9-45ff-87b2-4782b20e19e3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.183546    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8c27e63-ebf9-45ff-87b2-4782b20e19e3-inventory" (OuterVolumeSpecName: "inventory") pod "a8c27e63-ebf9-45ff-87b2-4782b20e19e3" (UID: "a8c27e63-ebf9-45ff-87b2-4782b20e19e3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.235655    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5q7k\" (UniqueName: \"kubernetes.io/projected/a8c27e63-ebf9-45ff-87b2-4782b20e19e3-kube-api-access-k5q7k\") on node \"crc\" DevicePath \"\""
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.235709    4730 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8c27e63-ebf9-45ff-87b2-4782b20e19e3-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.235728    4730 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a8c27e63-ebf9-45ff-87b2-4782b20e19e3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.539351    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m89bj" event={"ID":"a8c27e63-ebf9-45ff-87b2-4782b20e19e3","Type":"ContainerDied","Data":"01d39be61543c09265f967d5b7ada582d06aa7ac17db1c635eb6ef1c127d2c54"}
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.539663    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01d39be61543c09265f967d5b7ada582d06aa7ac17db1c635eb6ef1c127d2c54"
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.539465    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m89bj"
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.639890    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-4mmgp"]
Mar 20 16:13:58 crc kubenswrapper[4730]: E0320 16:13:58.640372    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04ed0a2f-539e-4f9e-b106-6012931ca0f2" containerName="extract-utilities"
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.640391    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="04ed0a2f-539e-4f9e-b106-6012931ca0f2" containerName="extract-utilities"
Mar 20 16:13:58 crc kubenswrapper[4730]: E0320 16:13:58.640412    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b134de24-b8f7-4727-b32f-82c48b28787c" containerName="extract-content"
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.640421    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="b134de24-b8f7-4727-b32f-82c48b28787c" containerName="extract-content"
Mar 20 16:13:58 crc kubenswrapper[4730]: E0320 16:13:58.640437    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b134de24-b8f7-4727-b32f-82c48b28787c" containerName="extract-utilities"
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.640446    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="b134de24-b8f7-4727-b32f-82c48b28787c" containerName="extract-utilities"
Mar 20 16:13:58 crc kubenswrapper[4730]: E0320 16:13:58.640462    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a30ccad-6262-4840-bab0-7c70cce5c54e" containerName="extract-utilities"
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.640470    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a30ccad-6262-4840-bab0-7c70cce5c54e" containerName="extract-utilities"
Mar 20 16:13:58 crc kubenswrapper[4730]: E0320 16:13:58.640494    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8c27e63-ebf9-45ff-87b2-4782b20e19e3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.640510    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c27e63-ebf9-45ff-87b2-4782b20e19e3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:13:58 crc kubenswrapper[4730]: E0320 16:13:58.640530    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04ed0a2f-539e-4f9e-b106-6012931ca0f2" containerName="extract-content"
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.640539    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="04ed0a2f-539e-4f9e-b106-6012931ca0f2" containerName="extract-content"
Mar 20 16:13:58 crc kubenswrapper[4730]: E0320 16:13:58.640560    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a30ccad-6262-4840-bab0-7c70cce5c54e" containerName="extract-content"
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.640568    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a30ccad-6262-4840-bab0-7c70cce5c54e" containerName="extract-content"
Mar 20 16:13:58 crc kubenswrapper[4730]: E0320 16:13:58.640582    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b134de24-b8f7-4727-b32f-82c48b28787c" containerName="registry-server"
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.640590    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="b134de24-b8f7-4727-b32f-82c48b28787c" containerName="registry-server"
Mar 20 16:13:58 crc kubenswrapper[4730]: E0320 16:13:58.640606    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a30ccad-6262-4840-bab0-7c70cce5c54e" containerName="registry-server"
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.640614    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a30ccad-6262-4840-bab0-7c70cce5c54e" containerName="registry-server"
Mar 20 16:13:58 crc kubenswrapper[4730]: E0320 16:13:58.640631    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04ed0a2f-539e-4f9e-b106-6012931ca0f2" containerName="registry-server"
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.640640    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="04ed0a2f-539e-4f9e-b106-6012931ca0f2" containerName="registry-server"
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.640913    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="04ed0a2f-539e-4f9e-b106-6012931ca0f2" containerName="registry-server"
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.640940    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a30ccad-6262-4840-bab0-7c70cce5c54e" containerName="registry-server"
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.640953    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8c27e63-ebf9-45ff-87b2-4782b20e19e3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.640967    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="b134de24-b8f7-4727-b32f-82c48b28787c" containerName="registry-server"
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.641812    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4mmgp"
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.643781    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.644640    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.651565    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.651724    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vvsxx"
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.661191    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-4mmgp"]
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.745307    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg2zr\" (UniqueName: \"kubernetes.io/projected/eebb2eb5-4553-41b0-85e6-81e470576d50-kube-api-access-zg2zr\") pod \"ssh-known-hosts-edpm-deployment-4mmgp\" (UID: \"eebb2eb5-4553-41b0-85e6-81e470576d50\") " pod="openstack/ssh-known-hosts-edpm-deployment-4mmgp"
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.745492    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eebb2eb5-4553-41b0-85e6-81e470576d50-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-4mmgp\" (UID: \"eebb2eb5-4553-41b0-85e6-81e470576d50\") " pod="openstack/ssh-known-hosts-edpm-deployment-4mmgp"
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.745609    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/eebb2eb5-4553-41b0-85e6-81e470576d50-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-4mmgp\" (UID: \"eebb2eb5-4553-41b0-85e6-81e470576d50\") " pod="openstack/ssh-known-hosts-edpm-deployment-4mmgp"
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.847362    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eebb2eb5-4553-41b0-85e6-81e470576d50-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-4mmgp\" (UID: \"eebb2eb5-4553-41b0-85e6-81e470576d50\") " pod="openstack/ssh-known-hosts-edpm-deployment-4mmgp"
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.847422    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/eebb2eb5-4553-41b0-85e6-81e470576d50-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-4mmgp\" (UID: \"eebb2eb5-4553-41b0-85e6-81e470576d50\") " pod="openstack/ssh-known-hosts-edpm-deployment-4mmgp"
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.847647    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg2zr\" (UniqueName: \"kubernetes.io/projected/eebb2eb5-4553-41b0-85e6-81e470576d50-kube-api-access-zg2zr\") pod \"ssh-known-hosts-edpm-deployment-4mmgp\" (UID: \"eebb2eb5-4553-41b0-85e6-81e470576d50\") " pod="openstack/ssh-known-hosts-edpm-deployment-4mmgp"
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.853088    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/eebb2eb5-4553-41b0-85e6-81e470576d50-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-4mmgp\" (UID: \"eebb2eb5-4553-41b0-85e6-81e470576d50\") " pod="openstack/ssh-known-hosts-edpm-deployment-4mmgp"
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.853136    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eebb2eb5-4553-41b0-85e6-81e470576d50-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-4mmgp\" (UID: \"eebb2eb5-4553-41b0-85e6-81e470576d50\") " pod="openstack/ssh-known-hosts-edpm-deployment-4mmgp"
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.865609    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg2zr\" (UniqueName: \"kubernetes.io/projected/eebb2eb5-4553-41b0-85e6-81e470576d50-kube-api-access-zg2zr\") pod \"ssh-known-hosts-edpm-deployment-4mmgp\" (UID: \"eebb2eb5-4553-41b0-85e6-81e470576d50\") " pod="openstack/ssh-known-hosts-edpm-deployment-4mmgp"
Mar 20 16:13:58 crc kubenswrapper[4730]: I0320 16:13:58.966650    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4mmgp"
Mar 20 16:13:59 crc kubenswrapper[4730]: I0320 16:13:59.487431    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-4mmgp"]
Mar 20 16:13:59 crc kubenswrapper[4730]: I0320 16:13:59.549213    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4mmgp" event={"ID":"eebb2eb5-4553-41b0-85e6-81e470576d50","Type":"ContainerStarted","Data":"d15eba2a3cb4372e1d50223b264561216531c96c5a30cbb5b72ec2027f141f11"}
Mar 20 16:14:00 crc kubenswrapper[4730]: I0320 16:14:00.134781    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567054-glf2f"]
Mar 20 16:14:00 crc kubenswrapper[4730]: I0320 16:14:00.136547    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567054-glf2f"
Mar 20 16:14:00 crc kubenswrapper[4730]: I0320 16:14:00.138449    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:14:00 crc kubenswrapper[4730]: I0320 16:14:00.141982    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:14:00 crc kubenswrapper[4730]: I0320 16:14:00.144293    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 16:14:00 crc kubenswrapper[4730]: I0320 16:14:00.150787    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567054-glf2f"]
Mar 20 16:14:00 crc kubenswrapper[4730]: I0320 16:14:00.200025    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq6kq\" (UniqueName: \"kubernetes.io/projected/d62c2430-2f2b-49f0-848a-015a72d04090-kube-api-access-zq6kq\") pod \"auto-csr-approver-29567054-glf2f\" (UID: \"d62c2430-2f2b-49f0-848a-015a72d04090\") " pod="openshift-infra/auto-csr-approver-29567054-glf2f"
Mar 20 16:14:00 crc kubenswrapper[4730]: I0320 16:14:00.302286    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq6kq\" (UniqueName: \"kubernetes.io/projected/d62c2430-2f2b-49f0-848a-015a72d04090-kube-api-access-zq6kq\") pod \"auto-csr-approver-29567054-glf2f\" (UID: \"d62c2430-2f2b-49f0-848a-015a72d04090\") " pod="openshift-infra/auto-csr-approver-29567054-glf2f"
Mar 20 16:14:00 crc kubenswrapper[4730]: I0320 16:14:00.323673    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq6kq\" (UniqueName: \"kubernetes.io/projected/d62c2430-2f2b-49f0-848a-015a72d04090-kube-api-access-zq6kq\") pod \"auto-csr-approver-29567054-glf2f\" (UID: \"d62c2430-2f2b-49f0-848a-015a72d04090\") " pod="openshift-infra/auto-csr-approver-29567054-glf2f"
Mar 20 16:14:00 crc kubenswrapper[4730]: I0320 16:14:00.513940    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567054-glf2f"
Mar 20 16:14:00 crc kubenswrapper[4730]: I0320 16:14:00.941738    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567054-glf2f"]
Mar 20 16:14:00 crc kubenswrapper[4730]: W0320 16:14:00.958166    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd62c2430_2f2b_49f0_848a_015a72d04090.slice/crio-d0a691de51bc653167d6747ca95afa97af85f0a35af0278e824af53532069c23 WatchSource:0}: Error finding container d0a691de51bc653167d6747ca95afa97af85f0a35af0278e824af53532069c23: Status 404 returned error can't find the container with id d0a691de51bc653167d6747ca95afa97af85f0a35af0278e824af53532069c23
Mar 20 16:14:01 crc kubenswrapper[4730]: I0320 16:14:01.570125    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4mmgp" event={"ID":"eebb2eb5-4553-41b0-85e6-81e470576d50","Type":"ContainerStarted","Data":"cf9fbc8e58f88b901eea9f003d08be59e29a519886cb230e8c1e76d47a42f08e"}
Mar 20 16:14:01 crc kubenswrapper[4730]: I0320 16:14:01.571288    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567054-glf2f" event={"ID":"d62c2430-2f2b-49f0-848a-015a72d04090","Type":"ContainerStarted","Data":"d0a691de51bc653167d6747ca95afa97af85f0a35af0278e824af53532069c23"}
Mar 20 16:14:01 crc kubenswrapper[4730]: I0320 16:14:01.600406    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-4mmgp" podStartSLOduration=2.38354246 podStartE2EDuration="3.60038819s" podCreationTimestamp="2026-03-20 16:13:58 +0000 UTC" firstStartedPulling="2026-03-20 16:13:59.483803064 +0000 UTC m=+2098.697174433" lastFinishedPulling="2026-03-20 16:14:00.700648794 +0000 UTC m=+2099.914020163" observedRunningTime="2026-03-20 16:14:01.592609564 +0000 UTC m=+2100.805980943" watchObservedRunningTime="2026-03-20 16:14:01.60038819 +0000 UTC m=+2100.813759559"
Mar 20 16:14:03 crc kubenswrapper[4730]: I0320 16:14:03.600336    4730 generic.go:334] "Generic (PLEG): container finished" podID="d62c2430-2f2b-49f0-848a-015a72d04090" containerID="88155d5b3d3f84b9a79ccb85b9d478d5415c51d7944f4bb78a548434ba4fb653" exitCode=0
Mar 20 16:14:03 crc kubenswrapper[4730]: I0320 16:14:03.600440    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567054-glf2f" event={"ID":"d62c2430-2f2b-49f0-848a-015a72d04090","Type":"ContainerDied","Data":"88155d5b3d3f84b9a79ccb85b9d478d5415c51d7944f4bb78a548434ba4fb653"}
Mar 20 16:14:05 crc kubenswrapper[4730]: I0320 16:14:05.007569    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567054-glf2f"
Mar 20 16:14:05 crc kubenswrapper[4730]: I0320 16:14:05.098812    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq6kq\" (UniqueName: \"kubernetes.io/projected/d62c2430-2f2b-49f0-848a-015a72d04090-kube-api-access-zq6kq\") pod \"d62c2430-2f2b-49f0-848a-015a72d04090\" (UID: \"d62c2430-2f2b-49f0-848a-015a72d04090\") "
Mar 20 16:14:05 crc kubenswrapper[4730]: I0320 16:14:05.104069    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d62c2430-2f2b-49f0-848a-015a72d04090-kube-api-access-zq6kq" (OuterVolumeSpecName: "kube-api-access-zq6kq") pod "d62c2430-2f2b-49f0-848a-015a72d04090" (UID: "d62c2430-2f2b-49f0-848a-015a72d04090"). InnerVolumeSpecName "kube-api-access-zq6kq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:14:05 crc kubenswrapper[4730]: I0320 16:14:05.202024    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq6kq\" (UniqueName: \"kubernetes.io/projected/d62c2430-2f2b-49f0-848a-015a72d04090-kube-api-access-zq6kq\") on node \"crc\" DevicePath \"\""
Mar 20 16:14:05 crc kubenswrapper[4730]: I0320 16:14:05.624055    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567054-glf2f" event={"ID":"d62c2430-2f2b-49f0-848a-015a72d04090","Type":"ContainerDied","Data":"d0a691de51bc653167d6747ca95afa97af85f0a35af0278e824af53532069c23"}
Mar 20 16:14:05 crc kubenswrapper[4730]: I0320 16:14:05.624109    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0a691de51bc653167d6747ca95afa97af85f0a35af0278e824af53532069c23"
Mar 20 16:14:05 crc kubenswrapper[4730]: I0320 16:14:05.624109    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567054-glf2f"
Mar 20 16:14:06 crc kubenswrapper[4730]: I0320 16:14:06.083759    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567048-nhd7d"]
Mar 20 16:14:06 crc kubenswrapper[4730]: I0320 16:14:06.092090    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567048-nhd7d"]
Mar 20 16:14:07 crc kubenswrapper[4730]: I0320 16:14:07.546508    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28bea13e-dd2a-4ecf-9182-cc639a47c75f" path="/var/lib/kubelet/pods/28bea13e-dd2a-4ecf-9182-cc639a47c75f/volumes"
Mar 20 16:14:07 crc kubenswrapper[4730]: I0320 16:14:07.650034    4730 generic.go:334] "Generic (PLEG): container finished" podID="eebb2eb5-4553-41b0-85e6-81e470576d50" containerID="cf9fbc8e58f88b901eea9f003d08be59e29a519886cb230e8c1e76d47a42f08e" exitCode=0
Mar 20 16:14:07 crc kubenswrapper[4730]: I0320 16:14:07.650094    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4mmgp" event={"ID":"eebb2eb5-4553-41b0-85e6-81e470576d50","Type":"ContainerDied","Data":"cf9fbc8e58f88b901eea9f003d08be59e29a519886cb230e8c1e76d47a42f08e"}
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.050669    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4mmgp"
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.184423    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zg2zr\" (UniqueName: \"kubernetes.io/projected/eebb2eb5-4553-41b0-85e6-81e470576d50-kube-api-access-zg2zr\") pod \"eebb2eb5-4553-41b0-85e6-81e470576d50\" (UID: \"eebb2eb5-4553-41b0-85e6-81e470576d50\") "
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.184962    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/eebb2eb5-4553-41b0-85e6-81e470576d50-inventory-0\") pod \"eebb2eb5-4553-41b0-85e6-81e470576d50\" (UID: \"eebb2eb5-4553-41b0-85e6-81e470576d50\") "
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.185040    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eebb2eb5-4553-41b0-85e6-81e470576d50-ssh-key-openstack-edpm-ipam\") pod \"eebb2eb5-4553-41b0-85e6-81e470576d50\" (UID: \"eebb2eb5-4553-41b0-85e6-81e470576d50\") "
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.191906    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eebb2eb5-4553-41b0-85e6-81e470576d50-kube-api-access-zg2zr" (OuterVolumeSpecName: "kube-api-access-zg2zr") pod "eebb2eb5-4553-41b0-85e6-81e470576d50" (UID: "eebb2eb5-4553-41b0-85e6-81e470576d50"). InnerVolumeSpecName "kube-api-access-zg2zr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.211760    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eebb2eb5-4553-41b0-85e6-81e470576d50-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "eebb2eb5-4553-41b0-85e6-81e470576d50" (UID: "eebb2eb5-4553-41b0-85e6-81e470576d50"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.213018    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eebb2eb5-4553-41b0-85e6-81e470576d50-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "eebb2eb5-4553-41b0-85e6-81e470576d50" (UID: "eebb2eb5-4553-41b0-85e6-81e470576d50"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.286652    4730 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eebb2eb5-4553-41b0-85e6-81e470576d50-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.286810    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zg2zr\" (UniqueName: \"kubernetes.io/projected/eebb2eb5-4553-41b0-85e6-81e470576d50-kube-api-access-zg2zr\") on node \"crc\" DevicePath \"\""
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.286888    4730 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/eebb2eb5-4553-41b0-85e6-81e470576d50-inventory-0\") on node \"crc\" DevicePath \"\""
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.667611    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4mmgp" event={"ID":"eebb2eb5-4553-41b0-85e6-81e470576d50","Type":"ContainerDied","Data":"d15eba2a3cb4372e1d50223b264561216531c96c5a30cbb5b72ec2027f141f11"}
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.667651    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d15eba2a3cb4372e1d50223b264561216531c96c5a30cbb5b72ec2027f141f11"
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.667731    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4mmgp"
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.756179    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-ckzj9"]
Mar 20 16:14:09 crc kubenswrapper[4730]: E0320 16:14:09.756861    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eebb2eb5-4553-41b0-85e6-81e470576d50" containerName="ssh-known-hosts-edpm-deployment"
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.756897    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="eebb2eb5-4553-41b0-85e6-81e470576d50" containerName="ssh-known-hosts-edpm-deployment"
Mar 20 16:14:09 crc kubenswrapper[4730]: E0320 16:14:09.756913    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d62c2430-2f2b-49f0-848a-015a72d04090" containerName="oc"
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.756922    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="d62c2430-2f2b-49f0-848a-015a72d04090" containerName="oc"
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.757157    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="d62c2430-2f2b-49f0-848a-015a72d04090" containerName="oc"
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.757187    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="eebb2eb5-4553-41b0-85e6-81e470576d50" containerName="ssh-known-hosts-edpm-deployment"
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.758141    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ckzj9"
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.760362    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.760641    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.761141    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vvsxx"
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.761196    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.768227    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-ckzj9"]
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.805115    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64xpg\" (UniqueName: \"kubernetes.io/projected/0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c-kube-api-access-64xpg\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ckzj9\" (UID: \"0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ckzj9"
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.805274    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ckzj9\" (UID: \"0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ckzj9"
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.805322    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ckzj9\" (UID: \"0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ckzj9"
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.907224    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ckzj9\" (UID: \"0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ckzj9"
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.907322    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ckzj9\" (UID: \"0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ckzj9"
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.907411    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64xpg\" (UniqueName: \"kubernetes.io/projected/0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c-kube-api-access-64xpg\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ckzj9\" (UID: \"0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ckzj9"
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.912229    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ckzj9\" (UID: \"0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ckzj9"
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.914355    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ckzj9\" (UID: \"0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ckzj9"
Mar 20 16:14:09 crc kubenswrapper[4730]: I0320 16:14:09.936716    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64xpg\" (UniqueName: \"kubernetes.io/projected/0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c-kube-api-access-64xpg\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ckzj9\" (UID: \"0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ckzj9"
Mar 20 16:14:10 crc kubenswrapper[4730]: I0320 16:14:10.086181    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ckzj9"
Mar 20 16:14:10 crc kubenswrapper[4730]: I0320 16:14:10.626660    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-ckzj9"]
Mar 20 16:14:10 crc kubenswrapper[4730]: I0320 16:14:10.676726    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ckzj9" event={"ID":"0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c","Type":"ContainerStarted","Data":"fb1b51d3ab50918406f030cebcd373b0e4840f9c2af0258d10e23a73e1e02bde"}
Mar 20 16:14:11 crc kubenswrapper[4730]: I0320 16:14:11.693696    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ckzj9" event={"ID":"0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c","Type":"ContainerStarted","Data":"d3df01e18681838ce7ec48f5f7fe992384e53dd9a8111ca0991db497dc1cb894"}
Mar 20 16:14:11 crc kubenswrapper[4730]: I0320 16:14:11.720331    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ckzj9" podStartSLOduration=2.302805272 podStartE2EDuration="2.720315076s" podCreationTimestamp="2026-03-20 16:14:09 +0000 UTC" firstStartedPulling="2026-03-20 16:14:10.635579258 +0000 UTC m=+2109.848950627" lastFinishedPulling="2026-03-20 16:14:11.053089042 +0000 UTC m=+2110.266460431" observedRunningTime="2026-03-20 16:14:11.715187594 +0000 UTC m=+2110.928558983" watchObservedRunningTime="2026-03-20 16:14:11.720315076 +0000 UTC m=+2110.933686445"
Mar 20 16:14:18 crc kubenswrapper[4730]: I0320 16:14:18.767828    4730 generic.go:334] "Generic (PLEG): container finished" podID="0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c" containerID="d3df01e18681838ce7ec48f5f7fe992384e53dd9a8111ca0991db497dc1cb894" exitCode=0
Mar 20 16:14:18 crc kubenswrapper[4730]: I0320 16:14:18.767997    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ckzj9" event={"ID":"0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c","Type":"ContainerDied","Data":"d3df01e18681838ce7ec48f5f7fe992384e53dd9a8111ca0991db497dc1cb894"}
Mar 20 16:14:20 crc kubenswrapper[4730]: I0320 16:14:20.171923    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ckzj9"
Mar 20 16:14:20 crc kubenswrapper[4730]: I0320 16:14:20.241005    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c-inventory\") pod \"0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c\" (UID: \"0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c\") "
Mar 20 16:14:20 crc kubenswrapper[4730]: I0320 16:14:20.241114    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64xpg\" (UniqueName: \"kubernetes.io/projected/0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c-kube-api-access-64xpg\") pod \"0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c\" (UID: \"0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c\") "
Mar 20 16:14:20 crc kubenswrapper[4730]: I0320 16:14:20.241217    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c-ssh-key-openstack-edpm-ipam\") pod \"0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c\" (UID: \"0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c\") "
Mar 20 16:14:20 crc kubenswrapper[4730]: I0320 16:14:20.247493    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c-kube-api-access-64xpg" (OuterVolumeSpecName: "kube-api-access-64xpg") pod "0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c" (UID: "0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c"). InnerVolumeSpecName "kube-api-access-64xpg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:14:20 crc kubenswrapper[4730]: I0320 16:14:20.268808    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c" (UID: "0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:14:20 crc kubenswrapper[4730]: I0320 16:14:20.270922    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c-inventory" (OuterVolumeSpecName: "inventory") pod "0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c" (UID: "0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:14:20 crc kubenswrapper[4730]: I0320 16:14:20.343619    4730 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 16:14:20 crc kubenswrapper[4730]: I0320 16:14:20.343671    4730 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 16:14:20 crc kubenswrapper[4730]: I0320 16:14:20.343700    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64xpg\" (UniqueName: \"kubernetes.io/projected/0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c-kube-api-access-64xpg\") on node \"crc\" DevicePath \"\""
Mar 20 16:14:20 crc kubenswrapper[4730]: I0320 16:14:20.787390    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ckzj9" event={"ID":"0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c","Type":"ContainerDied","Data":"fb1b51d3ab50918406f030cebcd373b0e4840f9c2af0258d10e23a73e1e02bde"}
Mar 20 16:14:20 crc kubenswrapper[4730]: I0320 16:14:20.787441    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb1b51d3ab50918406f030cebcd373b0e4840f9c2af0258d10e23a73e1e02bde"
Mar 20 16:14:20 crc kubenswrapper[4730]: I0320 16:14:20.787459    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ckzj9"
Mar 20 16:14:20 crc kubenswrapper[4730]: I0320 16:14:20.887867    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fcg5s"]
Mar 20 16:14:20 crc kubenswrapper[4730]: E0320 16:14:20.889597    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:14:20 crc kubenswrapper[4730]: I0320 16:14:20.889634    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:14:20 crc kubenswrapper[4730]: I0320 16:14:20.890377    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:14:20 crc kubenswrapper[4730]: I0320 16:14:20.891487    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fcg5s"
Mar 20 16:14:20 crc kubenswrapper[4730]: I0320 16:14:20.905590    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fcg5s"]
Mar 20 16:14:20 crc kubenswrapper[4730]: I0320 16:14:20.930782    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vvsxx"
Mar 20 16:14:20 crc kubenswrapper[4730]: I0320 16:14:20.930839    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 16:14:20 crc kubenswrapper[4730]: I0320 16:14:20.931443    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 16:14:20 crc kubenswrapper[4730]: I0320 16:14:20.931698    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 16:14:21 crc kubenswrapper[4730]: I0320 16:14:21.057773    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b49a7544-a685-49c3-81fa-e1bbec4453ba-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fcg5s\" (UID: \"b49a7544-a685-49c3-81fa-e1bbec4453ba\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fcg5s"
Mar 20 16:14:21 crc kubenswrapper[4730]: I0320 16:14:21.058774    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs9fl\" (UniqueName: \"kubernetes.io/projected/b49a7544-a685-49c3-81fa-e1bbec4453ba-kube-api-access-hs9fl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fcg5s\" (UID: \"b49a7544-a685-49c3-81fa-e1bbec4453ba\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fcg5s"
Mar 20 16:14:21 crc kubenswrapper[4730]: I0320 16:14:21.058925    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b49a7544-a685-49c3-81fa-e1bbec4453ba-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fcg5s\" (UID: \"b49a7544-a685-49c3-81fa-e1bbec4453ba\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fcg5s"
Mar 20 16:14:21 crc kubenswrapper[4730]: I0320 16:14:21.160954    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b49a7544-a685-49c3-81fa-e1bbec4453ba-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fcg5s\" (UID: \"b49a7544-a685-49c3-81fa-e1bbec4453ba\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fcg5s"
Mar 20 16:14:21 crc kubenswrapper[4730]: I0320 16:14:21.161126    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs9fl\" (UniqueName: \"kubernetes.io/projected/b49a7544-a685-49c3-81fa-e1bbec4453ba-kube-api-access-hs9fl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fcg5s\" (UID: \"b49a7544-a685-49c3-81fa-e1bbec4453ba\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fcg5s"
Mar 20 16:14:21 crc kubenswrapper[4730]: I0320 16:14:21.161178    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b49a7544-a685-49c3-81fa-e1bbec4453ba-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fcg5s\" (UID: \"b49a7544-a685-49c3-81fa-e1bbec4453ba\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fcg5s"
Mar 20 16:14:21 crc kubenswrapper[4730]: I0320 16:14:21.166311    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b49a7544-a685-49c3-81fa-e1bbec4453ba-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fcg5s\" (UID: \"b49a7544-a685-49c3-81fa-e1bbec4453ba\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fcg5s"
Mar 20 16:14:21 crc kubenswrapper[4730]: I0320 16:14:21.168507    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b49a7544-a685-49c3-81fa-e1bbec4453ba-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fcg5s\" (UID: \"b49a7544-a685-49c3-81fa-e1bbec4453ba\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fcg5s"
Mar 20 16:14:21 crc kubenswrapper[4730]: I0320 16:14:21.188095    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs9fl\" (UniqueName: \"kubernetes.io/projected/b49a7544-a685-49c3-81fa-e1bbec4453ba-kube-api-access-hs9fl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fcg5s\" (UID: \"b49a7544-a685-49c3-81fa-e1bbec4453ba\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fcg5s"
Mar 20 16:14:21 crc kubenswrapper[4730]: I0320 16:14:21.250714    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fcg5s"
Mar 20 16:14:21 crc kubenswrapper[4730]: I0320 16:14:21.758429    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fcg5s"]
Mar 20 16:14:21 crc kubenswrapper[4730]: I0320 16:14:21.796879    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fcg5s" event={"ID":"b49a7544-a685-49c3-81fa-e1bbec4453ba","Type":"ContainerStarted","Data":"63fc1fd4c0c125b6658b4511d055c2663314adc2c857f6656144f940ae01a113"}
Mar 20 16:14:22 crc kubenswrapper[4730]: I0320 16:14:22.807067    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fcg5s" event={"ID":"b49a7544-a685-49c3-81fa-e1bbec4453ba","Type":"ContainerStarted","Data":"b263b8f0c1f575dac6afecb99dcc89954593438c3ff4e5d432e149952ec178a8"}
Mar 20 16:14:22 crc kubenswrapper[4730]: I0320 16:14:22.824277    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fcg5s" podStartSLOduration=2.373506203 podStartE2EDuration="2.82423125s" podCreationTimestamp="2026-03-20 16:14:20 +0000 UTC" firstStartedPulling="2026-03-20 16:14:21.765067352 +0000 UTC m=+2120.978438711" lastFinishedPulling="2026-03-20 16:14:22.215792389 +0000 UTC m=+2121.429163758" observedRunningTime="2026-03-20 16:14:22.821346819 +0000 UTC m=+2122.034718188" watchObservedRunningTime="2026-03-20 16:14:22.82423125 +0000 UTC m=+2122.037602619"
Mar 20 16:14:31 crc kubenswrapper[4730]: I0320 16:14:31.917813    4730 generic.go:334] "Generic (PLEG): container finished" podID="b49a7544-a685-49c3-81fa-e1bbec4453ba" containerID="b263b8f0c1f575dac6afecb99dcc89954593438c3ff4e5d432e149952ec178a8" exitCode=0
Mar 20 16:14:31 crc kubenswrapper[4730]: I0320 16:14:31.917882    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fcg5s" event={"ID":"b49a7544-a685-49c3-81fa-e1bbec4453ba","Type":"ContainerDied","Data":"b263b8f0c1f575dac6afecb99dcc89954593438c3ff4e5d432e149952ec178a8"}
Mar 20 16:14:33 crc kubenswrapper[4730]: I0320 16:14:33.350913    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fcg5s"
Mar 20 16:14:33 crc kubenswrapper[4730]: I0320 16:14:33.374933    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b49a7544-a685-49c3-81fa-e1bbec4453ba-ssh-key-openstack-edpm-ipam\") pod \"b49a7544-a685-49c3-81fa-e1bbec4453ba\" (UID: \"b49a7544-a685-49c3-81fa-e1bbec4453ba\") "
Mar 20 16:14:33 crc kubenswrapper[4730]: I0320 16:14:33.375050    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hs9fl\" (UniqueName: \"kubernetes.io/projected/b49a7544-a685-49c3-81fa-e1bbec4453ba-kube-api-access-hs9fl\") pod \"b49a7544-a685-49c3-81fa-e1bbec4453ba\" (UID: \"b49a7544-a685-49c3-81fa-e1bbec4453ba\") "
Mar 20 16:14:33 crc kubenswrapper[4730]: I0320 16:14:33.375087    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b49a7544-a685-49c3-81fa-e1bbec4453ba-inventory\") pod \"b49a7544-a685-49c3-81fa-e1bbec4453ba\" (UID: \"b49a7544-a685-49c3-81fa-e1bbec4453ba\") "
Mar 20 16:14:33 crc kubenswrapper[4730]: I0320 16:14:33.420897    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b49a7544-a685-49c3-81fa-e1bbec4453ba-kube-api-access-hs9fl" (OuterVolumeSpecName: "kube-api-access-hs9fl") pod "b49a7544-a685-49c3-81fa-e1bbec4453ba" (UID: "b49a7544-a685-49c3-81fa-e1bbec4453ba"). InnerVolumeSpecName "kube-api-access-hs9fl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:14:33 crc kubenswrapper[4730]: E0320 16:14:33.420983    4730 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b49a7544-a685-49c3-81fa-e1bbec4453ba-ssh-key-openstack-edpm-ipam podName:b49a7544-a685-49c3-81fa-e1bbec4453ba nodeName:}" failed. No retries permitted until 2026-03-20 16:14:33.920950456 +0000 UTC m=+2133.134321835 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ssh-key-openstack-edpm-ipam" (UniqueName: "kubernetes.io/secret/b49a7544-a685-49c3-81fa-e1bbec4453ba-ssh-key-openstack-edpm-ipam") pod "b49a7544-a685-49c3-81fa-e1bbec4453ba" (UID: "b49a7544-a685-49c3-81fa-e1bbec4453ba") : error deleting /var/lib/kubelet/pods/b49a7544-a685-49c3-81fa-e1bbec4453ba/volume-subpaths: remove /var/lib/kubelet/pods/b49a7544-a685-49c3-81fa-e1bbec4453ba/volume-subpaths: no such file or directory
Mar 20 16:14:33 crc kubenswrapper[4730]: I0320 16:14:33.423738    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b49a7544-a685-49c3-81fa-e1bbec4453ba-inventory" (OuterVolumeSpecName: "inventory") pod "b49a7544-a685-49c3-81fa-e1bbec4453ba" (UID: "b49a7544-a685-49c3-81fa-e1bbec4453ba"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:14:33 crc kubenswrapper[4730]: I0320 16:14:33.477929    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hs9fl\" (UniqueName: \"kubernetes.io/projected/b49a7544-a685-49c3-81fa-e1bbec4453ba-kube-api-access-hs9fl\") on node \"crc\" DevicePath \"\""
Mar 20 16:14:33 crc kubenswrapper[4730]: I0320 16:14:33.477970    4730 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b49a7544-a685-49c3-81fa-e1bbec4453ba-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 16:14:33 crc kubenswrapper[4730]: I0320 16:14:33.939964    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fcg5s" event={"ID":"b49a7544-a685-49c3-81fa-e1bbec4453ba","Type":"ContainerDied","Data":"63fc1fd4c0c125b6658b4511d055c2663314adc2c857f6656144f940ae01a113"}
Mar 20 16:14:33 crc kubenswrapper[4730]: I0320 16:14:33.940482    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63fc1fd4c0c125b6658b4511d055c2663314adc2c857f6656144f940ae01a113"
Mar 20 16:14:33 crc kubenswrapper[4730]: I0320 16:14:33.940016    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fcg5s"
Mar 20 16:14:33 crc kubenswrapper[4730]: I0320 16:14:33.988922    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b49a7544-a685-49c3-81fa-e1bbec4453ba-ssh-key-openstack-edpm-ipam\") pod \"b49a7544-a685-49c3-81fa-e1bbec4453ba\" (UID: \"b49a7544-a685-49c3-81fa-e1bbec4453ba\") "
Mar 20 16:14:33 crc kubenswrapper[4730]: I0320 16:14:33.994821    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b49a7544-a685-49c3-81fa-e1bbec4453ba-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b49a7544-a685-49c3-81fa-e1bbec4453ba" (UID: "b49a7544-a685-49c3-81fa-e1bbec4453ba"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.054820    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"]
Mar 20 16:14:34 crc kubenswrapper[4730]: E0320 16:14:34.056025    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b49a7544-a685-49c3-81fa-e1bbec4453ba" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.056048    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="b49a7544-a685-49c3-81fa-e1bbec4453ba" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.056577    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="b49a7544-a685-49c3-81fa-e1bbec4453ba" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.057899    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.067551    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.067867    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.068987    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.069163    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.087585    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"]
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.099540    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.099592    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.099624    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.099663    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.099698    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/423144fa-9b01-4466-993c-6ab7075e1ad5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.099717    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.099738    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/423144fa-9b01-4466-993c-6ab7075e1ad5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.099757    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.099782    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.099810    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/423144fa-9b01-4466-993c-6ab7075e1ad5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.099833    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/423144fa-9b01-4466-993c-6ab7075e1ad5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.099870    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.099910    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhrdz\" (UniqueName: \"kubernetes.io/projected/423144fa-9b01-4466-993c-6ab7075e1ad5-kube-api-access-dhrdz\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.099928    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.100000    4730 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b49a7544-a685-49c3-81fa-e1bbec4453ba-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.201808    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.201889    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhrdz\" (UniqueName: \"kubernetes.io/projected/423144fa-9b01-4466-993c-6ab7075e1ad5-kube-api-access-dhrdz\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.201911    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.201968    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.201995    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.202019    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.202053    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.202089    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/423144fa-9b01-4466-993c-6ab7075e1ad5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.202111    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.202169    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/423144fa-9b01-4466-993c-6ab7075e1ad5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.202188    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.202219    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.202261    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/423144fa-9b01-4466-993c-6ab7075e1ad5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.202285    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/423144fa-9b01-4466-993c-6ab7075e1ad5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.205903    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.206623    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.207231    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.207474    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/423144fa-9b01-4466-993c-6ab7075e1ad5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.207840    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.208256    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.208330    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.208821    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.209175    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/423144fa-9b01-4466-993c-6ab7075e1ad5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.209823    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/423144fa-9b01-4466-993c-6ab7075e1ad5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.211545    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.212581    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/423144fa-9b01-4466-993c-6ab7075e1ad5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.213545    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.226564    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhrdz\" (UniqueName: \"kubernetes.io/projected/423144fa-9b01-4466-993c-6ab7075e1ad5-kube-api-access-dhrdz\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-4d78n\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.411608    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:14:34 crc kubenswrapper[4730]: I0320 16:14:34.987338    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"]
Mar 20 16:14:34 crc kubenswrapper[4730]: W0320 16:14:34.988772    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod423144fa_9b01_4466_993c_6ab7075e1ad5.slice/crio-21c9b74b7d25aff2921555a89646dca1c3794f9a5ed3d4441b6c26e7fd3963a0 WatchSource:0}: Error finding container 21c9b74b7d25aff2921555a89646dca1c3794f9a5ed3d4441b6c26e7fd3963a0: Status 404 returned error can't find the container with id 21c9b74b7d25aff2921555a89646dca1c3794f9a5ed3d4441b6c26e7fd3963a0
Mar 20 16:14:35 crc kubenswrapper[4730]: I0320 16:14:35.961280    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n" event={"ID":"423144fa-9b01-4466-993c-6ab7075e1ad5","Type":"ContainerStarted","Data":"01fd28163e9e73afde319534feb3ad11d63319d2239c28468dad672e5ab2ebe3"}
Mar 20 16:14:35 crc kubenswrapper[4730]: I0320 16:14:35.961887    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n" event={"ID":"423144fa-9b01-4466-993c-6ab7075e1ad5","Type":"ContainerStarted","Data":"21c9b74b7d25aff2921555a89646dca1c3794f9a5ed3d4441b6c26e7fd3963a0"}
Mar 20 16:14:35 crc kubenswrapper[4730]: I0320 16:14:35.996427    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n" podStartSLOduration=1.499690779 podStartE2EDuration="1.996404795s" podCreationTimestamp="2026-03-20 16:14:34 +0000 UTC" firstStartedPulling="2026-03-20 16:14:34.991736842 +0000 UTC m=+2134.205108211" lastFinishedPulling="2026-03-20 16:14:35.488450838 +0000 UTC m=+2134.701822227" observedRunningTime="2026-03-20 16:14:35.992462236 +0000 UTC m=+2135.205833615" watchObservedRunningTime="2026-03-20 16:14:35.996404795 +0000 UTC m=+2135.209776174"
Mar 20 16:14:49 crc kubenswrapper[4730]: I0320 16:14:49.055795    4730 scope.go:117] "RemoveContainer" containerID="3781eddb5c4e3f16097f248108ceebb43195728c87b3ad6512e75bc75dffb2bb"
Mar 20 16:14:49 crc kubenswrapper[4730]: I0320 16:14:49.422684    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nlxpq"]
Mar 20 16:14:49 crc kubenswrapper[4730]: I0320 16:14:49.428429    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nlxpq"
Mar 20 16:14:49 crc kubenswrapper[4730]: I0320 16:14:49.434944    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nlxpq"]
Mar 20 16:14:49 crc kubenswrapper[4730]: I0320 16:14:49.538813    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3-utilities\") pod \"redhat-operators-nlxpq\" (UID: \"f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3\") " pod="openshift-marketplace/redhat-operators-nlxpq"
Mar 20 16:14:49 crc kubenswrapper[4730]: I0320 16:14:49.539141    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97jfs\" (UniqueName: \"kubernetes.io/projected/f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3-kube-api-access-97jfs\") pod \"redhat-operators-nlxpq\" (UID: \"f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3\") " pod="openshift-marketplace/redhat-operators-nlxpq"
Mar 20 16:14:49 crc kubenswrapper[4730]: I0320 16:14:49.539504    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3-catalog-content\") pod \"redhat-operators-nlxpq\" (UID: \"f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3\") " pod="openshift-marketplace/redhat-operators-nlxpq"
Mar 20 16:14:49 crc kubenswrapper[4730]: I0320 16:14:49.641377    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3-catalog-content\") pod \"redhat-operators-nlxpq\" (UID: \"f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3\") " pod="openshift-marketplace/redhat-operators-nlxpq"
Mar 20 16:14:49 crc kubenswrapper[4730]: I0320 16:14:49.641501    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3-utilities\") pod \"redhat-operators-nlxpq\" (UID: \"f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3\") " pod="openshift-marketplace/redhat-operators-nlxpq"
Mar 20 16:14:49 crc kubenswrapper[4730]: I0320 16:14:49.641543    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97jfs\" (UniqueName: \"kubernetes.io/projected/f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3-kube-api-access-97jfs\") pod \"redhat-operators-nlxpq\" (UID: \"f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3\") " pod="openshift-marketplace/redhat-operators-nlxpq"
Mar 20 16:14:49 crc kubenswrapper[4730]: I0320 16:14:49.642005    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3-catalog-content\") pod \"redhat-operators-nlxpq\" (UID: \"f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3\") " pod="openshift-marketplace/redhat-operators-nlxpq"
Mar 20 16:14:49 crc kubenswrapper[4730]: I0320 16:14:49.642050    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3-utilities\") pod \"redhat-operators-nlxpq\" (UID: \"f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3\") " pod="openshift-marketplace/redhat-operators-nlxpq"
Mar 20 16:14:49 crc kubenswrapper[4730]: I0320 16:14:49.663738    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97jfs\" (UniqueName: \"kubernetes.io/projected/f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3-kube-api-access-97jfs\") pod \"redhat-operators-nlxpq\" (UID: \"f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3\") " pod="openshift-marketplace/redhat-operators-nlxpq"
Mar 20 16:14:49 crc kubenswrapper[4730]: I0320 16:14:49.762634    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nlxpq"
Mar 20 16:14:50 crc kubenswrapper[4730]: I0320 16:14:50.252468    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nlxpq"]
Mar 20 16:14:50 crc kubenswrapper[4730]: W0320 16:14:50.257737    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7f46bf2_4fe4_4576_a9f1_eeec5b2bf8c3.slice/crio-2fba6e13ca4ae6246c5ca75c7721bb78ad010f572aced34848817074b527537d WatchSource:0}: Error finding container 2fba6e13ca4ae6246c5ca75c7721bb78ad010f572aced34848817074b527537d: Status 404 returned error can't find the container with id 2fba6e13ca4ae6246c5ca75c7721bb78ad010f572aced34848817074b527537d
Mar 20 16:14:51 crc kubenswrapper[4730]: I0320 16:14:51.106295    4730 generic.go:334] "Generic (PLEG): container finished" podID="f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3" containerID="9ca380081388b6f9c9877cc291353929dee2da6133f14e020a397e8856a314bb" exitCode=0
Mar 20 16:14:51 crc kubenswrapper[4730]: I0320 16:14:51.106398    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nlxpq" event={"ID":"f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3","Type":"ContainerDied","Data":"9ca380081388b6f9c9877cc291353929dee2da6133f14e020a397e8856a314bb"}
Mar 20 16:14:51 crc kubenswrapper[4730]: I0320 16:14:51.106632    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nlxpq" event={"ID":"f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3","Type":"ContainerStarted","Data":"2fba6e13ca4ae6246c5ca75c7721bb78ad010f572aced34848817074b527537d"}
Mar 20 16:14:52 crc kubenswrapper[4730]: I0320 16:14:52.115375    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nlxpq" event={"ID":"f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3","Type":"ContainerStarted","Data":"66dbf75889a1e65837bed6622eafa5ba26cfd66025601ae0dc945c0e8dce250e"}
Mar 20 16:14:54 crc kubenswrapper[4730]: I0320 16:14:54.137405    4730 generic.go:334] "Generic (PLEG): container finished" podID="f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3" containerID="66dbf75889a1e65837bed6622eafa5ba26cfd66025601ae0dc945c0e8dce250e" exitCode=0
Mar 20 16:14:54 crc kubenswrapper[4730]: I0320 16:14:54.137516    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nlxpq" event={"ID":"f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3","Type":"ContainerDied","Data":"66dbf75889a1e65837bed6622eafa5ba26cfd66025601ae0dc945c0e8dce250e"}
Mar 20 16:14:55 crc kubenswrapper[4730]: I0320 16:14:55.153654    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nlxpq" event={"ID":"f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3","Type":"ContainerStarted","Data":"b04b9a549015f8905f380aaf8421cc03f90628d6c219b0109a0e893083103713"}
Mar 20 16:14:55 crc kubenswrapper[4730]: I0320 16:14:55.185290    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nlxpq" podStartSLOduration=2.8084730540000002 podStartE2EDuration="6.185226137s" podCreationTimestamp="2026-03-20 16:14:49 +0000 UTC" firstStartedPulling="2026-03-20 16:14:51.167308318 +0000 UTC m=+2150.380679687" lastFinishedPulling="2026-03-20 16:14:54.544061401 +0000 UTC m=+2153.757432770" observedRunningTime="2026-03-20 16:14:55.176339116 +0000 UTC m=+2154.389710545" watchObservedRunningTime="2026-03-20 16:14:55.185226137 +0000 UTC m=+2154.398597546"
Mar 20 16:14:59 crc kubenswrapper[4730]: I0320 16:14:59.763355    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nlxpq"
Mar 20 16:14:59 crc kubenswrapper[4730]: I0320 16:14:59.764045    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nlxpq"
Mar 20 16:15:00 crc kubenswrapper[4730]: I0320 16:15:00.145149    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567055-r9s4z"]
Mar 20 16:15:00 crc kubenswrapper[4730]: I0320 16:15:00.146688    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-r9s4z"
Mar 20 16:15:00 crc kubenswrapper[4730]: I0320 16:15:00.150298    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 20 16:15:00 crc kubenswrapper[4730]: I0320 16:15:00.150544    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 20 16:15:00 crc kubenswrapper[4730]: I0320 16:15:00.157697    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567055-r9s4z"]
Mar 20 16:15:00 crc kubenswrapper[4730]: I0320 16:15:00.302613    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5af1b002-c577-4334-8304-5f44a67a5119-config-volume\") pod \"collect-profiles-29567055-r9s4z\" (UID: \"5af1b002-c577-4334-8304-5f44a67a5119\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-r9s4z"
Mar 20 16:15:00 crc kubenswrapper[4730]: I0320 16:15:00.302768    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsl7k\" (UniqueName: \"kubernetes.io/projected/5af1b002-c577-4334-8304-5f44a67a5119-kube-api-access-gsl7k\") pod \"collect-profiles-29567055-r9s4z\" (UID: \"5af1b002-c577-4334-8304-5f44a67a5119\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-r9s4z"
Mar 20 16:15:00 crc kubenswrapper[4730]: I0320 16:15:00.302878    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5af1b002-c577-4334-8304-5f44a67a5119-secret-volume\") pod \"collect-profiles-29567055-r9s4z\" (UID: \"5af1b002-c577-4334-8304-5f44a67a5119\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-r9s4z"
Mar 20 16:15:00 crc kubenswrapper[4730]: I0320 16:15:00.404575    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5af1b002-c577-4334-8304-5f44a67a5119-secret-volume\") pod \"collect-profiles-29567055-r9s4z\" (UID: \"5af1b002-c577-4334-8304-5f44a67a5119\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-r9s4z"
Mar 20 16:15:00 crc kubenswrapper[4730]: I0320 16:15:00.404970    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5af1b002-c577-4334-8304-5f44a67a5119-config-volume\") pod \"collect-profiles-29567055-r9s4z\" (UID: \"5af1b002-c577-4334-8304-5f44a67a5119\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-r9s4z"
Mar 20 16:15:00 crc kubenswrapper[4730]: I0320 16:15:00.405095    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsl7k\" (UniqueName: \"kubernetes.io/projected/5af1b002-c577-4334-8304-5f44a67a5119-kube-api-access-gsl7k\") pod \"collect-profiles-29567055-r9s4z\" (UID: \"5af1b002-c577-4334-8304-5f44a67a5119\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-r9s4z"
Mar 20 16:15:00 crc kubenswrapper[4730]: I0320 16:15:00.405876    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5af1b002-c577-4334-8304-5f44a67a5119-config-volume\") pod \"collect-profiles-29567055-r9s4z\" (UID: \"5af1b002-c577-4334-8304-5f44a67a5119\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-r9s4z"
Mar 20 16:15:00 crc kubenswrapper[4730]: I0320 16:15:00.416398    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5af1b002-c577-4334-8304-5f44a67a5119-secret-volume\") pod \"collect-profiles-29567055-r9s4z\" (UID: \"5af1b002-c577-4334-8304-5f44a67a5119\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-r9s4z"
Mar 20 16:15:00 crc kubenswrapper[4730]: I0320 16:15:00.430237    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsl7k\" (UniqueName: \"kubernetes.io/projected/5af1b002-c577-4334-8304-5f44a67a5119-kube-api-access-gsl7k\") pod \"collect-profiles-29567055-r9s4z\" (UID: \"5af1b002-c577-4334-8304-5f44a67a5119\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-r9s4z"
Mar 20 16:15:00 crc kubenswrapper[4730]: I0320 16:15:00.465609    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-r9s4z"
Mar 20 16:15:00 crc kubenswrapper[4730]: I0320 16:15:00.828102    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nlxpq" podUID="f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3" containerName="registry-server" probeResult="failure" output=<
Mar 20 16:15:00 crc kubenswrapper[4730]:         timeout: failed to connect service ":50051" within 1s
Mar 20 16:15:00 crc kubenswrapper[4730]:  >
Mar 20 16:15:00 crc kubenswrapper[4730]: I0320 16:15:00.986020    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567055-r9s4z"]
Mar 20 16:15:01 crc kubenswrapper[4730]: I0320 16:15:01.216534    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-r9s4z" event={"ID":"5af1b002-c577-4334-8304-5f44a67a5119","Type":"ContainerStarted","Data":"e6a0ef74485773b0b9248ebe8daaaea6116dc164de300c92368c1b1d44b5c372"}
Mar 20 16:15:01 crc kubenswrapper[4730]: I0320 16:15:01.217135    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-r9s4z" event={"ID":"5af1b002-c577-4334-8304-5f44a67a5119","Type":"ContainerStarted","Data":"b2d223de9580f697cadc3d5b517a4d5741fc74d1cc4709134ea792dc8f34fde0"}
Mar 20 16:15:01 crc kubenswrapper[4730]: I0320 16:15:01.240534    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-r9s4z" podStartSLOduration=1.240511597 podStartE2EDuration="1.240511597s" podCreationTimestamp="2026-03-20 16:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:15:01.235201608 +0000 UTC m=+2160.448572997" watchObservedRunningTime="2026-03-20 16:15:01.240511597 +0000 UTC m=+2160.453882966"
Mar 20 16:15:02 crc kubenswrapper[4730]: I0320 16:15:02.250343    4730 generic.go:334] "Generic (PLEG): container finished" podID="5af1b002-c577-4334-8304-5f44a67a5119" containerID="e6a0ef74485773b0b9248ebe8daaaea6116dc164de300c92368c1b1d44b5c372" exitCode=0
Mar 20 16:15:02 crc kubenswrapper[4730]: I0320 16:15:02.250715    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-r9s4z" event={"ID":"5af1b002-c577-4334-8304-5f44a67a5119","Type":"ContainerDied","Data":"e6a0ef74485773b0b9248ebe8daaaea6116dc164de300c92368c1b1d44b5c372"}
Mar 20 16:15:03 crc kubenswrapper[4730]: I0320 16:15:03.711240    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-r9s4z"
Mar 20 16:15:03 crc kubenswrapper[4730]: I0320 16:15:03.778215    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5af1b002-c577-4334-8304-5f44a67a5119-secret-volume\") pod \"5af1b002-c577-4334-8304-5f44a67a5119\" (UID: \"5af1b002-c577-4334-8304-5f44a67a5119\") "
Mar 20 16:15:03 crc kubenswrapper[4730]: I0320 16:15:03.778417    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsl7k\" (UniqueName: \"kubernetes.io/projected/5af1b002-c577-4334-8304-5f44a67a5119-kube-api-access-gsl7k\") pod \"5af1b002-c577-4334-8304-5f44a67a5119\" (UID: \"5af1b002-c577-4334-8304-5f44a67a5119\") "
Mar 20 16:15:03 crc kubenswrapper[4730]: I0320 16:15:03.778473    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5af1b002-c577-4334-8304-5f44a67a5119-config-volume\") pod \"5af1b002-c577-4334-8304-5f44a67a5119\" (UID: \"5af1b002-c577-4334-8304-5f44a67a5119\") "
Mar 20 16:15:03 crc kubenswrapper[4730]: I0320 16:15:03.778992    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5af1b002-c577-4334-8304-5f44a67a5119-config-volume" (OuterVolumeSpecName: "config-volume") pod "5af1b002-c577-4334-8304-5f44a67a5119" (UID: "5af1b002-c577-4334-8304-5f44a67a5119"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:15:03 crc kubenswrapper[4730]: I0320 16:15:03.785106    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5af1b002-c577-4334-8304-5f44a67a5119-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5af1b002-c577-4334-8304-5f44a67a5119" (UID: "5af1b002-c577-4334-8304-5f44a67a5119"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:15:03 crc kubenswrapper[4730]: I0320 16:15:03.795489    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5af1b002-c577-4334-8304-5f44a67a5119-kube-api-access-gsl7k" (OuterVolumeSpecName: "kube-api-access-gsl7k") pod "5af1b002-c577-4334-8304-5f44a67a5119" (UID: "5af1b002-c577-4334-8304-5f44a67a5119"). InnerVolumeSpecName "kube-api-access-gsl7k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:15:03 crc kubenswrapper[4730]: I0320 16:15:03.881383    4730 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5af1b002-c577-4334-8304-5f44a67a5119-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 20 16:15:03 crc kubenswrapper[4730]: I0320 16:15:03.881416    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsl7k\" (UniqueName: \"kubernetes.io/projected/5af1b002-c577-4334-8304-5f44a67a5119-kube-api-access-gsl7k\") on node \"crc\" DevicePath \"\""
Mar 20 16:15:03 crc kubenswrapper[4730]: I0320 16:15:03.881425    4730 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5af1b002-c577-4334-8304-5f44a67a5119-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 16:15:04 crc kubenswrapper[4730]: I0320 16:15:04.273648    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-r9s4z" event={"ID":"5af1b002-c577-4334-8304-5f44a67a5119","Type":"ContainerDied","Data":"b2d223de9580f697cadc3d5b517a4d5741fc74d1cc4709134ea792dc8f34fde0"}
Mar 20 16:15:04 crc kubenswrapper[4730]: I0320 16:15:04.273998    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2d223de9580f697cadc3d5b517a4d5741fc74d1cc4709134ea792dc8f34fde0"
Mar 20 16:15:04 crc kubenswrapper[4730]: I0320 16:15:04.273689    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567055-r9s4z"
Mar 20 16:15:04 crc kubenswrapper[4730]: I0320 16:15:04.319530    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567010-d69bc"]
Mar 20 16:15:04 crc kubenswrapper[4730]: I0320 16:15:04.328138    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567010-d69bc"]
Mar 20 16:15:05 crc kubenswrapper[4730]: I0320 16:15:05.548115    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be19fb65-a04f-42df-9b96-e620b58754bb" path="/var/lib/kubelet/pods/be19fb65-a04f-42df-9b96-e620b58754bb/volumes"
Mar 20 16:15:09 crc kubenswrapper[4730]: I0320 16:15:09.813431    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nlxpq"
Mar 20 16:15:09 crc kubenswrapper[4730]: I0320 16:15:09.873362    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nlxpq"
Mar 20 16:15:10 crc kubenswrapper[4730]: I0320 16:15:10.055705    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nlxpq"]
Mar 20 16:15:11 crc kubenswrapper[4730]: I0320 16:15:11.332918    4730 generic.go:334] "Generic (PLEG): container finished" podID="423144fa-9b01-4466-993c-6ab7075e1ad5" containerID="01fd28163e9e73afde319534feb3ad11d63319d2239c28468dad672e5ab2ebe3" exitCode=0
Mar 20 16:15:11 crc kubenswrapper[4730]: I0320 16:15:11.333012    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n" event={"ID":"423144fa-9b01-4466-993c-6ab7075e1ad5","Type":"ContainerDied","Data":"01fd28163e9e73afde319534feb3ad11d63319d2239c28468dad672e5ab2ebe3"}
Mar 20 16:15:11 crc kubenswrapper[4730]: I0320 16:15:11.333414    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nlxpq" podUID="f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3" containerName="registry-server" containerID="cri-o://b04b9a549015f8905f380aaf8421cc03f90628d6c219b0109a0e893083103713" gracePeriod=2
Mar 20 16:15:11 crc kubenswrapper[4730]: I0320 16:15:11.786409    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nlxpq"
Mar 20 16:15:11 crc kubenswrapper[4730]: I0320 16:15:11.845792    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3-catalog-content\") pod \"f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3\" (UID: \"f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3\") "
Mar 20 16:15:11 crc kubenswrapper[4730]: I0320 16:15:11.845916    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97jfs\" (UniqueName: \"kubernetes.io/projected/f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3-kube-api-access-97jfs\") pod \"f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3\" (UID: \"f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3\") "
Mar 20 16:15:11 crc kubenswrapper[4730]: I0320 16:15:11.845949    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3-utilities\") pod \"f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3\" (UID: \"f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3\") "
Mar 20 16:15:11 crc kubenswrapper[4730]: I0320 16:15:11.846915    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3-utilities" (OuterVolumeSpecName: "utilities") pod "f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3" (UID: "f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:15:11 crc kubenswrapper[4730]: I0320 16:15:11.851490    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3-kube-api-access-97jfs" (OuterVolumeSpecName: "kube-api-access-97jfs") pod "f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3" (UID: "f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3"). InnerVolumeSpecName "kube-api-access-97jfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:15:11 crc kubenswrapper[4730]: I0320 16:15:11.948133    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97jfs\" (UniqueName: \"kubernetes.io/projected/f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3-kube-api-access-97jfs\") on node \"crc\" DevicePath \"\""
Mar 20 16:15:11 crc kubenswrapper[4730]: I0320 16:15:11.948404    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:15:11 crc kubenswrapper[4730]: I0320 16:15:11.981665    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3" (UID: "f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.050112    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.345383    4730 generic.go:334] "Generic (PLEG): container finished" podID="f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3" containerID="b04b9a549015f8905f380aaf8421cc03f90628d6c219b0109a0e893083103713" exitCode=0
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.345443    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nlxpq" event={"ID":"f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3","Type":"ContainerDied","Data":"b04b9a549015f8905f380aaf8421cc03f90628d6c219b0109a0e893083103713"}
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.345477    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nlxpq" event={"ID":"f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3","Type":"ContainerDied","Data":"2fba6e13ca4ae6246c5ca75c7721bb78ad010f572aced34848817074b527537d"}
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.345493    4730 scope.go:117] "RemoveContainer" containerID="b04b9a549015f8905f380aaf8421cc03f90628d6c219b0109a0e893083103713"
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.345501    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nlxpq"
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.382743    4730 scope.go:117] "RemoveContainer" containerID="66dbf75889a1e65837bed6622eafa5ba26cfd66025601ae0dc945c0e8dce250e"
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.395627    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nlxpq"]
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.443461    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nlxpq"]
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.491787    4730 scope.go:117] "RemoveContainer" containerID="9ca380081388b6f9c9877cc291353929dee2da6133f14e020a397e8856a314bb"
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.539180    4730 scope.go:117] "RemoveContainer" containerID="b04b9a549015f8905f380aaf8421cc03f90628d6c219b0109a0e893083103713"
Mar 20 16:15:12 crc kubenswrapper[4730]: E0320 16:15:12.539608    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b04b9a549015f8905f380aaf8421cc03f90628d6c219b0109a0e893083103713\": container with ID starting with b04b9a549015f8905f380aaf8421cc03f90628d6c219b0109a0e893083103713 not found: ID does not exist" containerID="b04b9a549015f8905f380aaf8421cc03f90628d6c219b0109a0e893083103713"
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.539647    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b04b9a549015f8905f380aaf8421cc03f90628d6c219b0109a0e893083103713"} err="failed to get container status \"b04b9a549015f8905f380aaf8421cc03f90628d6c219b0109a0e893083103713\": rpc error: code = NotFound desc = could not find container \"b04b9a549015f8905f380aaf8421cc03f90628d6c219b0109a0e893083103713\": container with ID starting with b04b9a549015f8905f380aaf8421cc03f90628d6c219b0109a0e893083103713 not found: ID does not exist"
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.539679    4730 scope.go:117] "RemoveContainer" containerID="66dbf75889a1e65837bed6622eafa5ba26cfd66025601ae0dc945c0e8dce250e"
Mar 20 16:15:12 crc kubenswrapper[4730]: E0320 16:15:12.543728    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66dbf75889a1e65837bed6622eafa5ba26cfd66025601ae0dc945c0e8dce250e\": container with ID starting with 66dbf75889a1e65837bed6622eafa5ba26cfd66025601ae0dc945c0e8dce250e not found: ID does not exist" containerID="66dbf75889a1e65837bed6622eafa5ba26cfd66025601ae0dc945c0e8dce250e"
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.543779    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66dbf75889a1e65837bed6622eafa5ba26cfd66025601ae0dc945c0e8dce250e"} err="failed to get container status \"66dbf75889a1e65837bed6622eafa5ba26cfd66025601ae0dc945c0e8dce250e\": rpc error: code = NotFound desc = could not find container \"66dbf75889a1e65837bed6622eafa5ba26cfd66025601ae0dc945c0e8dce250e\": container with ID starting with 66dbf75889a1e65837bed6622eafa5ba26cfd66025601ae0dc945c0e8dce250e not found: ID does not exist"
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.543812    4730 scope.go:117] "RemoveContainer" containerID="9ca380081388b6f9c9877cc291353929dee2da6133f14e020a397e8856a314bb"
Mar 20 16:15:12 crc kubenswrapper[4730]: E0320 16:15:12.547684    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ca380081388b6f9c9877cc291353929dee2da6133f14e020a397e8856a314bb\": container with ID starting with 9ca380081388b6f9c9877cc291353929dee2da6133f14e020a397e8856a314bb not found: ID does not exist" containerID="9ca380081388b6f9c9877cc291353929dee2da6133f14e020a397e8856a314bb"
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.547738    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ca380081388b6f9c9877cc291353929dee2da6133f14e020a397e8856a314bb"} err="failed to get container status \"9ca380081388b6f9c9877cc291353929dee2da6133f14e020a397e8856a314bb\": rpc error: code = NotFound desc = could not find container \"9ca380081388b6f9c9877cc291353929dee2da6133f14e020a397e8856a314bb\": container with ID starting with 9ca380081388b6f9c9877cc291353929dee2da6133f14e020a397e8856a314bb not found: ID does not exist"
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.890152    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.991392    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-ovn-combined-ca-bundle\") pod \"423144fa-9b01-4466-993c-6ab7075e1ad5\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") "
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.991509    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-inventory\") pod \"423144fa-9b01-4466-993c-6ab7075e1ad5\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") "
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.991607    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/423144fa-9b01-4466-993c-6ab7075e1ad5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"423144fa-9b01-4466-993c-6ab7075e1ad5\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") "
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.991698    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-neutron-metadata-combined-ca-bundle\") pod \"423144fa-9b01-4466-993c-6ab7075e1ad5\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") "
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.991764    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/423144fa-9b01-4466-993c-6ab7075e1ad5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"423144fa-9b01-4466-993c-6ab7075e1ad5\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") "
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.991819    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-libvirt-combined-ca-bundle\") pod \"423144fa-9b01-4466-993c-6ab7075e1ad5\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") "
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.991873    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-telemetry-combined-ca-bundle\") pod \"423144fa-9b01-4466-993c-6ab7075e1ad5\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") "
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.991933    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/423144fa-9b01-4466-993c-6ab7075e1ad5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"423144fa-9b01-4466-993c-6ab7075e1ad5\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") "
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.992057    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-bootstrap-combined-ca-bundle\") pod \"423144fa-9b01-4466-993c-6ab7075e1ad5\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") "
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.992158    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-nova-combined-ca-bundle\") pod \"423144fa-9b01-4466-993c-6ab7075e1ad5\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") "
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.992238    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-repo-setup-combined-ca-bundle\") pod \"423144fa-9b01-4466-993c-6ab7075e1ad5\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") "
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.992331    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/423144fa-9b01-4466-993c-6ab7075e1ad5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"423144fa-9b01-4466-993c-6ab7075e1ad5\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") "
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.992407    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhrdz\" (UniqueName: \"kubernetes.io/projected/423144fa-9b01-4466-993c-6ab7075e1ad5-kube-api-access-dhrdz\") pod \"423144fa-9b01-4466-993c-6ab7075e1ad5\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") "
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.992494    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-ssh-key-openstack-edpm-ipam\") pod \"423144fa-9b01-4466-993c-6ab7075e1ad5\" (UID: \"423144fa-9b01-4466-993c-6ab7075e1ad5\") "
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.997488    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "423144fa-9b01-4466-993c-6ab7075e1ad5" (UID: "423144fa-9b01-4466-993c-6ab7075e1ad5"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.997565    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "423144fa-9b01-4466-993c-6ab7075e1ad5" (UID: "423144fa-9b01-4466-993c-6ab7075e1ad5"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.997619    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/423144fa-9b01-4466-993c-6ab7075e1ad5-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "423144fa-9b01-4466-993c-6ab7075e1ad5" (UID: "423144fa-9b01-4466-993c-6ab7075e1ad5"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:15:12 crc kubenswrapper[4730]: I0320 16:15:12.999702    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/423144fa-9b01-4466-993c-6ab7075e1ad5-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "423144fa-9b01-4466-993c-6ab7075e1ad5" (UID: "423144fa-9b01-4466-993c-6ab7075e1ad5"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.000190    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "423144fa-9b01-4466-993c-6ab7075e1ad5" (UID: "423144fa-9b01-4466-993c-6ab7075e1ad5"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.000212    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/423144fa-9b01-4466-993c-6ab7075e1ad5-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "423144fa-9b01-4466-993c-6ab7075e1ad5" (UID: "423144fa-9b01-4466-993c-6ab7075e1ad5"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.000781    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "423144fa-9b01-4466-993c-6ab7075e1ad5" (UID: "423144fa-9b01-4466-993c-6ab7075e1ad5"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.001333    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/423144fa-9b01-4466-993c-6ab7075e1ad5-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "423144fa-9b01-4466-993c-6ab7075e1ad5" (UID: "423144fa-9b01-4466-993c-6ab7075e1ad5"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.001702    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "423144fa-9b01-4466-993c-6ab7075e1ad5" (UID: "423144fa-9b01-4466-993c-6ab7075e1ad5"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.001816    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "423144fa-9b01-4466-993c-6ab7075e1ad5" (UID: "423144fa-9b01-4466-993c-6ab7075e1ad5"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.002706    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/423144fa-9b01-4466-993c-6ab7075e1ad5-kube-api-access-dhrdz" (OuterVolumeSpecName: "kube-api-access-dhrdz") pod "423144fa-9b01-4466-993c-6ab7075e1ad5" (UID: "423144fa-9b01-4466-993c-6ab7075e1ad5"). InnerVolumeSpecName "kube-api-access-dhrdz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.006365    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "423144fa-9b01-4466-993c-6ab7075e1ad5" (UID: "423144fa-9b01-4466-993c-6ab7075e1ad5"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.022939    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-inventory" (OuterVolumeSpecName: "inventory") pod "423144fa-9b01-4466-993c-6ab7075e1ad5" (UID: "423144fa-9b01-4466-993c-6ab7075e1ad5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.050232    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "423144fa-9b01-4466-993c-6ab7075e1ad5" (UID: "423144fa-9b01-4466-993c-6ab7075e1ad5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.094810    4730 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.095057    4730 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.095147    4730 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/423144fa-9b01-4466-993c-6ab7075e1ad5-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\""
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.095241    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhrdz\" (UniqueName: \"kubernetes.io/projected/423144fa-9b01-4466-993c-6ab7075e1ad5-kube-api-access-dhrdz\") on node \"crc\" DevicePath \"\""
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.095347    4730 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.095435    4730 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.095509    4730 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.095580    4730 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/423144fa-9b01-4466-993c-6ab7075e1ad5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\""
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.095639    4730 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.095701    4730 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/423144fa-9b01-4466-993c-6ab7075e1ad5-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\""
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.095764    4730 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.095822    4730 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.095884    4730 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/423144fa-9b01-4466-993c-6ab7075e1ad5-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\""
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.095940    4730 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423144fa-9b01-4466-993c-6ab7075e1ad5-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.355345    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n" event={"ID":"423144fa-9b01-4466-993c-6ab7075e1ad5","Type":"ContainerDied","Data":"21c9b74b7d25aff2921555a89646dca1c3794f9a5ed3d4441b6c26e7fd3963a0"}
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.355394    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21c9b74b7d25aff2921555a89646dca1c3794f9a5ed3d4441b6c26e7fd3963a0"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.355357    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-4d78n"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.454658    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fxwgt"]
Mar 20 16:15:13 crc kubenswrapper[4730]: E0320 16:15:13.455168    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5af1b002-c577-4334-8304-5f44a67a5119" containerName="collect-profiles"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.455195    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="5af1b002-c577-4334-8304-5f44a67a5119" containerName="collect-profiles"
Mar 20 16:15:13 crc kubenswrapper[4730]: E0320 16:15:13.455220    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3" containerName="extract-utilities"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.455227    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3" containerName="extract-utilities"
Mar 20 16:15:13 crc kubenswrapper[4730]: E0320 16:15:13.455240    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="423144fa-9b01-4466-993c-6ab7075e1ad5" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.455249    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="423144fa-9b01-4466-993c-6ab7075e1ad5" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:15:13 crc kubenswrapper[4730]: E0320 16:15:13.457287    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3" containerName="extract-content"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.457336    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3" containerName="extract-content"
Mar 20 16:15:13 crc kubenswrapper[4730]: E0320 16:15:13.457373    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3" containerName="registry-server"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.457382    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3" containerName="registry-server"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.457707    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="5af1b002-c577-4334-8304-5f44a67a5119" containerName="collect-profiles"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.457749    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="423144fa-9b01-4466-993c-6ab7075e1ad5" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.457772    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3" containerName="registry-server"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.458627    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fxwgt"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.463895    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.464415    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.465913    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.465970    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.466346    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vvsxx"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.466958    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fxwgt"]
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.544235    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3" path="/var/lib/kubelet/pods/f7f46bf2-4fe4-4576-a9f1-eeec5b2bf8c3/volumes"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.605139    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efd41cb9-678e-43d9-8643-b5aa95f1ec3e-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fxwgt\" (UID: \"efd41cb9-678e-43d9-8643-b5aa95f1ec3e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fxwgt"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.605546    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/efd41cb9-678e-43d9-8643-b5aa95f1ec3e-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fxwgt\" (UID: \"efd41cb9-678e-43d9-8643-b5aa95f1ec3e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fxwgt"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.605590    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/efd41cb9-678e-43d9-8643-b5aa95f1ec3e-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fxwgt\" (UID: \"efd41cb9-678e-43d9-8643-b5aa95f1ec3e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fxwgt"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.605721    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8lcv\" (UniqueName: \"kubernetes.io/projected/efd41cb9-678e-43d9-8643-b5aa95f1ec3e-kube-api-access-q8lcv\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fxwgt\" (UID: \"efd41cb9-678e-43d9-8643-b5aa95f1ec3e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fxwgt"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.605819    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd41cb9-678e-43d9-8643-b5aa95f1ec3e-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fxwgt\" (UID: \"efd41cb9-678e-43d9-8643-b5aa95f1ec3e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fxwgt"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.718397    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd41cb9-678e-43d9-8643-b5aa95f1ec3e-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fxwgt\" (UID: \"efd41cb9-678e-43d9-8643-b5aa95f1ec3e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fxwgt"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.718549    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efd41cb9-678e-43d9-8643-b5aa95f1ec3e-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fxwgt\" (UID: \"efd41cb9-678e-43d9-8643-b5aa95f1ec3e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fxwgt"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.718656    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/efd41cb9-678e-43d9-8643-b5aa95f1ec3e-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fxwgt\" (UID: \"efd41cb9-678e-43d9-8643-b5aa95f1ec3e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fxwgt"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.718678    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/efd41cb9-678e-43d9-8643-b5aa95f1ec3e-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fxwgt\" (UID: \"efd41cb9-678e-43d9-8643-b5aa95f1ec3e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fxwgt"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.718738    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8lcv\" (UniqueName: \"kubernetes.io/projected/efd41cb9-678e-43d9-8643-b5aa95f1ec3e-kube-api-access-q8lcv\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fxwgt\" (UID: \"efd41cb9-678e-43d9-8643-b5aa95f1ec3e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fxwgt"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.720825    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/efd41cb9-678e-43d9-8643-b5aa95f1ec3e-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fxwgt\" (UID: \"efd41cb9-678e-43d9-8643-b5aa95f1ec3e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fxwgt"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.723449    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efd41cb9-678e-43d9-8643-b5aa95f1ec3e-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fxwgt\" (UID: \"efd41cb9-678e-43d9-8643-b5aa95f1ec3e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fxwgt"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.724946    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/efd41cb9-678e-43d9-8643-b5aa95f1ec3e-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fxwgt\" (UID: \"efd41cb9-678e-43d9-8643-b5aa95f1ec3e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fxwgt"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.726872    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd41cb9-678e-43d9-8643-b5aa95f1ec3e-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fxwgt\" (UID: \"efd41cb9-678e-43d9-8643-b5aa95f1ec3e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fxwgt"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.747699    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8lcv\" (UniqueName: \"kubernetes.io/projected/efd41cb9-678e-43d9-8643-b5aa95f1ec3e-kube-api-access-q8lcv\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fxwgt\" (UID: \"efd41cb9-678e-43d9-8643-b5aa95f1ec3e\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fxwgt"
Mar 20 16:15:13 crc kubenswrapper[4730]: I0320 16:15:13.778009    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fxwgt"
Mar 20 16:15:14 crc kubenswrapper[4730]: I0320 16:15:14.271888    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fxwgt"]
Mar 20 16:15:14 crc kubenswrapper[4730]: I0320 16:15:14.278643    4730 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 16:15:14 crc kubenswrapper[4730]: I0320 16:15:14.366415    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fxwgt" event={"ID":"efd41cb9-678e-43d9-8643-b5aa95f1ec3e","Type":"ContainerStarted","Data":"2ef45405b82ea3ab2a346b005f8ce69fed405496721b84e102d2e54946ebc3d7"}
Mar 20 16:15:16 crc kubenswrapper[4730]: I0320 16:15:16.387988    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fxwgt" event={"ID":"efd41cb9-678e-43d9-8643-b5aa95f1ec3e","Type":"ContainerStarted","Data":"cb32fa207a288ed830e19ca81c260eb403869416d2b909e2c71baf95e1baaba1"}
Mar 20 16:15:16 crc kubenswrapper[4730]: I0320 16:15:16.411060    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fxwgt" podStartSLOduration=2.492880686 podStartE2EDuration="3.411041336s" podCreationTimestamp="2026-03-20 16:15:13 +0000 UTC" firstStartedPulling="2026-03-20 16:15:14.278337745 +0000 UTC m=+2173.491709124" lastFinishedPulling="2026-03-20 16:15:15.196498405 +0000 UTC m=+2174.409869774" observedRunningTime="2026-03-20 16:15:16.407467645 +0000 UTC m=+2175.620839014" watchObservedRunningTime="2026-03-20 16:15:16.411041336 +0000 UTC m=+2175.624412705"
Mar 20 16:15:49 crc kubenswrapper[4730]: I0320 16:15:49.151969    4730 scope.go:117] "RemoveContainer" containerID="647092b460bb07570b06908ca4f98239d0470ba3df7bb23adf207cb830d51de7"
Mar 20 16:16:00 crc kubenswrapper[4730]: I0320 16:16:00.152758    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567056-5hbgw"]
Mar 20 16:16:00 crc kubenswrapper[4730]: I0320 16:16:00.154403    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567056-5hbgw"
Mar 20 16:16:00 crc kubenswrapper[4730]: I0320 16:16:00.156344    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 16:16:00 crc kubenswrapper[4730]: I0320 16:16:00.156765    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:16:00 crc kubenswrapper[4730]: I0320 16:16:00.157318    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:16:00 crc kubenswrapper[4730]: I0320 16:16:00.176156    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567056-5hbgw"]
Mar 20 16:16:00 crc kubenswrapper[4730]: I0320 16:16:00.309089    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xgkh\" (UniqueName: \"kubernetes.io/projected/331a4cf6-7d2c-4540-9686-064f27fee0cc-kube-api-access-7xgkh\") pod \"auto-csr-approver-29567056-5hbgw\" (UID: \"331a4cf6-7d2c-4540-9686-064f27fee0cc\") " pod="openshift-infra/auto-csr-approver-29567056-5hbgw"
Mar 20 16:16:00 crc kubenswrapper[4730]: I0320 16:16:00.410719    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xgkh\" (UniqueName: \"kubernetes.io/projected/331a4cf6-7d2c-4540-9686-064f27fee0cc-kube-api-access-7xgkh\") pod \"auto-csr-approver-29567056-5hbgw\" (UID: \"331a4cf6-7d2c-4540-9686-064f27fee0cc\") " pod="openshift-infra/auto-csr-approver-29567056-5hbgw"
Mar 20 16:16:00 crc kubenswrapper[4730]: I0320 16:16:00.432942    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xgkh\" (UniqueName: \"kubernetes.io/projected/331a4cf6-7d2c-4540-9686-064f27fee0cc-kube-api-access-7xgkh\") pod \"auto-csr-approver-29567056-5hbgw\" (UID: \"331a4cf6-7d2c-4540-9686-064f27fee0cc\") " pod="openshift-infra/auto-csr-approver-29567056-5hbgw"
Mar 20 16:16:00 crc kubenswrapper[4730]: I0320 16:16:00.470560    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567056-5hbgw"
Mar 20 16:16:00 crc kubenswrapper[4730]: I0320 16:16:00.983295    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567056-5hbgw"]
Mar 20 16:16:01 crc kubenswrapper[4730]: I0320 16:16:01.849749    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567056-5hbgw" event={"ID":"331a4cf6-7d2c-4540-9686-064f27fee0cc","Type":"ContainerStarted","Data":"37e797fe97430638fe1123d2140225c9dd6493a8ddfc738a4bc98b5e0be148ae"}
Mar 20 16:16:02 crc kubenswrapper[4730]: I0320 16:16:02.861887    4730 generic.go:334] "Generic (PLEG): container finished" podID="331a4cf6-7d2c-4540-9686-064f27fee0cc" containerID="a7f958f4919aca64c652d826ae81971e8c020ebd62d216c39f71ac76d6e91ef4" exitCode=0
Mar 20 16:16:02 crc kubenswrapper[4730]: I0320 16:16:02.862351    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567056-5hbgw" event={"ID":"331a4cf6-7d2c-4540-9686-064f27fee0cc","Type":"ContainerDied","Data":"a7f958f4919aca64c652d826ae81971e8c020ebd62d216c39f71ac76d6e91ef4"}
Mar 20 16:16:04 crc kubenswrapper[4730]: I0320 16:16:04.262421    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567056-5hbgw"
Mar 20 16:16:04 crc kubenswrapper[4730]: I0320 16:16:04.386658    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xgkh\" (UniqueName: \"kubernetes.io/projected/331a4cf6-7d2c-4540-9686-064f27fee0cc-kube-api-access-7xgkh\") pod \"331a4cf6-7d2c-4540-9686-064f27fee0cc\" (UID: \"331a4cf6-7d2c-4540-9686-064f27fee0cc\") "
Mar 20 16:16:04 crc kubenswrapper[4730]: I0320 16:16:04.393636    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/331a4cf6-7d2c-4540-9686-064f27fee0cc-kube-api-access-7xgkh" (OuterVolumeSpecName: "kube-api-access-7xgkh") pod "331a4cf6-7d2c-4540-9686-064f27fee0cc" (UID: "331a4cf6-7d2c-4540-9686-064f27fee0cc"). InnerVolumeSpecName "kube-api-access-7xgkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:16:04 crc kubenswrapper[4730]: I0320 16:16:04.488932    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xgkh\" (UniqueName: \"kubernetes.io/projected/331a4cf6-7d2c-4540-9686-064f27fee0cc-kube-api-access-7xgkh\") on node \"crc\" DevicePath \"\""
Mar 20 16:16:04 crc kubenswrapper[4730]: I0320 16:16:04.886407    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567056-5hbgw" event={"ID":"331a4cf6-7d2c-4540-9686-064f27fee0cc","Type":"ContainerDied","Data":"37e797fe97430638fe1123d2140225c9dd6493a8ddfc738a4bc98b5e0be148ae"}
Mar 20 16:16:04 crc kubenswrapper[4730]: I0320 16:16:04.886441    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567056-5hbgw"
Mar 20 16:16:04 crc kubenswrapper[4730]: I0320 16:16:04.886453    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37e797fe97430638fe1123d2140225c9dd6493a8ddfc738a4bc98b5e0be148ae"
Mar 20 16:16:05 crc kubenswrapper[4730]: I0320 16:16:05.327735    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567050-7gqln"]
Mar 20 16:16:05 crc kubenswrapper[4730]: I0320 16:16:05.337748    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567050-7gqln"]
Mar 20 16:16:05 crc kubenswrapper[4730]: I0320 16:16:05.544513    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d689cf2-4142-40fd-9af3-13b98b99296d" path="/var/lib/kubelet/pods/9d689cf2-4142-40fd-9af3-13b98b99296d/volumes"
Mar 20 16:16:12 crc kubenswrapper[4730]: I0320 16:16:12.879839    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:16:12 crc kubenswrapper[4730]: I0320 16:16:12.881740    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:16:19 crc kubenswrapper[4730]: I0320 16:16:19.007507    4730 generic.go:334] "Generic (PLEG): container finished" podID="efd41cb9-678e-43d9-8643-b5aa95f1ec3e" containerID="cb32fa207a288ed830e19ca81c260eb403869416d2b909e2c71baf95e1baaba1" exitCode=0
Mar 20 16:16:19 crc kubenswrapper[4730]: I0320 16:16:19.008397    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fxwgt" event={"ID":"efd41cb9-678e-43d9-8643-b5aa95f1ec3e","Type":"ContainerDied","Data":"cb32fa207a288ed830e19ca81c260eb403869416d2b909e2c71baf95e1baaba1"}
Mar 20 16:16:20 crc kubenswrapper[4730]: I0320 16:16:20.416641    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fxwgt"
Mar 20 16:16:20 crc kubenswrapper[4730]: I0320 16:16:20.500835    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efd41cb9-678e-43d9-8643-b5aa95f1ec3e-inventory\") pod \"efd41cb9-678e-43d9-8643-b5aa95f1ec3e\" (UID: \"efd41cb9-678e-43d9-8643-b5aa95f1ec3e\") "
Mar 20 16:16:20 crc kubenswrapper[4730]: I0320 16:16:20.500927    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/efd41cb9-678e-43d9-8643-b5aa95f1ec3e-ssh-key-openstack-edpm-ipam\") pod \"efd41cb9-678e-43d9-8643-b5aa95f1ec3e\" (UID: \"efd41cb9-678e-43d9-8643-b5aa95f1ec3e\") "
Mar 20 16:16:20 crc kubenswrapper[4730]: I0320 16:16:20.501065    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8lcv\" (UniqueName: \"kubernetes.io/projected/efd41cb9-678e-43d9-8643-b5aa95f1ec3e-kube-api-access-q8lcv\") pod \"efd41cb9-678e-43d9-8643-b5aa95f1ec3e\" (UID: \"efd41cb9-678e-43d9-8643-b5aa95f1ec3e\") "
Mar 20 16:16:20 crc kubenswrapper[4730]: I0320 16:16:20.501099    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/efd41cb9-678e-43d9-8643-b5aa95f1ec3e-ovncontroller-config-0\") pod \"efd41cb9-678e-43d9-8643-b5aa95f1ec3e\" (UID: \"efd41cb9-678e-43d9-8643-b5aa95f1ec3e\") "
Mar 20 16:16:20 crc kubenswrapper[4730]: I0320 16:16:20.501152    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd41cb9-678e-43d9-8643-b5aa95f1ec3e-ovn-combined-ca-bundle\") pod \"efd41cb9-678e-43d9-8643-b5aa95f1ec3e\" (UID: \"efd41cb9-678e-43d9-8643-b5aa95f1ec3e\") "
Mar 20 16:16:20 crc kubenswrapper[4730]: I0320 16:16:20.512965    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efd41cb9-678e-43d9-8643-b5aa95f1ec3e-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "efd41cb9-678e-43d9-8643-b5aa95f1ec3e" (UID: "efd41cb9-678e-43d9-8643-b5aa95f1ec3e"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:16:20 crc kubenswrapper[4730]: I0320 16:16:20.513713    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efd41cb9-678e-43d9-8643-b5aa95f1ec3e-kube-api-access-q8lcv" (OuterVolumeSpecName: "kube-api-access-q8lcv") pod "efd41cb9-678e-43d9-8643-b5aa95f1ec3e" (UID: "efd41cb9-678e-43d9-8643-b5aa95f1ec3e"). InnerVolumeSpecName "kube-api-access-q8lcv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:16:20 crc kubenswrapper[4730]: I0320 16:16:20.528910    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efd41cb9-678e-43d9-8643-b5aa95f1ec3e-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "efd41cb9-678e-43d9-8643-b5aa95f1ec3e" (UID: "efd41cb9-678e-43d9-8643-b5aa95f1ec3e"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:16:20 crc kubenswrapper[4730]: I0320 16:16:20.533212    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efd41cb9-678e-43d9-8643-b5aa95f1ec3e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "efd41cb9-678e-43d9-8643-b5aa95f1ec3e" (UID: "efd41cb9-678e-43d9-8643-b5aa95f1ec3e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:16:20 crc kubenswrapper[4730]: I0320 16:16:20.536430    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efd41cb9-678e-43d9-8643-b5aa95f1ec3e-inventory" (OuterVolumeSpecName: "inventory") pod "efd41cb9-678e-43d9-8643-b5aa95f1ec3e" (UID: "efd41cb9-678e-43d9-8643-b5aa95f1ec3e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:16:20 crc kubenswrapper[4730]: I0320 16:16:20.603726    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8lcv\" (UniqueName: \"kubernetes.io/projected/efd41cb9-678e-43d9-8643-b5aa95f1ec3e-kube-api-access-q8lcv\") on node \"crc\" DevicePath \"\""
Mar 20 16:16:20 crc kubenswrapper[4730]: I0320 16:16:20.603762    4730 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/efd41cb9-678e-43d9-8643-b5aa95f1ec3e-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Mar 20 16:16:20 crc kubenswrapper[4730]: I0320 16:16:20.603771    4730 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd41cb9-678e-43d9-8643-b5aa95f1ec3e-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:16:20 crc kubenswrapper[4730]: I0320 16:16:20.603779    4730 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efd41cb9-678e-43d9-8643-b5aa95f1ec3e-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 16:16:20 crc kubenswrapper[4730]: I0320 16:16:20.603787    4730 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/efd41cb9-678e-43d9-8643-b5aa95f1ec3e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.032318    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fxwgt" event={"ID":"efd41cb9-678e-43d9-8643-b5aa95f1ec3e","Type":"ContainerDied","Data":"2ef45405b82ea3ab2a346b005f8ce69fed405496721b84e102d2e54946ebc3d7"}
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.032366    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ef45405b82ea3ab2a346b005f8ce69fed405496721b84e102d2e54946ebc3d7"
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.032370    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fxwgt"
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.129968    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw"]
Mar 20 16:16:21 crc kubenswrapper[4730]: E0320 16:16:21.130508    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="331a4cf6-7d2c-4540-9686-064f27fee0cc" containerName="oc"
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.130532    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="331a4cf6-7d2c-4540-9686-064f27fee0cc" containerName="oc"
Mar 20 16:16:21 crc kubenswrapper[4730]: E0320 16:16:21.130554    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efd41cb9-678e-43d9-8643-b5aa95f1ec3e" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.130563    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="efd41cb9-678e-43d9-8643-b5aa95f1ec3e" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.130794    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="331a4cf6-7d2c-4540-9686-064f27fee0cc" containerName="oc"
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.130819    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="efd41cb9-678e-43d9-8643-b5aa95f1ec3e" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.131621    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw"
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.134930    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.134947    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.135053    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.135102    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.135138    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.135226    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vvsxx"
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.157191    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw"]
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.215576    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d70014-6de1-4d90-b04a-8f8376d3a9e0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw\" (UID: \"74d70014-6de1-4d90-b04a-8f8376d3a9e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw"
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.215919    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/74d70014-6de1-4d90-b04a-8f8376d3a9e0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw\" (UID: \"74d70014-6de1-4d90-b04a-8f8376d3a9e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw"
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.215955    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/74d70014-6de1-4d90-b04a-8f8376d3a9e0-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw\" (UID: \"74d70014-6de1-4d90-b04a-8f8376d3a9e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw"
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.215997    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74d70014-6de1-4d90-b04a-8f8376d3a9e0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw\" (UID: \"74d70014-6de1-4d90-b04a-8f8376d3a9e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw"
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.216031    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqlr8\" (UniqueName: \"kubernetes.io/projected/74d70014-6de1-4d90-b04a-8f8376d3a9e0-kube-api-access-bqlr8\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw\" (UID: \"74d70014-6de1-4d90-b04a-8f8376d3a9e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw"
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.216103    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/74d70014-6de1-4d90-b04a-8f8376d3a9e0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw\" (UID: \"74d70014-6de1-4d90-b04a-8f8376d3a9e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw"
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.318008    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d70014-6de1-4d90-b04a-8f8376d3a9e0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw\" (UID: \"74d70014-6de1-4d90-b04a-8f8376d3a9e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw"
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.318058    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/74d70014-6de1-4d90-b04a-8f8376d3a9e0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw\" (UID: \"74d70014-6de1-4d90-b04a-8f8376d3a9e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw"
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.318089    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/74d70014-6de1-4d90-b04a-8f8376d3a9e0-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw\" (UID: \"74d70014-6de1-4d90-b04a-8f8376d3a9e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw"
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.318110    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74d70014-6de1-4d90-b04a-8f8376d3a9e0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw\" (UID: \"74d70014-6de1-4d90-b04a-8f8376d3a9e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw"
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.318808    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqlr8\" (UniqueName: \"kubernetes.io/projected/74d70014-6de1-4d90-b04a-8f8376d3a9e0-kube-api-access-bqlr8\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw\" (UID: \"74d70014-6de1-4d90-b04a-8f8376d3a9e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw"
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.318877    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/74d70014-6de1-4d90-b04a-8f8376d3a9e0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw\" (UID: \"74d70014-6de1-4d90-b04a-8f8376d3a9e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw"
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.323310    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/74d70014-6de1-4d90-b04a-8f8376d3a9e0-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw\" (UID: \"74d70014-6de1-4d90-b04a-8f8376d3a9e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw"
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.323524    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/74d70014-6de1-4d90-b04a-8f8376d3a9e0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw\" (UID: \"74d70014-6de1-4d90-b04a-8f8376d3a9e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw"
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.323938    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74d70014-6de1-4d90-b04a-8f8376d3a9e0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw\" (UID: \"74d70014-6de1-4d90-b04a-8f8376d3a9e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw"
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.324125    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/74d70014-6de1-4d90-b04a-8f8376d3a9e0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw\" (UID: \"74d70014-6de1-4d90-b04a-8f8376d3a9e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw"
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.324943    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d70014-6de1-4d90-b04a-8f8376d3a9e0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw\" (UID: \"74d70014-6de1-4d90-b04a-8f8376d3a9e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw"
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.341211    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqlr8\" (UniqueName: \"kubernetes.io/projected/74d70014-6de1-4d90-b04a-8f8376d3a9e0-kube-api-access-bqlr8\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw\" (UID: \"74d70014-6de1-4d90-b04a-8f8376d3a9e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw"
Mar 20 16:16:21 crc kubenswrapper[4730]: I0320 16:16:21.493157    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw"
Mar 20 16:16:22 crc kubenswrapper[4730]: I0320 16:16:22.190178    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw"]
Mar 20 16:16:23 crc kubenswrapper[4730]: I0320 16:16:23.050784    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw" event={"ID":"74d70014-6de1-4d90-b04a-8f8376d3a9e0","Type":"ContainerStarted","Data":"50125756b79249e59512ad26ef5d86d2ed226aebe51736bf11f852d4e3237e00"}
Mar 20 16:16:23 crc kubenswrapper[4730]: I0320 16:16:23.051391    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw" event={"ID":"74d70014-6de1-4d90-b04a-8f8376d3a9e0","Type":"ContainerStarted","Data":"0838107093eacd9101dc280734e03b7fb6444281744acf3eb05444ff1c5b5488"}
Mar 20 16:16:23 crc kubenswrapper[4730]: I0320 16:16:23.068851    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw" podStartSLOduration=1.5380996420000002 podStartE2EDuration="2.068829783s" podCreationTimestamp="2026-03-20 16:16:21 +0000 UTC" firstStartedPulling="2026-03-20 16:16:22.1991371 +0000 UTC m=+2241.412508459" lastFinishedPulling="2026-03-20 16:16:22.729867231 +0000 UTC m=+2241.943238600" observedRunningTime="2026-03-20 16:16:23.064862811 +0000 UTC m=+2242.278234180" watchObservedRunningTime="2026-03-20 16:16:23.068829783 +0000 UTC m=+2242.282201152"
Mar 20 16:16:42 crc kubenswrapper[4730]: I0320 16:16:42.880052    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:16:42 crc kubenswrapper[4730]: I0320 16:16:42.880400    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:16:49 crc kubenswrapper[4730]: I0320 16:16:49.244995    4730 scope.go:117] "RemoveContainer" containerID="41dec27fbddb23dabfba3fbf070ca912d2703e72f2511ee9ef62aa8a4e09aa09"
Mar 20 16:17:11 crc kubenswrapper[4730]: I0320 16:17:11.548447    4730 generic.go:334] "Generic (PLEG): container finished" podID="74d70014-6de1-4d90-b04a-8f8376d3a9e0" containerID="50125756b79249e59512ad26ef5d86d2ed226aebe51736bf11f852d4e3237e00" exitCode=0
Mar 20 16:17:11 crc kubenswrapper[4730]: I0320 16:17:11.548512    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw" event={"ID":"74d70014-6de1-4d90-b04a-8f8376d3a9e0","Type":"ContainerDied","Data":"50125756b79249e59512ad26ef5d86d2ed226aebe51736bf11f852d4e3237e00"}
Mar 20 16:17:12 crc kubenswrapper[4730]: I0320 16:17:12.880748    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:17:12 crc kubenswrapper[4730]: I0320 16:17:12.881083    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:17:12 crc kubenswrapper[4730]: I0320 16:17:12.881131    4730 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf"
Mar 20 16:17:12 crc kubenswrapper[4730]: I0320 16:17:12.881894    4730 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b7dfce64faa161154e6afb28ae3e7685cc4caead60043a4bb51030ee7d13fb31"} pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 16:17:12 crc kubenswrapper[4730]: I0320 16:17:12.881950    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" containerID="cri-o://b7dfce64faa161154e6afb28ae3e7685cc4caead60043a4bb51030ee7d13fb31" gracePeriod=600
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.051785    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw"
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.144418    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/74d70014-6de1-4d90-b04a-8f8376d3a9e0-ssh-key-openstack-edpm-ipam\") pod \"74d70014-6de1-4d90-b04a-8f8376d3a9e0\" (UID: \"74d70014-6de1-4d90-b04a-8f8376d3a9e0\") "
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.144611    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d70014-6de1-4d90-b04a-8f8376d3a9e0-neutron-metadata-combined-ca-bundle\") pod \"74d70014-6de1-4d90-b04a-8f8376d3a9e0\" (UID: \"74d70014-6de1-4d90-b04a-8f8376d3a9e0\") "
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.144668    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/74d70014-6de1-4d90-b04a-8f8376d3a9e0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"74d70014-6de1-4d90-b04a-8f8376d3a9e0\" (UID: \"74d70014-6de1-4d90-b04a-8f8376d3a9e0\") "
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.144706    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/74d70014-6de1-4d90-b04a-8f8376d3a9e0-nova-metadata-neutron-config-0\") pod \"74d70014-6de1-4d90-b04a-8f8376d3a9e0\" (UID: \"74d70014-6de1-4d90-b04a-8f8376d3a9e0\") "
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.144758    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqlr8\" (UniqueName: \"kubernetes.io/projected/74d70014-6de1-4d90-b04a-8f8376d3a9e0-kube-api-access-bqlr8\") pod \"74d70014-6de1-4d90-b04a-8f8376d3a9e0\" (UID: \"74d70014-6de1-4d90-b04a-8f8376d3a9e0\") "
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.144795    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74d70014-6de1-4d90-b04a-8f8376d3a9e0-inventory\") pod \"74d70014-6de1-4d90-b04a-8f8376d3a9e0\" (UID: \"74d70014-6de1-4d90-b04a-8f8376d3a9e0\") "
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.151430    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74d70014-6de1-4d90-b04a-8f8376d3a9e0-kube-api-access-bqlr8" (OuterVolumeSpecName: "kube-api-access-bqlr8") pod "74d70014-6de1-4d90-b04a-8f8376d3a9e0" (UID: "74d70014-6de1-4d90-b04a-8f8376d3a9e0"). InnerVolumeSpecName "kube-api-access-bqlr8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.151773    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d70014-6de1-4d90-b04a-8f8376d3a9e0-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "74d70014-6de1-4d90-b04a-8f8376d3a9e0" (UID: "74d70014-6de1-4d90-b04a-8f8376d3a9e0"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.183643    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d70014-6de1-4d90-b04a-8f8376d3a9e0-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "74d70014-6de1-4d90-b04a-8f8376d3a9e0" (UID: "74d70014-6de1-4d90-b04a-8f8376d3a9e0"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.192360    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d70014-6de1-4d90-b04a-8f8376d3a9e0-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "74d70014-6de1-4d90-b04a-8f8376d3a9e0" (UID: "74d70014-6de1-4d90-b04a-8f8376d3a9e0"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.192411    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d70014-6de1-4d90-b04a-8f8376d3a9e0-inventory" (OuterVolumeSpecName: "inventory") pod "74d70014-6de1-4d90-b04a-8f8376d3a9e0" (UID: "74d70014-6de1-4d90-b04a-8f8376d3a9e0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.215832    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d70014-6de1-4d90-b04a-8f8376d3a9e0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "74d70014-6de1-4d90-b04a-8f8376d3a9e0" (UID: "74d70014-6de1-4d90-b04a-8f8376d3a9e0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.246852    4730 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d70014-6de1-4d90-b04a-8f8376d3a9e0-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.247210    4730 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/74d70014-6de1-4d90-b04a-8f8376d3a9e0-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\""
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.247224    4730 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/74d70014-6de1-4d90-b04a-8f8376d3a9e0-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\""
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.247235    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqlr8\" (UniqueName: \"kubernetes.io/projected/74d70014-6de1-4d90-b04a-8f8376d3a9e0-kube-api-access-bqlr8\") on node \"crc\" DevicePath \"\""
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.247262    4730 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74d70014-6de1-4d90-b04a-8f8376d3a9e0-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.247276    4730 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/74d70014-6de1-4d90-b04a-8f8376d3a9e0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.576020    4730 generic.go:334] "Generic (PLEG): container finished" podID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerID="b7dfce64faa161154e6afb28ae3e7685cc4caead60043a4bb51030ee7d13fb31" exitCode=0
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.576118    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerDied","Data":"b7dfce64faa161154e6afb28ae3e7685cc4caead60043a4bb51030ee7d13fb31"}
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.576173    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerStarted","Data":"7c543dcd07b51b6540df53377dd2e3fd92636df6153928eaa86c229ad0e8e213"}
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.576192    4730 scope.go:117] "RemoveContainer" containerID="4a5122c368d04f6b91f59fb7f1f1be3d99580fb45609b2315a001469955ae120"
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.580204    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw" event={"ID":"74d70014-6de1-4d90-b04a-8f8376d3a9e0","Type":"ContainerDied","Data":"0838107093eacd9101dc280734e03b7fb6444281744acf3eb05444ff1c5b5488"}
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.580227    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0838107093eacd9101dc280734e03b7fb6444281744acf3eb05444ff1c5b5488"
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.580360    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw"
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.701027    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj"]
Mar 20 16:17:13 crc kubenswrapper[4730]: E0320 16:17:13.701548    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d70014-6de1-4d90-b04a-8f8376d3a9e0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.701571    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d70014-6de1-4d90-b04a-8f8376d3a9e0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.701820    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="74d70014-6de1-4d90-b04a-8f8376d3a9e0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.702485    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj"
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.705903    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vvsxx"
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.706139    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret"
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.706492    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.706635    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.706785    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.715775    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj"]
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.858056    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43d453db-c8fb-438d-927e-6eaee8383df1-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj\" (UID: \"43d453db-c8fb-438d-927e-6eaee8383df1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj"
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.859134    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43d453db-c8fb-438d-927e-6eaee8383df1-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj\" (UID: \"43d453db-c8fb-438d-927e-6eaee8383df1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj"
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.859193    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/43d453db-c8fb-438d-927e-6eaee8383df1-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj\" (UID: \"43d453db-c8fb-438d-927e-6eaee8383df1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj"
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.859212    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43d453db-c8fb-438d-927e-6eaee8383df1-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj\" (UID: \"43d453db-c8fb-438d-927e-6eaee8383df1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj"
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.859236    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx8mb\" (UniqueName: \"kubernetes.io/projected/43d453db-c8fb-438d-927e-6eaee8383df1-kube-api-access-nx8mb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj\" (UID: \"43d453db-c8fb-438d-927e-6eaee8383df1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj"
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.961835    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43d453db-c8fb-438d-927e-6eaee8383df1-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj\" (UID: \"43d453db-c8fb-438d-927e-6eaee8383df1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj"
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.962028    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43d453db-c8fb-438d-927e-6eaee8383df1-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj\" (UID: \"43d453db-c8fb-438d-927e-6eaee8383df1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj"
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.962078    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43d453db-c8fb-438d-927e-6eaee8383df1-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj\" (UID: \"43d453db-c8fb-438d-927e-6eaee8383df1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj"
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.962102    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/43d453db-c8fb-438d-927e-6eaee8383df1-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj\" (UID: \"43d453db-c8fb-438d-927e-6eaee8383df1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj"
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.962134    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx8mb\" (UniqueName: \"kubernetes.io/projected/43d453db-c8fb-438d-927e-6eaee8383df1-kube-api-access-nx8mb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj\" (UID: \"43d453db-c8fb-438d-927e-6eaee8383df1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj"
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.967018    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43d453db-c8fb-438d-927e-6eaee8383df1-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj\" (UID: \"43d453db-c8fb-438d-927e-6eaee8383df1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj"
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.967379    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43d453db-c8fb-438d-927e-6eaee8383df1-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj\" (UID: \"43d453db-c8fb-438d-927e-6eaee8383df1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj"
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.967578    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/43d453db-c8fb-438d-927e-6eaee8383df1-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj\" (UID: \"43d453db-c8fb-438d-927e-6eaee8383df1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj"
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.970995    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43d453db-c8fb-438d-927e-6eaee8383df1-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj\" (UID: \"43d453db-c8fb-438d-927e-6eaee8383df1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj"
Mar 20 16:17:13 crc kubenswrapper[4730]: I0320 16:17:13.979678    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx8mb\" (UniqueName: \"kubernetes.io/projected/43d453db-c8fb-438d-927e-6eaee8383df1-kube-api-access-nx8mb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj\" (UID: \"43d453db-c8fb-438d-927e-6eaee8383df1\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj"
Mar 20 16:17:14 crc kubenswrapper[4730]: I0320 16:17:14.037773    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj"
Mar 20 16:17:14 crc kubenswrapper[4730]: W0320 16:17:14.580823    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43d453db_c8fb_438d_927e_6eaee8383df1.slice/crio-4e9ce008b0379b201268b642fefda0155599bddcd66e572dcdc376e727bf72f7 WatchSource:0}: Error finding container 4e9ce008b0379b201268b642fefda0155599bddcd66e572dcdc376e727bf72f7: Status 404 returned error can't find the container with id 4e9ce008b0379b201268b642fefda0155599bddcd66e572dcdc376e727bf72f7
Mar 20 16:17:14 crc kubenswrapper[4730]: I0320 16:17:14.581138    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj"]
Mar 20 16:17:15 crc kubenswrapper[4730]: I0320 16:17:15.604447    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj" event={"ID":"43d453db-c8fb-438d-927e-6eaee8383df1","Type":"ContainerStarted","Data":"376af442bf9ebfb7a32ce79a2f7ed1f7e468476d5ad13d8d73708b12ea13c232"}
Mar 20 16:17:15 crc kubenswrapper[4730]: I0320 16:17:15.605016    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj" event={"ID":"43d453db-c8fb-438d-927e-6eaee8383df1","Type":"ContainerStarted","Data":"4e9ce008b0379b201268b642fefda0155599bddcd66e572dcdc376e727bf72f7"}
Mar 20 16:17:15 crc kubenswrapper[4730]: I0320 16:17:15.654219    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj" podStartSLOduration=2.205956246 podStartE2EDuration="2.654198087s" podCreationTimestamp="2026-03-20 16:17:13 +0000 UTC" firstStartedPulling="2026-03-20 16:17:14.58443459 +0000 UTC m=+2293.797805959" lastFinishedPulling="2026-03-20 16:17:15.032676401 +0000 UTC m=+2294.246047800" observedRunningTime="2026-03-20 16:17:15.649424261 +0000 UTC m=+2294.862795640" watchObservedRunningTime="2026-03-20 16:17:15.654198087 +0000 UTC m=+2294.867569456"
Mar 20 16:18:00 crc kubenswrapper[4730]: I0320 16:18:00.152414    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567058-dbb42"]
Mar 20 16:18:00 crc kubenswrapper[4730]: I0320 16:18:00.154614    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567058-dbb42"
Mar 20 16:18:00 crc kubenswrapper[4730]: I0320 16:18:00.156489    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:18:00 crc kubenswrapper[4730]: I0320 16:18:00.156491    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 16:18:00 crc kubenswrapper[4730]: I0320 16:18:00.156973    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:18:00 crc kubenswrapper[4730]: I0320 16:18:00.163602    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567058-dbb42"]
Mar 20 16:18:00 crc kubenswrapper[4730]: I0320 16:18:00.208750    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hh67\" (UniqueName: \"kubernetes.io/projected/3e26c2d2-d860-4e2a-b8e9-d607220b44f7-kube-api-access-5hh67\") pod \"auto-csr-approver-29567058-dbb42\" (UID: \"3e26c2d2-d860-4e2a-b8e9-d607220b44f7\") " pod="openshift-infra/auto-csr-approver-29567058-dbb42"
Mar 20 16:18:00 crc kubenswrapper[4730]: I0320 16:18:00.311031    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hh67\" (UniqueName: \"kubernetes.io/projected/3e26c2d2-d860-4e2a-b8e9-d607220b44f7-kube-api-access-5hh67\") pod \"auto-csr-approver-29567058-dbb42\" (UID: \"3e26c2d2-d860-4e2a-b8e9-d607220b44f7\") " pod="openshift-infra/auto-csr-approver-29567058-dbb42"
Mar 20 16:18:00 crc kubenswrapper[4730]: I0320 16:18:00.331764    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hh67\" (UniqueName: \"kubernetes.io/projected/3e26c2d2-d860-4e2a-b8e9-d607220b44f7-kube-api-access-5hh67\") pod \"auto-csr-approver-29567058-dbb42\" (UID: \"3e26c2d2-d860-4e2a-b8e9-d607220b44f7\") " pod="openshift-infra/auto-csr-approver-29567058-dbb42"
Mar 20 16:18:00 crc kubenswrapper[4730]: I0320 16:18:00.481295    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567058-dbb42"
Mar 20 16:18:00 crc kubenswrapper[4730]: I0320 16:18:00.933610    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567058-dbb42"]
Mar 20 16:18:01 crc kubenswrapper[4730]: I0320 16:18:01.048541    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567058-dbb42" event={"ID":"3e26c2d2-d860-4e2a-b8e9-d607220b44f7","Type":"ContainerStarted","Data":"87c535f2369abdc09480d5585fa4bc3a0c172bc32e1112ce544c29239d4f5deb"}
Mar 20 16:18:03 crc kubenswrapper[4730]: I0320 16:18:03.077179    4730 generic.go:334] "Generic (PLEG): container finished" podID="3e26c2d2-d860-4e2a-b8e9-d607220b44f7" containerID="7c5e3ddc85e2a694aa4b6f5091893d6172a7cc9cc61ecbf5665992a51d9cf392" exitCode=0
Mar 20 16:18:03 crc kubenswrapper[4730]: I0320 16:18:03.077307    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567058-dbb42" event={"ID":"3e26c2d2-d860-4e2a-b8e9-d607220b44f7","Type":"ContainerDied","Data":"7c5e3ddc85e2a694aa4b6f5091893d6172a7cc9cc61ecbf5665992a51d9cf392"}
Mar 20 16:18:04 crc kubenswrapper[4730]: I0320 16:18:04.418522    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567058-dbb42"
Mar 20 16:18:04 crc kubenswrapper[4730]: I0320 16:18:04.487750    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hh67\" (UniqueName: \"kubernetes.io/projected/3e26c2d2-d860-4e2a-b8e9-d607220b44f7-kube-api-access-5hh67\") pod \"3e26c2d2-d860-4e2a-b8e9-d607220b44f7\" (UID: \"3e26c2d2-d860-4e2a-b8e9-d607220b44f7\") "
Mar 20 16:18:04 crc kubenswrapper[4730]: I0320 16:18:04.493518    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e26c2d2-d860-4e2a-b8e9-d607220b44f7-kube-api-access-5hh67" (OuterVolumeSpecName: "kube-api-access-5hh67") pod "3e26c2d2-d860-4e2a-b8e9-d607220b44f7" (UID: "3e26c2d2-d860-4e2a-b8e9-d607220b44f7"). InnerVolumeSpecName "kube-api-access-5hh67". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:18:04 crc kubenswrapper[4730]: I0320 16:18:04.592090    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hh67\" (UniqueName: \"kubernetes.io/projected/3e26c2d2-d860-4e2a-b8e9-d607220b44f7-kube-api-access-5hh67\") on node \"crc\" DevicePath \"\""
Mar 20 16:18:05 crc kubenswrapper[4730]: I0320 16:18:05.099239    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567058-dbb42" event={"ID":"3e26c2d2-d860-4e2a-b8e9-d607220b44f7","Type":"ContainerDied","Data":"87c535f2369abdc09480d5585fa4bc3a0c172bc32e1112ce544c29239d4f5deb"}
Mar 20 16:18:05 crc kubenswrapper[4730]: I0320 16:18:05.099309    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87c535f2369abdc09480d5585fa4bc3a0c172bc32e1112ce544c29239d4f5deb"
Mar 20 16:18:05 crc kubenswrapper[4730]: I0320 16:18:05.099376    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567058-dbb42"
Mar 20 16:18:05 crc kubenswrapper[4730]: E0320 16:18:05.200025    4730 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e26c2d2_d860_4e2a_b8e9_d607220b44f7.slice\": RecentStats: unable to find data in memory cache]"
Mar 20 16:18:05 crc kubenswrapper[4730]: I0320 16:18:05.484653    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567052-fb4zl"]
Mar 20 16:18:05 crc kubenswrapper[4730]: I0320 16:18:05.494281    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567052-fb4zl"]
Mar 20 16:18:05 crc kubenswrapper[4730]: I0320 16:18:05.546555    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28beb66f-2a64-4bcf-94eb-676ef7f1236a" path="/var/lib/kubelet/pods/28beb66f-2a64-4bcf-94eb-676ef7f1236a/volumes"
Mar 20 16:18:49 crc kubenswrapper[4730]: I0320 16:18:49.376284    4730 scope.go:117] "RemoveContainer" containerID="09bf1a5b6b98230c97ec660d74eb6fc018c3a7b8b7105355e719108bd3861003"
Mar 20 16:19:42 crc kubenswrapper[4730]: I0320 16:19:42.880345    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:19:42 crc kubenswrapper[4730]: I0320 16:19:42.881172    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:20:00 crc kubenswrapper[4730]: I0320 16:20:00.139772    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567060-dbzpk"]
Mar 20 16:20:00 crc kubenswrapper[4730]: E0320 16:20:00.140696    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e26c2d2-d860-4e2a-b8e9-d607220b44f7" containerName="oc"
Mar 20 16:20:00 crc kubenswrapper[4730]: I0320 16:20:00.140707    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e26c2d2-d860-4e2a-b8e9-d607220b44f7" containerName="oc"
Mar 20 16:20:00 crc kubenswrapper[4730]: I0320 16:20:00.140913    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e26c2d2-d860-4e2a-b8e9-d607220b44f7" containerName="oc"
Mar 20 16:20:00 crc kubenswrapper[4730]: I0320 16:20:00.141625    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567060-dbzpk"
Mar 20 16:20:00 crc kubenswrapper[4730]: I0320 16:20:00.144055    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:20:00 crc kubenswrapper[4730]: I0320 16:20:00.144613    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:20:00 crc kubenswrapper[4730]: I0320 16:20:00.145052    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 16:20:00 crc kubenswrapper[4730]: I0320 16:20:00.151144    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567060-dbzpk"]
Mar 20 16:20:00 crc kubenswrapper[4730]: I0320 16:20:00.275797    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz44r\" (UniqueName: \"kubernetes.io/projected/95b03afb-153e-4f6b-a88a-e64d8a889b97-kube-api-access-sz44r\") pod \"auto-csr-approver-29567060-dbzpk\" (UID: \"95b03afb-153e-4f6b-a88a-e64d8a889b97\") " pod="openshift-infra/auto-csr-approver-29567060-dbzpk"
Mar 20 16:20:00 crc kubenswrapper[4730]: I0320 16:20:00.377770    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz44r\" (UniqueName: \"kubernetes.io/projected/95b03afb-153e-4f6b-a88a-e64d8a889b97-kube-api-access-sz44r\") pod \"auto-csr-approver-29567060-dbzpk\" (UID: \"95b03afb-153e-4f6b-a88a-e64d8a889b97\") " pod="openshift-infra/auto-csr-approver-29567060-dbzpk"
Mar 20 16:20:00 crc kubenswrapper[4730]: I0320 16:20:00.397035    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz44r\" (UniqueName: \"kubernetes.io/projected/95b03afb-153e-4f6b-a88a-e64d8a889b97-kube-api-access-sz44r\") pod \"auto-csr-approver-29567060-dbzpk\" (UID: \"95b03afb-153e-4f6b-a88a-e64d8a889b97\") " pod="openshift-infra/auto-csr-approver-29567060-dbzpk"
Mar 20 16:20:00 crc kubenswrapper[4730]: I0320 16:20:00.473427    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567060-dbzpk"
Mar 20 16:20:00 crc kubenswrapper[4730]: I0320 16:20:00.913695    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567060-dbzpk"]
Mar 20 16:20:01 crc kubenswrapper[4730]: I0320 16:20:01.209019    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567060-dbzpk" event={"ID":"95b03afb-153e-4f6b-a88a-e64d8a889b97","Type":"ContainerStarted","Data":"0569d41e4f19d35f7cf3bc7363d39efefe7685a284238a7673c21960e8372c4d"}
Mar 20 16:20:03 crc kubenswrapper[4730]: I0320 16:20:03.231926    4730 generic.go:334] "Generic (PLEG): container finished" podID="95b03afb-153e-4f6b-a88a-e64d8a889b97" containerID="fea91f3e29a9829ef950c2b8d1b25f02c8cfb65d5e94d7557e912ffa33559bad" exitCode=0
Mar 20 16:20:03 crc kubenswrapper[4730]: I0320 16:20:03.232107    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567060-dbzpk" event={"ID":"95b03afb-153e-4f6b-a88a-e64d8a889b97","Type":"ContainerDied","Data":"fea91f3e29a9829ef950c2b8d1b25f02c8cfb65d5e94d7557e912ffa33559bad"}
Mar 20 16:20:04 crc kubenswrapper[4730]: I0320 16:20:04.548686    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567060-dbzpk"
Mar 20 16:20:04 crc kubenswrapper[4730]: I0320 16:20:04.668139    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz44r\" (UniqueName: \"kubernetes.io/projected/95b03afb-153e-4f6b-a88a-e64d8a889b97-kube-api-access-sz44r\") pod \"95b03afb-153e-4f6b-a88a-e64d8a889b97\" (UID: \"95b03afb-153e-4f6b-a88a-e64d8a889b97\") "
Mar 20 16:20:04 crc kubenswrapper[4730]: I0320 16:20:04.677481    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95b03afb-153e-4f6b-a88a-e64d8a889b97-kube-api-access-sz44r" (OuterVolumeSpecName: "kube-api-access-sz44r") pod "95b03afb-153e-4f6b-a88a-e64d8a889b97" (UID: "95b03afb-153e-4f6b-a88a-e64d8a889b97"). InnerVolumeSpecName "kube-api-access-sz44r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:20:04 crc kubenswrapper[4730]: I0320 16:20:04.770196    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sz44r\" (UniqueName: \"kubernetes.io/projected/95b03afb-153e-4f6b-a88a-e64d8a889b97-kube-api-access-sz44r\") on node \"crc\" DevicePath \"\""
Mar 20 16:20:05 crc kubenswrapper[4730]: I0320 16:20:05.254773    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567060-dbzpk" event={"ID":"95b03afb-153e-4f6b-a88a-e64d8a889b97","Type":"ContainerDied","Data":"0569d41e4f19d35f7cf3bc7363d39efefe7685a284238a7673c21960e8372c4d"}
Mar 20 16:20:05 crc kubenswrapper[4730]: I0320 16:20:05.254845    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0569d41e4f19d35f7cf3bc7363d39efefe7685a284238a7673c21960e8372c4d"
Mar 20 16:20:05 crc kubenswrapper[4730]: I0320 16:20:05.254893    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567060-dbzpk"
Mar 20 16:20:05 crc kubenswrapper[4730]: I0320 16:20:05.621438    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567054-glf2f"]
Mar 20 16:20:05 crc kubenswrapper[4730]: I0320 16:20:05.630674    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567054-glf2f"]
Mar 20 16:20:07 crc kubenswrapper[4730]: I0320 16:20:07.545650    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d62c2430-2f2b-49f0-848a-015a72d04090" path="/var/lib/kubelet/pods/d62c2430-2f2b-49f0-848a-015a72d04090/volumes"
Mar 20 16:20:12 crc kubenswrapper[4730]: I0320 16:20:12.879774    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:20:12 crc kubenswrapper[4730]: I0320 16:20:12.880417    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:20:42 crc kubenswrapper[4730]: I0320 16:20:42.880300    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:20:42 crc kubenswrapper[4730]: I0320 16:20:42.880800    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:20:42 crc kubenswrapper[4730]: I0320 16:20:42.880836    4730 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf"
Mar 20 16:20:42 crc kubenswrapper[4730]: I0320 16:20:42.881520    4730 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7c543dcd07b51b6540df53377dd2e3fd92636df6153928eaa86c229ad0e8e213"} pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 16:20:42 crc kubenswrapper[4730]: I0320 16:20:42.881561    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" containerID="cri-o://7c543dcd07b51b6540df53377dd2e3fd92636df6153928eaa86c229ad0e8e213" gracePeriod=600
Mar 20 16:20:43 crc kubenswrapper[4730]: E0320 16:20:43.012319    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:20:43 crc kubenswrapper[4730]: I0320 16:20:43.652322    4730 generic.go:334] "Generic (PLEG): container finished" podID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerID="7c543dcd07b51b6540df53377dd2e3fd92636df6153928eaa86c229ad0e8e213" exitCode=0
Mar 20 16:20:43 crc kubenswrapper[4730]: I0320 16:20:43.652398    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerDied","Data":"7c543dcd07b51b6540df53377dd2e3fd92636df6153928eaa86c229ad0e8e213"}
Mar 20 16:20:43 crc kubenswrapper[4730]: I0320 16:20:43.652866    4730 scope.go:117] "RemoveContainer" containerID="b7dfce64faa161154e6afb28ae3e7685cc4caead60043a4bb51030ee7d13fb31"
Mar 20 16:20:43 crc kubenswrapper[4730]: I0320 16:20:43.653577    4730 scope.go:117] "RemoveContainer" containerID="7c543dcd07b51b6540df53377dd2e3fd92636df6153928eaa86c229ad0e8e213"
Mar 20 16:20:43 crc kubenswrapper[4730]: E0320 16:20:43.654053    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:20:49 crc kubenswrapper[4730]: I0320 16:20:49.495645    4730 scope.go:117] "RemoveContainer" containerID="88155d5b3d3f84b9a79ccb85b9d478d5415c51d7944f4bb78a548434ba4fb653"
Mar 20 16:20:57 crc kubenswrapper[4730]: I0320 16:20:57.534178    4730 scope.go:117] "RemoveContainer" containerID="7c543dcd07b51b6540df53377dd2e3fd92636df6153928eaa86c229ad0e8e213"
Mar 20 16:20:57 crc kubenswrapper[4730]: E0320 16:20:57.534920    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:21:09 crc kubenswrapper[4730]: I0320 16:21:09.913789    4730 generic.go:334] "Generic (PLEG): container finished" podID="43d453db-c8fb-438d-927e-6eaee8383df1" containerID="376af442bf9ebfb7a32ce79a2f7ed1f7e468476d5ad13d8d73708b12ea13c232" exitCode=0
Mar 20 16:21:09 crc kubenswrapper[4730]: I0320 16:21:09.913868    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj" event={"ID":"43d453db-c8fb-438d-927e-6eaee8383df1","Type":"ContainerDied","Data":"376af442bf9ebfb7a32ce79a2f7ed1f7e468476d5ad13d8d73708b12ea13c232"}
Mar 20 16:21:11 crc kubenswrapper[4730]: I0320 16:21:11.323557    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj"
Mar 20 16:21:11 crc kubenswrapper[4730]: I0320 16:21:11.405433    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43d453db-c8fb-438d-927e-6eaee8383df1-ssh-key-openstack-edpm-ipam\") pod \"43d453db-c8fb-438d-927e-6eaee8383df1\" (UID: \"43d453db-c8fb-438d-927e-6eaee8383df1\") "
Mar 20 16:21:11 crc kubenswrapper[4730]: I0320 16:21:11.405534    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43d453db-c8fb-438d-927e-6eaee8383df1-inventory\") pod \"43d453db-c8fb-438d-927e-6eaee8383df1\" (UID: \"43d453db-c8fb-438d-927e-6eaee8383df1\") "
Mar 20 16:21:11 crc kubenswrapper[4730]: I0320 16:21:11.405564    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43d453db-c8fb-438d-927e-6eaee8383df1-libvirt-combined-ca-bundle\") pod \"43d453db-c8fb-438d-927e-6eaee8383df1\" (UID: \"43d453db-c8fb-438d-927e-6eaee8383df1\") "
Mar 20 16:21:11 crc kubenswrapper[4730]: I0320 16:21:11.405630    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nx8mb\" (UniqueName: \"kubernetes.io/projected/43d453db-c8fb-438d-927e-6eaee8383df1-kube-api-access-nx8mb\") pod \"43d453db-c8fb-438d-927e-6eaee8383df1\" (UID: \"43d453db-c8fb-438d-927e-6eaee8383df1\") "
Mar 20 16:21:11 crc kubenswrapper[4730]: I0320 16:21:11.405657    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/43d453db-c8fb-438d-927e-6eaee8383df1-libvirt-secret-0\") pod \"43d453db-c8fb-438d-927e-6eaee8383df1\" (UID: \"43d453db-c8fb-438d-927e-6eaee8383df1\") "
Mar 20 16:21:11 crc kubenswrapper[4730]: I0320 16:21:11.411050    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43d453db-c8fb-438d-927e-6eaee8383df1-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "43d453db-c8fb-438d-927e-6eaee8383df1" (UID: "43d453db-c8fb-438d-927e-6eaee8383df1"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:21:11 crc kubenswrapper[4730]: I0320 16:21:11.411261    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43d453db-c8fb-438d-927e-6eaee8383df1-kube-api-access-nx8mb" (OuterVolumeSpecName: "kube-api-access-nx8mb") pod "43d453db-c8fb-438d-927e-6eaee8383df1" (UID: "43d453db-c8fb-438d-927e-6eaee8383df1"). InnerVolumeSpecName "kube-api-access-nx8mb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:21:11 crc kubenswrapper[4730]: I0320 16:21:11.434785    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43d453db-c8fb-438d-927e-6eaee8383df1-inventory" (OuterVolumeSpecName: "inventory") pod "43d453db-c8fb-438d-927e-6eaee8383df1" (UID: "43d453db-c8fb-438d-927e-6eaee8383df1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:21:11 crc kubenswrapper[4730]: I0320 16:21:11.435192    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43d453db-c8fb-438d-927e-6eaee8383df1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "43d453db-c8fb-438d-927e-6eaee8383df1" (UID: "43d453db-c8fb-438d-927e-6eaee8383df1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:21:11 crc kubenswrapper[4730]: I0320 16:21:11.439011    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43d453db-c8fb-438d-927e-6eaee8383df1-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "43d453db-c8fb-438d-927e-6eaee8383df1" (UID: "43d453db-c8fb-438d-927e-6eaee8383df1"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:21:11 crc kubenswrapper[4730]: I0320 16:21:11.508298    4730 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43d453db-c8fb-438d-927e-6eaee8383df1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:11 crc kubenswrapper[4730]: I0320 16:21:11.508329    4730 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43d453db-c8fb-438d-927e-6eaee8383df1-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:11 crc kubenswrapper[4730]: I0320 16:21:11.508340    4730 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43d453db-c8fb-438d-927e-6eaee8383df1-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:11 crc kubenswrapper[4730]: I0320 16:21:11.508349    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nx8mb\" (UniqueName: \"kubernetes.io/projected/43d453db-c8fb-438d-927e-6eaee8383df1-kube-api-access-nx8mb\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:11 crc kubenswrapper[4730]: I0320 16:21:11.508362    4730 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/43d453db-c8fb-438d-927e-6eaee8383df1-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Mar 20 16:21:11 crc kubenswrapper[4730]: I0320 16:21:11.541442    4730 scope.go:117] "RemoveContainer" containerID="7c543dcd07b51b6540df53377dd2e3fd92636df6153928eaa86c229ad0e8e213"
Mar 20 16:21:11 crc kubenswrapper[4730]: E0320 16:21:11.541738    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:21:11 crc kubenswrapper[4730]: I0320 16:21:11.935758    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj" event={"ID":"43d453db-c8fb-438d-927e-6eaee8383df1","Type":"ContainerDied","Data":"4e9ce008b0379b201268b642fefda0155599bddcd66e572dcdc376e727bf72f7"}
Mar 20 16:21:11 crc kubenswrapper[4730]: I0320 16:21:11.935804    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e9ce008b0379b201268b642fefda0155599bddcd66e572dcdc376e727bf72f7"
Mar 20 16:21:11 crc kubenswrapper[4730]: I0320 16:21:11.935819    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.025154    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"]
Mar 20 16:21:12 crc kubenswrapper[4730]: E0320 16:21:12.025655    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95b03afb-153e-4f6b-a88a-e64d8a889b97" containerName="oc"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.025678    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="95b03afb-153e-4f6b-a88a-e64d8a889b97" containerName="oc"
Mar 20 16:21:12 crc kubenswrapper[4730]: E0320 16:21:12.025703    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43d453db-c8fb-438d-927e-6eaee8383df1" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.025712    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="43d453db-c8fb-438d-927e-6eaee8383df1" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.025960    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="95b03afb-153e-4f6b-a88a-e64d8a889b97" containerName="oc"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.026003    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="43d453db-c8fb-438d-927e-6eaee8383df1" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.026903    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.029172    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.030287    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.030524    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.030697    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.030765    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vvsxx"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.030845    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.031401    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.051136    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"]
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.123232    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.123325    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkl8b\" (UniqueName: \"kubernetes.io/projected/6ffb462f-06f9-49df-bfe7-d41c274d4b05-kube-api-access-fkl8b\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.123430    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.123471    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.123496    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.123551    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.123737    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.123780    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.123946    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.124000    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.124081    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.225765    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.225901    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.225931    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkl8b\" (UniqueName: \"kubernetes.io/projected/6ffb462f-06f9-49df-bfe7-d41c274d4b05-kube-api-access-fkl8b\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.226512    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.226534    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.226556    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.226593    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.226709    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.226736    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.226841    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.226880    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.230083    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.230863    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.243300    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.255136    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.256048    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.256212    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.256738    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.264163    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.269657    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.270739    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkl8b\" (UniqueName: \"kubernetes.io/projected/6ffb462f-06f9-49df-bfe7-d41c274d4b05-kube-api-access-fkl8b\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.274159    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-x4t58\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.345325    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.952229    4730 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 16:21:12 crc kubenswrapper[4730]: I0320 16:21:12.954757    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"]
Mar 20 16:21:13 crc kubenswrapper[4730]: I0320 16:21:13.970533    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58" event={"ID":"6ffb462f-06f9-49df-bfe7-d41c274d4b05","Type":"ContainerStarted","Data":"08be8b8dbbd2047f288f6d426869c884bfb40bb5064e00dce56c6a11bd7559c8"}
Mar 20 16:21:13 crc kubenswrapper[4730]: I0320 16:21:13.971097    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58" event={"ID":"6ffb462f-06f9-49df-bfe7-d41c274d4b05","Type":"ContainerStarted","Data":"f7754fd399f7dcad362df6e337682fddcbb9031ebb6354f3687e1026101703e4"}
Mar 20 16:21:13 crc kubenswrapper[4730]: I0320 16:21:13.992466    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58" podStartSLOduration=1.5063335740000001 podStartE2EDuration="1.992445436s" podCreationTimestamp="2026-03-20 16:21:12 +0000 UTC" firstStartedPulling="2026-03-20 16:21:12.951951254 +0000 UTC m=+2532.165322623" lastFinishedPulling="2026-03-20 16:21:13.438063116 +0000 UTC m=+2532.651434485" observedRunningTime="2026-03-20 16:21:13.990762899 +0000 UTC m=+2533.204134268" watchObservedRunningTime="2026-03-20 16:21:13.992445436 +0000 UTC m=+2533.205816805"
Mar 20 16:21:25 crc kubenswrapper[4730]: I0320 16:21:25.533572    4730 scope.go:117] "RemoveContainer" containerID="7c543dcd07b51b6540df53377dd2e3fd92636df6153928eaa86c229ad0e8e213"
Mar 20 16:21:25 crc kubenswrapper[4730]: E0320 16:21:25.534185    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:21:39 crc kubenswrapper[4730]: I0320 16:21:39.533638    4730 scope.go:117] "RemoveContainer" containerID="7c543dcd07b51b6540df53377dd2e3fd92636df6153928eaa86c229ad0e8e213"
Mar 20 16:21:39 crc kubenswrapper[4730]: E0320 16:21:39.534582    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:21:51 crc kubenswrapper[4730]: I0320 16:21:51.539988    4730 scope.go:117] "RemoveContainer" containerID="7c543dcd07b51b6540df53377dd2e3fd92636df6153928eaa86c229ad0e8e213"
Mar 20 16:21:51 crc kubenswrapper[4730]: E0320 16:21:51.541011    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:22:00 crc kubenswrapper[4730]: I0320 16:22:00.146604    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567062-58ww8"]
Mar 20 16:22:00 crc kubenswrapper[4730]: I0320 16:22:00.149444    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567062-58ww8"
Mar 20 16:22:00 crc kubenswrapper[4730]: I0320 16:22:00.151697    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:22:00 crc kubenswrapper[4730]: I0320 16:22:00.155101    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 16:22:00 crc kubenswrapper[4730]: I0320 16:22:00.158375    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567062-58ww8"]
Mar 20 16:22:00 crc kubenswrapper[4730]: I0320 16:22:00.158748    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:22:00 crc kubenswrapper[4730]: I0320 16:22:00.202045    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpccc\" (UniqueName: \"kubernetes.io/projected/e053a518-d8c9-42a1-9c8d-83d5fec8de8c-kube-api-access-fpccc\") pod \"auto-csr-approver-29567062-58ww8\" (UID: \"e053a518-d8c9-42a1-9c8d-83d5fec8de8c\") " pod="openshift-infra/auto-csr-approver-29567062-58ww8"
Mar 20 16:22:00 crc kubenswrapper[4730]: I0320 16:22:00.304052    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpccc\" (UniqueName: \"kubernetes.io/projected/e053a518-d8c9-42a1-9c8d-83d5fec8de8c-kube-api-access-fpccc\") pod \"auto-csr-approver-29567062-58ww8\" (UID: \"e053a518-d8c9-42a1-9c8d-83d5fec8de8c\") " pod="openshift-infra/auto-csr-approver-29567062-58ww8"
Mar 20 16:22:00 crc kubenswrapper[4730]: I0320 16:22:00.325424    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpccc\" (UniqueName: \"kubernetes.io/projected/e053a518-d8c9-42a1-9c8d-83d5fec8de8c-kube-api-access-fpccc\") pod \"auto-csr-approver-29567062-58ww8\" (UID: \"e053a518-d8c9-42a1-9c8d-83d5fec8de8c\") " pod="openshift-infra/auto-csr-approver-29567062-58ww8"
Mar 20 16:22:00 crc kubenswrapper[4730]: I0320 16:22:00.474600    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567062-58ww8"
Mar 20 16:22:00 crc kubenswrapper[4730]: I0320 16:22:00.937905    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567062-58ww8"]
Mar 20 16:22:01 crc kubenswrapper[4730]: I0320 16:22:01.391798    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567062-58ww8" event={"ID":"e053a518-d8c9-42a1-9c8d-83d5fec8de8c","Type":"ContainerStarted","Data":"786f74eda8c049abd12af2c2ff6876a35e1dc8bbdf0c4f7dda9bf7bb8d5cecc3"}
Mar 20 16:22:03 crc kubenswrapper[4730]: I0320 16:22:03.422633    4730 generic.go:334] "Generic (PLEG): container finished" podID="e053a518-d8c9-42a1-9c8d-83d5fec8de8c" containerID="0ccfc2a0baaac1ea53dda2b6020b62a6460f7d7641aebee239ddb879e9c99ccb" exitCode=0
Mar 20 16:22:03 crc kubenswrapper[4730]: I0320 16:22:03.422719    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567062-58ww8" event={"ID":"e053a518-d8c9-42a1-9c8d-83d5fec8de8c","Type":"ContainerDied","Data":"0ccfc2a0baaac1ea53dda2b6020b62a6460f7d7641aebee239ddb879e9c99ccb"}
Mar 20 16:22:03 crc kubenswrapper[4730]: I0320 16:22:03.533853    4730 scope.go:117] "RemoveContainer" containerID="7c543dcd07b51b6540df53377dd2e3fd92636df6153928eaa86c229ad0e8e213"
Mar 20 16:22:03 crc kubenswrapper[4730]: E0320 16:22:03.534595    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:22:04 crc kubenswrapper[4730]: I0320 16:22:04.878380    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567062-58ww8"
Mar 20 16:22:05 crc kubenswrapper[4730]: I0320 16:22:05.013798    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpccc\" (UniqueName: \"kubernetes.io/projected/e053a518-d8c9-42a1-9c8d-83d5fec8de8c-kube-api-access-fpccc\") pod \"e053a518-d8c9-42a1-9c8d-83d5fec8de8c\" (UID: \"e053a518-d8c9-42a1-9c8d-83d5fec8de8c\") "
Mar 20 16:22:05 crc kubenswrapper[4730]: I0320 16:22:05.027888    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e053a518-d8c9-42a1-9c8d-83d5fec8de8c-kube-api-access-fpccc" (OuterVolumeSpecName: "kube-api-access-fpccc") pod "e053a518-d8c9-42a1-9c8d-83d5fec8de8c" (UID: "e053a518-d8c9-42a1-9c8d-83d5fec8de8c"). InnerVolumeSpecName "kube-api-access-fpccc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:22:05 crc kubenswrapper[4730]: I0320 16:22:05.116626    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpccc\" (UniqueName: \"kubernetes.io/projected/e053a518-d8c9-42a1-9c8d-83d5fec8de8c-kube-api-access-fpccc\") on node \"crc\" DevicePath \"\""
Mar 20 16:22:05 crc kubenswrapper[4730]: I0320 16:22:05.440156    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567062-58ww8" event={"ID":"e053a518-d8c9-42a1-9c8d-83d5fec8de8c","Type":"ContainerDied","Data":"786f74eda8c049abd12af2c2ff6876a35e1dc8bbdf0c4f7dda9bf7bb8d5cecc3"}
Mar 20 16:22:05 crc kubenswrapper[4730]: I0320 16:22:05.440202    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="786f74eda8c049abd12af2c2ff6876a35e1dc8bbdf0c4f7dda9bf7bb8d5cecc3"
Mar 20 16:22:05 crc kubenswrapper[4730]: I0320 16:22:05.440232    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567062-58ww8"
Mar 20 16:22:05 crc kubenswrapper[4730]: I0320 16:22:05.951320    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567056-5hbgw"]
Mar 20 16:22:05 crc kubenswrapper[4730]: I0320 16:22:05.961043    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567056-5hbgw"]
Mar 20 16:22:07 crc kubenswrapper[4730]: I0320 16:22:07.545232    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="331a4cf6-7d2c-4540-9686-064f27fee0cc" path="/var/lib/kubelet/pods/331a4cf6-7d2c-4540-9686-064f27fee0cc/volumes"
Mar 20 16:22:15 crc kubenswrapper[4730]: I0320 16:22:15.534155    4730 scope.go:117] "RemoveContainer" containerID="7c543dcd07b51b6540df53377dd2e3fd92636df6153928eaa86c229ad0e8e213"
Mar 20 16:22:15 crc kubenswrapper[4730]: E0320 16:22:15.535137    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:22:27 crc kubenswrapper[4730]: I0320 16:22:27.533758    4730 scope.go:117] "RemoveContainer" containerID="7c543dcd07b51b6540df53377dd2e3fd92636df6153928eaa86c229ad0e8e213"
Mar 20 16:22:27 crc kubenswrapper[4730]: E0320 16:22:27.534538    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:22:41 crc kubenswrapper[4730]: I0320 16:22:41.538894    4730 scope.go:117] "RemoveContainer" containerID="7c543dcd07b51b6540df53377dd2e3fd92636df6153928eaa86c229ad0e8e213"
Mar 20 16:22:41 crc kubenswrapper[4730]: E0320 16:22:41.539739    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:22:49 crc kubenswrapper[4730]: I0320 16:22:49.615204    4730 scope.go:117] "RemoveContainer" containerID="a7f958f4919aca64c652d826ae81971e8c020ebd62d216c39f71ac76d6e91ef4"
Mar 20 16:22:52 crc kubenswrapper[4730]: I0320 16:22:52.533291    4730 scope.go:117] "RemoveContainer" containerID="7c543dcd07b51b6540df53377dd2e3fd92636df6153928eaa86c229ad0e8e213"
Mar 20 16:22:52 crc kubenswrapper[4730]: E0320 16:22:52.533811    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:23:06 crc kubenswrapper[4730]: I0320 16:23:06.534625    4730 scope.go:117] "RemoveContainer" containerID="7c543dcd07b51b6540df53377dd2e3fd92636df6153928eaa86c229ad0e8e213"
Mar 20 16:23:06 crc kubenswrapper[4730]: E0320 16:23:06.535838    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:23:17 crc kubenswrapper[4730]: I0320 16:23:17.533882    4730 scope.go:117] "RemoveContainer" containerID="7c543dcd07b51b6540df53377dd2e3fd92636df6153928eaa86c229ad0e8e213"
Mar 20 16:23:17 crc kubenswrapper[4730]: E0320 16:23:17.534705    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:23:29 crc kubenswrapper[4730]: I0320 16:23:29.532927    4730 scope.go:117] "RemoveContainer" containerID="7c543dcd07b51b6540df53377dd2e3fd92636df6153928eaa86c229ad0e8e213"
Mar 20 16:23:29 crc kubenswrapper[4730]: E0320 16:23:29.534846    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:23:36 crc kubenswrapper[4730]: I0320 16:23:36.291848    4730 generic.go:334] "Generic (PLEG): container finished" podID="6ffb462f-06f9-49df-bfe7-d41c274d4b05" containerID="08be8b8dbbd2047f288f6d426869c884bfb40bb5064e00dce56c6a11bd7559c8" exitCode=0
Mar 20 16:23:36 crc kubenswrapper[4730]: I0320 16:23:36.291988    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58" event={"ID":"6ffb462f-06f9-49df-bfe7-d41c274d4b05","Type":"ContainerDied","Data":"08be8b8dbbd2047f288f6d426869c884bfb40bb5064e00dce56c6a11bd7559c8"}
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.715665    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.790736    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-extra-config-0\") pod \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") "
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.790814    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-migration-ssh-key-1\") pod \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") "
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.790895    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkl8b\" (UniqueName: \"kubernetes.io/projected/6ffb462f-06f9-49df-bfe7-d41c274d4b05-kube-api-access-fkl8b\") pod \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") "
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.790965    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-ssh-key-openstack-edpm-ipam\") pod \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") "
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.791122    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-cell1-compute-config-2\") pod \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") "
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.791168    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-migration-ssh-key-0\") pod \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") "
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.791190    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-cell1-compute-config-1\") pod \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") "
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.791218    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-combined-ca-bundle\") pod \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") "
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.791301    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-cell1-compute-config-0\") pod \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") "
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.791324    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-cell1-compute-config-3\") pod \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") "
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.791382    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-inventory\") pod \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\" (UID: \"6ffb462f-06f9-49df-bfe7-d41c274d4b05\") "
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.797354    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ffb462f-06f9-49df-bfe7-d41c274d4b05-kube-api-access-fkl8b" (OuterVolumeSpecName: "kube-api-access-fkl8b") pod "6ffb462f-06f9-49df-bfe7-d41c274d4b05" (UID: "6ffb462f-06f9-49df-bfe7-d41c274d4b05"). InnerVolumeSpecName "kube-api-access-fkl8b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.797463    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "6ffb462f-06f9-49df-bfe7-d41c274d4b05" (UID: "6ffb462f-06f9-49df-bfe7-d41c274d4b05"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.816766    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "6ffb462f-06f9-49df-bfe7-d41c274d4b05" (UID: "6ffb462f-06f9-49df-bfe7-d41c274d4b05"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.822146    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-inventory" (OuterVolumeSpecName: "inventory") pod "6ffb462f-06f9-49df-bfe7-d41c274d4b05" (UID: "6ffb462f-06f9-49df-bfe7-d41c274d4b05"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.823029    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6ffb462f-06f9-49df-bfe7-d41c274d4b05" (UID: "6ffb462f-06f9-49df-bfe7-d41c274d4b05"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.823213    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "6ffb462f-06f9-49df-bfe7-d41c274d4b05" (UID: "6ffb462f-06f9-49df-bfe7-d41c274d4b05"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.824389    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "6ffb462f-06f9-49df-bfe7-d41c274d4b05" (UID: "6ffb462f-06f9-49df-bfe7-d41c274d4b05"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.825475    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "6ffb462f-06f9-49df-bfe7-d41c274d4b05" (UID: "6ffb462f-06f9-49df-bfe7-d41c274d4b05"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.825924    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "6ffb462f-06f9-49df-bfe7-d41c274d4b05" (UID: "6ffb462f-06f9-49df-bfe7-d41c274d4b05"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.827894    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "6ffb462f-06f9-49df-bfe7-d41c274d4b05" (UID: "6ffb462f-06f9-49df-bfe7-d41c274d4b05"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.831635    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "6ffb462f-06f9-49df-bfe7-d41c274d4b05" (UID: "6ffb462f-06f9-49df-bfe7-d41c274d4b05"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.894797    4730 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.894837    4730 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-extra-config-0\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.894847    4730 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.894856    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkl8b\" (UniqueName: \"kubernetes.io/projected/6ffb462f-06f9-49df-bfe7-d41c274d4b05-kube-api-access-fkl8b\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.894866    4730 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.894875    4730 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.894891    4730 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.894899    4730 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.894907    4730 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.894917    4730 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:37 crc kubenswrapper[4730]: I0320 16:23:37.894933    4730 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/6ffb462f-06f9-49df-bfe7-d41c274d4b05-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\""
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.309214    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58" event={"ID":"6ffb462f-06f9-49df-bfe7-d41c274d4b05","Type":"ContainerDied","Data":"f7754fd399f7dcad362df6e337682fddcbb9031ebb6354f3687e1026101703e4"}
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.309279    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-x4t58"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.309301    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7754fd399f7dcad362df6e337682fddcbb9031ebb6354f3687e1026101703e4"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.425486    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq"]
Mar 20 16:23:38 crc kubenswrapper[4730]: E0320 16:23:38.425876    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e053a518-d8c9-42a1-9c8d-83d5fec8de8c" containerName="oc"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.425888    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="e053a518-d8c9-42a1-9c8d-83d5fec8de8c" containerName="oc"
Mar 20 16:23:38 crc kubenswrapper[4730]: E0320 16:23:38.425920    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ffb462f-06f9-49df-bfe7-d41c274d4b05" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.425928    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ffb462f-06f9-49df-bfe7-d41c274d4b05" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.426095    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ffb462f-06f9-49df-bfe7-d41c274d4b05" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.426122    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="e053a518-d8c9-42a1-9c8d-83d5fec8de8c" containerName="oc"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.426829    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.430345    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.430561    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.431367    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.431664    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.431750    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vvsxx"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.444872    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq"]
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.505463    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq\" (UID: \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.505774    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq\" (UID: \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.505966    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq\" (UID: \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.506076    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq\" (UID: \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.506237    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vccl6\" (UniqueName: \"kubernetes.io/projected/884c2fa6-babb-44b8-b8e2-3e4fbce27153-kube-api-access-vccl6\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq\" (UID: \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.506425    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq\" (UID: \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.506573    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq\" (UID: \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.608656    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq\" (UID: \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.608733    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq\" (UID: \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.608771    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq\" (UID: \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.608832    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vccl6\" (UniqueName: \"kubernetes.io/projected/884c2fa6-babb-44b8-b8e2-3e4fbce27153-kube-api-access-vccl6\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq\" (UID: \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.608895    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq\" (UID: \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.608949    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq\" (UID: \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.609070    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq\" (UID: \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.612328    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq\" (UID: \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.612680    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq\" (UID: \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.613448    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq\" (UID: \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.614158    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq\" (UID: \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.614747    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq\" (UID: \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.615483    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq\" (UID: \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.624824    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vccl6\" (UniqueName: \"kubernetes.io/projected/884c2fa6-babb-44b8-b8e2-3e4fbce27153-kube-api-access-vccl6\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq\" (UID: \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq"
Mar 20 16:23:38 crc kubenswrapper[4730]: I0320 16:23:38.744105    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq"
Mar 20 16:23:39 crc kubenswrapper[4730]: I0320 16:23:39.290932    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq"]
Mar 20 16:23:39 crc kubenswrapper[4730]: I0320 16:23:39.319362    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq" event={"ID":"884c2fa6-babb-44b8-b8e2-3e4fbce27153","Type":"ContainerStarted","Data":"7ab2683493c806bbc0a72a38ad816fd90029738a822e7670b5034e10e4d79b42"}
Mar 20 16:23:40 crc kubenswrapper[4730]: I0320 16:23:40.334362    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq" event={"ID":"884c2fa6-babb-44b8-b8e2-3e4fbce27153","Type":"ContainerStarted","Data":"4f23a2d6263e179fbb42b831d0fa59c0e1411d46b87becc3c39bd7883db89197"}
Mar 20 16:23:41 crc kubenswrapper[4730]: I0320 16:23:40.372727    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq" podStartSLOduration=1.936028289 podStartE2EDuration="2.372702113s" podCreationTimestamp="2026-03-20 16:23:38 +0000 UTC" firstStartedPulling="2026-03-20 16:23:39.299738891 +0000 UTC m=+2678.513110260" lastFinishedPulling="2026-03-20 16:23:39.736412705 +0000 UTC m=+2678.949784084" observedRunningTime="2026-03-20 16:23:40.362636597 +0000 UTC m=+2679.576007986" watchObservedRunningTime="2026-03-20 16:23:40.372702113 +0000 UTC m=+2679.586073482"
Mar 20 16:23:44 crc kubenswrapper[4730]: I0320 16:23:44.532692    4730 scope.go:117] "RemoveContainer" containerID="7c543dcd07b51b6540df53377dd2e3fd92636df6153928eaa86c229ad0e8e213"
Mar 20 16:23:44 crc kubenswrapper[4730]: E0320 16:23:44.533442    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:23:59 crc kubenswrapper[4730]: I0320 16:23:59.532925    4730 scope.go:117] "RemoveContainer" containerID="7c543dcd07b51b6540df53377dd2e3fd92636df6153928eaa86c229ad0e8e213"
Mar 20 16:23:59 crc kubenswrapper[4730]: E0320 16:23:59.533911    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:24:00 crc kubenswrapper[4730]: I0320 16:24:00.162519    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567064-sklm5"]
Mar 20 16:24:00 crc kubenswrapper[4730]: I0320 16:24:00.164516    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567064-sklm5"
Mar 20 16:24:00 crc kubenswrapper[4730]: I0320 16:24:00.167363    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:24:00 crc kubenswrapper[4730]: I0320 16:24:00.167620    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:24:00 crc kubenswrapper[4730]: I0320 16:24:00.167809    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 16:24:00 crc kubenswrapper[4730]: I0320 16:24:00.170834    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567064-sklm5"]
Mar 20 16:24:00 crc kubenswrapper[4730]: I0320 16:24:00.244620    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g8xj\" (UniqueName: \"kubernetes.io/projected/a36e403d-410f-40cc-8441-66c444837d24-kube-api-access-4g8xj\") pod \"auto-csr-approver-29567064-sklm5\" (UID: \"a36e403d-410f-40cc-8441-66c444837d24\") " pod="openshift-infra/auto-csr-approver-29567064-sklm5"
Mar 20 16:24:00 crc kubenswrapper[4730]: I0320 16:24:00.345989    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g8xj\" (UniqueName: \"kubernetes.io/projected/a36e403d-410f-40cc-8441-66c444837d24-kube-api-access-4g8xj\") pod \"auto-csr-approver-29567064-sklm5\" (UID: \"a36e403d-410f-40cc-8441-66c444837d24\") " pod="openshift-infra/auto-csr-approver-29567064-sklm5"
Mar 20 16:24:00 crc kubenswrapper[4730]: I0320 16:24:00.365316    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g8xj\" (UniqueName: \"kubernetes.io/projected/a36e403d-410f-40cc-8441-66c444837d24-kube-api-access-4g8xj\") pod \"auto-csr-approver-29567064-sklm5\" (UID: \"a36e403d-410f-40cc-8441-66c444837d24\") " pod="openshift-infra/auto-csr-approver-29567064-sklm5"
Mar 20 16:24:00 crc kubenswrapper[4730]: I0320 16:24:00.484024    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567064-sklm5"
Mar 20 16:24:00 crc kubenswrapper[4730]: I0320 16:24:00.955699    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567064-sklm5"]
Mar 20 16:24:00 crc kubenswrapper[4730]: W0320 16:24:00.959080    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda36e403d_410f_40cc_8441_66c444837d24.slice/crio-6e6dae407208c0df8a73bdfd1274fe27ac4f9dd3cff0bfcf4c1f538cba05d871 WatchSource:0}: Error finding container 6e6dae407208c0df8a73bdfd1274fe27ac4f9dd3cff0bfcf4c1f538cba05d871: Status 404 returned error can't find the container with id 6e6dae407208c0df8a73bdfd1274fe27ac4f9dd3cff0bfcf4c1f538cba05d871
Mar 20 16:24:01 crc kubenswrapper[4730]: I0320 16:24:01.548131    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567064-sklm5" event={"ID":"a36e403d-410f-40cc-8441-66c444837d24","Type":"ContainerStarted","Data":"6e6dae407208c0df8a73bdfd1274fe27ac4f9dd3cff0bfcf4c1f538cba05d871"}
Mar 20 16:24:02 crc kubenswrapper[4730]: I0320 16:24:02.547482    4730 generic.go:334] "Generic (PLEG): container finished" podID="a36e403d-410f-40cc-8441-66c444837d24" containerID="1718e89ccd737bfb9a3619c68d04fdd94aa68dd80d2bc675347f69c2cc40fd04" exitCode=0
Mar 20 16:24:02 crc kubenswrapper[4730]: I0320 16:24:02.547540    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567064-sklm5" event={"ID":"a36e403d-410f-40cc-8441-66c444837d24","Type":"ContainerDied","Data":"1718e89ccd737bfb9a3619c68d04fdd94aa68dd80d2bc675347f69c2cc40fd04"}
Mar 20 16:24:03 crc kubenswrapper[4730]: I0320 16:24:03.906448    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567064-sklm5"
Mar 20 16:24:03 crc kubenswrapper[4730]: I0320 16:24:03.929775    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g8xj\" (UniqueName: \"kubernetes.io/projected/a36e403d-410f-40cc-8441-66c444837d24-kube-api-access-4g8xj\") pod \"a36e403d-410f-40cc-8441-66c444837d24\" (UID: \"a36e403d-410f-40cc-8441-66c444837d24\") "
Mar 20 16:24:03 crc kubenswrapper[4730]: I0320 16:24:03.938585    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a36e403d-410f-40cc-8441-66c444837d24-kube-api-access-4g8xj" (OuterVolumeSpecName: "kube-api-access-4g8xj") pod "a36e403d-410f-40cc-8441-66c444837d24" (UID: "a36e403d-410f-40cc-8441-66c444837d24"). InnerVolumeSpecName "kube-api-access-4g8xj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:24:04 crc kubenswrapper[4730]: I0320 16:24:04.031209    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g8xj\" (UniqueName: \"kubernetes.io/projected/a36e403d-410f-40cc-8441-66c444837d24-kube-api-access-4g8xj\") on node \"crc\" DevicePath \"\""
Mar 20 16:24:04 crc kubenswrapper[4730]: I0320 16:24:04.573705    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567064-sklm5" event={"ID":"a36e403d-410f-40cc-8441-66c444837d24","Type":"ContainerDied","Data":"6e6dae407208c0df8a73bdfd1274fe27ac4f9dd3cff0bfcf4c1f538cba05d871"}
Mar 20 16:24:04 crc kubenswrapper[4730]: I0320 16:24:04.573752    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e6dae407208c0df8a73bdfd1274fe27ac4f9dd3cff0bfcf4c1f538cba05d871"
Mar 20 16:24:04 crc kubenswrapper[4730]: I0320 16:24:04.573825    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567064-sklm5"
Mar 20 16:24:04 crc kubenswrapper[4730]: I0320 16:24:04.977610    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567058-dbb42"]
Mar 20 16:24:04 crc kubenswrapper[4730]: I0320 16:24:04.988343    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567058-dbb42"]
Mar 20 16:24:05 crc kubenswrapper[4730]: I0320 16:24:05.547033    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e26c2d2-d860-4e2a-b8e9-d607220b44f7" path="/var/lib/kubelet/pods/3e26c2d2-d860-4e2a-b8e9-d607220b44f7/volumes"
Mar 20 16:24:13 crc kubenswrapper[4730]: I0320 16:24:13.533747    4730 scope.go:117] "RemoveContainer" containerID="7c543dcd07b51b6540df53377dd2e3fd92636df6153928eaa86c229ad0e8e213"
Mar 20 16:24:13 crc kubenswrapper[4730]: E0320 16:24:13.534528    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:24:27 crc kubenswrapper[4730]: I0320 16:24:27.533843    4730 scope.go:117] "RemoveContainer" containerID="7c543dcd07b51b6540df53377dd2e3fd92636df6153928eaa86c229ad0e8e213"
Mar 20 16:24:27 crc kubenswrapper[4730]: E0320 16:24:27.534730    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:24:37 crc kubenswrapper[4730]: I0320 16:24:37.091492    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gkg85"]
Mar 20 16:24:37 crc kubenswrapper[4730]: E0320 16:24:37.092607    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a36e403d-410f-40cc-8441-66c444837d24" containerName="oc"
Mar 20 16:24:37 crc kubenswrapper[4730]: I0320 16:24:37.092627    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="a36e403d-410f-40cc-8441-66c444837d24" containerName="oc"
Mar 20 16:24:37 crc kubenswrapper[4730]: I0320 16:24:37.092881    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="a36e403d-410f-40cc-8441-66c444837d24" containerName="oc"
Mar 20 16:24:37 crc kubenswrapper[4730]: I0320 16:24:37.094675    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gkg85"
Mar 20 16:24:37 crc kubenswrapper[4730]: I0320 16:24:37.101406    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gkg85"]
Mar 20 16:24:37 crc kubenswrapper[4730]: I0320 16:24:37.246882    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcclj\" (UniqueName: \"kubernetes.io/projected/99180a72-d38c-44e4-b866-691567a72781-kube-api-access-fcclj\") pod \"certified-operators-gkg85\" (UID: \"99180a72-d38c-44e4-b866-691567a72781\") " pod="openshift-marketplace/certified-operators-gkg85"
Mar 20 16:24:37 crc kubenswrapper[4730]: I0320 16:24:37.247312    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99180a72-d38c-44e4-b866-691567a72781-utilities\") pod \"certified-operators-gkg85\" (UID: \"99180a72-d38c-44e4-b866-691567a72781\") " pod="openshift-marketplace/certified-operators-gkg85"
Mar 20 16:24:37 crc kubenswrapper[4730]: I0320 16:24:37.247390    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99180a72-d38c-44e4-b866-691567a72781-catalog-content\") pod \"certified-operators-gkg85\" (UID: \"99180a72-d38c-44e4-b866-691567a72781\") " pod="openshift-marketplace/certified-operators-gkg85"
Mar 20 16:24:37 crc kubenswrapper[4730]: I0320 16:24:37.349000    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99180a72-d38c-44e4-b866-691567a72781-utilities\") pod \"certified-operators-gkg85\" (UID: \"99180a72-d38c-44e4-b866-691567a72781\") " pod="openshift-marketplace/certified-operators-gkg85"
Mar 20 16:24:37 crc kubenswrapper[4730]: I0320 16:24:37.349044    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99180a72-d38c-44e4-b866-691567a72781-catalog-content\") pod \"certified-operators-gkg85\" (UID: \"99180a72-d38c-44e4-b866-691567a72781\") " pod="openshift-marketplace/certified-operators-gkg85"
Mar 20 16:24:37 crc kubenswrapper[4730]: I0320 16:24:37.349124    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcclj\" (UniqueName: \"kubernetes.io/projected/99180a72-d38c-44e4-b866-691567a72781-kube-api-access-fcclj\") pod \"certified-operators-gkg85\" (UID: \"99180a72-d38c-44e4-b866-691567a72781\") " pod="openshift-marketplace/certified-operators-gkg85"
Mar 20 16:24:37 crc kubenswrapper[4730]: I0320 16:24:37.349580    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99180a72-d38c-44e4-b866-691567a72781-utilities\") pod \"certified-operators-gkg85\" (UID: \"99180a72-d38c-44e4-b866-691567a72781\") " pod="openshift-marketplace/certified-operators-gkg85"
Mar 20 16:24:37 crc kubenswrapper[4730]: I0320 16:24:37.349698    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99180a72-d38c-44e4-b866-691567a72781-catalog-content\") pod \"certified-operators-gkg85\" (UID: \"99180a72-d38c-44e4-b866-691567a72781\") " pod="openshift-marketplace/certified-operators-gkg85"
Mar 20 16:24:37 crc kubenswrapper[4730]: I0320 16:24:37.369819    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcclj\" (UniqueName: \"kubernetes.io/projected/99180a72-d38c-44e4-b866-691567a72781-kube-api-access-fcclj\") pod \"certified-operators-gkg85\" (UID: \"99180a72-d38c-44e4-b866-691567a72781\") " pod="openshift-marketplace/certified-operators-gkg85"
Mar 20 16:24:37 crc kubenswrapper[4730]: I0320 16:24:37.429502    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gkg85"
Mar 20 16:24:37 crc kubenswrapper[4730]: I0320 16:24:37.970211    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gkg85"]
Mar 20 16:24:38 crc kubenswrapper[4730]: I0320 16:24:38.918719    4730 generic.go:334] "Generic (PLEG): container finished" podID="99180a72-d38c-44e4-b866-691567a72781" containerID="b54e146d59f285cc301fc0bd7589d1ed01f280f8af399226c6b1444fcde6d455" exitCode=0
Mar 20 16:24:38 crc kubenswrapper[4730]: I0320 16:24:38.918771    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkg85" event={"ID":"99180a72-d38c-44e4-b866-691567a72781","Type":"ContainerDied","Data":"b54e146d59f285cc301fc0bd7589d1ed01f280f8af399226c6b1444fcde6d455"}
Mar 20 16:24:38 crc kubenswrapper[4730]: I0320 16:24:38.918803    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkg85" event={"ID":"99180a72-d38c-44e4-b866-691567a72781","Type":"ContainerStarted","Data":"3fa6e12c75e331bc0af19554791977b74f6d4895aba6bdbc02bb29ddc8d7e1db"}
Mar 20 16:24:39 crc kubenswrapper[4730]: I0320 16:24:39.930276    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkg85" event={"ID":"99180a72-d38c-44e4-b866-691567a72781","Type":"ContainerStarted","Data":"8a4e7522fb16182e51a6774ba98314d01ce0904c8133a56833e9dfe6f34f0ff5"}
Mar 20 16:24:40 crc kubenswrapper[4730]: I0320 16:24:40.533601    4730 scope.go:117] "RemoveContainer" containerID="7c543dcd07b51b6540df53377dd2e3fd92636df6153928eaa86c229ad0e8e213"
Mar 20 16:24:40 crc kubenswrapper[4730]: E0320 16:24:40.534174    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:24:40 crc kubenswrapper[4730]: I0320 16:24:40.945620    4730 generic.go:334] "Generic (PLEG): container finished" podID="99180a72-d38c-44e4-b866-691567a72781" containerID="8a4e7522fb16182e51a6774ba98314d01ce0904c8133a56833e9dfe6f34f0ff5" exitCode=0
Mar 20 16:24:40 crc kubenswrapper[4730]: I0320 16:24:40.945684    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkg85" event={"ID":"99180a72-d38c-44e4-b866-691567a72781","Type":"ContainerDied","Data":"8a4e7522fb16182e51a6774ba98314d01ce0904c8133a56833e9dfe6f34f0ff5"}
Mar 20 16:24:41 crc kubenswrapper[4730]: I0320 16:24:41.957347    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkg85" event={"ID":"99180a72-d38c-44e4-b866-691567a72781","Type":"ContainerStarted","Data":"f4829eb2f86a7c76f3956b6b64ce5ba191028ecce966c5159dad68531c1ae504"}
Mar 20 16:24:41 crc kubenswrapper[4730]: I0320 16:24:41.974079    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gkg85" podStartSLOduration=2.511380308 podStartE2EDuration="4.974057095s" podCreationTimestamp="2026-03-20 16:24:37 +0000 UTC" firstStartedPulling="2026-03-20 16:24:38.920514651 +0000 UTC m=+2738.133886020" lastFinishedPulling="2026-03-20 16:24:41.383191418 +0000 UTC m=+2740.596562807" observedRunningTime="2026-03-20 16:24:41.972926903 +0000 UTC m=+2741.186298272" watchObservedRunningTime="2026-03-20 16:24:41.974057095 +0000 UTC m=+2741.187428464"
Mar 20 16:24:47 crc kubenswrapper[4730]: I0320 16:24:47.429801    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gkg85"
Mar 20 16:24:47 crc kubenswrapper[4730]: I0320 16:24:47.430086    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gkg85"
Mar 20 16:24:47 crc kubenswrapper[4730]: I0320 16:24:47.474234    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gkg85"
Mar 20 16:24:48 crc kubenswrapper[4730]: I0320 16:24:48.067322    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gkg85"
Mar 20 16:24:48 crc kubenswrapper[4730]: I0320 16:24:48.243501    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gkg85"]
Mar 20 16:24:49 crc kubenswrapper[4730]: I0320 16:24:49.710390    4730 scope.go:117] "RemoveContainer" containerID="7c5e3ddc85e2a694aa4b6f5091893d6172a7cc9cc61ecbf5665992a51d9cf392"
Mar 20 16:24:50 crc kubenswrapper[4730]: I0320 16:24:50.022047    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gkg85" podUID="99180a72-d38c-44e4-b866-691567a72781" containerName="registry-server" containerID="cri-o://f4829eb2f86a7c76f3956b6b64ce5ba191028ecce966c5159dad68531c1ae504" gracePeriod=2
Mar 20 16:24:50 crc kubenswrapper[4730]: I0320 16:24:50.454444    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gkg85"
Mar 20 16:24:50 crc kubenswrapper[4730]: I0320 16:24:50.600156    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcclj\" (UniqueName: \"kubernetes.io/projected/99180a72-d38c-44e4-b866-691567a72781-kube-api-access-fcclj\") pod \"99180a72-d38c-44e4-b866-691567a72781\" (UID: \"99180a72-d38c-44e4-b866-691567a72781\") "
Mar 20 16:24:50 crc kubenswrapper[4730]: I0320 16:24:50.600234    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99180a72-d38c-44e4-b866-691567a72781-utilities\") pod \"99180a72-d38c-44e4-b866-691567a72781\" (UID: \"99180a72-d38c-44e4-b866-691567a72781\") "
Mar 20 16:24:50 crc kubenswrapper[4730]: I0320 16:24:50.600292    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99180a72-d38c-44e4-b866-691567a72781-catalog-content\") pod \"99180a72-d38c-44e4-b866-691567a72781\" (UID: \"99180a72-d38c-44e4-b866-691567a72781\") "
Mar 20 16:24:50 crc kubenswrapper[4730]: I0320 16:24:50.601264    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99180a72-d38c-44e4-b866-691567a72781-utilities" (OuterVolumeSpecName: "utilities") pod "99180a72-d38c-44e4-b866-691567a72781" (UID: "99180a72-d38c-44e4-b866-691567a72781"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:24:50 crc kubenswrapper[4730]: I0320 16:24:50.610440    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99180a72-d38c-44e4-b866-691567a72781-kube-api-access-fcclj" (OuterVolumeSpecName: "kube-api-access-fcclj") pod "99180a72-d38c-44e4-b866-691567a72781" (UID: "99180a72-d38c-44e4-b866-691567a72781"). InnerVolumeSpecName "kube-api-access-fcclj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:24:50 crc kubenswrapper[4730]: I0320 16:24:50.704409    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcclj\" (UniqueName: \"kubernetes.io/projected/99180a72-d38c-44e4-b866-691567a72781-kube-api-access-fcclj\") on node \"crc\" DevicePath \"\""
Mar 20 16:24:50 crc kubenswrapper[4730]: I0320 16:24:50.704555    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99180a72-d38c-44e4-b866-691567a72781-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:24:51 crc kubenswrapper[4730]: I0320 16:24:51.036646    4730 generic.go:334] "Generic (PLEG): container finished" podID="99180a72-d38c-44e4-b866-691567a72781" containerID="f4829eb2f86a7c76f3956b6b64ce5ba191028ecce966c5159dad68531c1ae504" exitCode=0
Mar 20 16:24:51 crc kubenswrapper[4730]: I0320 16:24:51.036726    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gkg85"
Mar 20 16:24:51 crc kubenswrapper[4730]: I0320 16:24:51.036709    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkg85" event={"ID":"99180a72-d38c-44e4-b866-691567a72781","Type":"ContainerDied","Data":"f4829eb2f86a7c76f3956b6b64ce5ba191028ecce966c5159dad68531c1ae504"}
Mar 20 16:24:51 crc kubenswrapper[4730]: I0320 16:24:51.036797    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkg85" event={"ID":"99180a72-d38c-44e4-b866-691567a72781","Type":"ContainerDied","Data":"3fa6e12c75e331bc0af19554791977b74f6d4895aba6bdbc02bb29ddc8d7e1db"}
Mar 20 16:24:51 crc kubenswrapper[4730]: I0320 16:24:51.036821    4730 scope.go:117] "RemoveContainer" containerID="f4829eb2f86a7c76f3956b6b64ce5ba191028ecce966c5159dad68531c1ae504"
Mar 20 16:24:51 crc kubenswrapper[4730]: I0320 16:24:51.064518    4730 scope.go:117] "RemoveContainer" containerID="8a4e7522fb16182e51a6774ba98314d01ce0904c8133a56833e9dfe6f34f0ff5"
Mar 20 16:24:51 crc kubenswrapper[4730]: I0320 16:24:51.102407    4730 scope.go:117] "RemoveContainer" containerID="b54e146d59f285cc301fc0bd7589d1ed01f280f8af399226c6b1444fcde6d455"
Mar 20 16:24:51 crc kubenswrapper[4730]: I0320 16:24:51.160633    4730 scope.go:117] "RemoveContainer" containerID="f4829eb2f86a7c76f3956b6b64ce5ba191028ecce966c5159dad68531c1ae504"
Mar 20 16:24:51 crc kubenswrapper[4730]: E0320 16:24:51.161716    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4829eb2f86a7c76f3956b6b64ce5ba191028ecce966c5159dad68531c1ae504\": container with ID starting with f4829eb2f86a7c76f3956b6b64ce5ba191028ecce966c5159dad68531c1ae504 not found: ID does not exist" containerID="f4829eb2f86a7c76f3956b6b64ce5ba191028ecce966c5159dad68531c1ae504"
Mar 20 16:24:51 crc kubenswrapper[4730]: I0320 16:24:51.161754    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4829eb2f86a7c76f3956b6b64ce5ba191028ecce966c5159dad68531c1ae504"} err="failed to get container status \"f4829eb2f86a7c76f3956b6b64ce5ba191028ecce966c5159dad68531c1ae504\": rpc error: code = NotFound desc = could not find container \"f4829eb2f86a7c76f3956b6b64ce5ba191028ecce966c5159dad68531c1ae504\": container with ID starting with f4829eb2f86a7c76f3956b6b64ce5ba191028ecce966c5159dad68531c1ae504 not found: ID does not exist"
Mar 20 16:24:51 crc kubenswrapper[4730]: I0320 16:24:51.161781    4730 scope.go:117] "RemoveContainer" containerID="8a4e7522fb16182e51a6774ba98314d01ce0904c8133a56833e9dfe6f34f0ff5"
Mar 20 16:24:51 crc kubenswrapper[4730]: E0320 16:24:51.162849    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a4e7522fb16182e51a6774ba98314d01ce0904c8133a56833e9dfe6f34f0ff5\": container with ID starting with 8a4e7522fb16182e51a6774ba98314d01ce0904c8133a56833e9dfe6f34f0ff5 not found: ID does not exist" containerID="8a4e7522fb16182e51a6774ba98314d01ce0904c8133a56833e9dfe6f34f0ff5"
Mar 20 16:24:51 crc kubenswrapper[4730]: I0320 16:24:51.162877    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a4e7522fb16182e51a6774ba98314d01ce0904c8133a56833e9dfe6f34f0ff5"} err="failed to get container status \"8a4e7522fb16182e51a6774ba98314d01ce0904c8133a56833e9dfe6f34f0ff5\": rpc error: code = NotFound desc = could not find container \"8a4e7522fb16182e51a6774ba98314d01ce0904c8133a56833e9dfe6f34f0ff5\": container with ID starting with 8a4e7522fb16182e51a6774ba98314d01ce0904c8133a56833e9dfe6f34f0ff5 not found: ID does not exist"
Mar 20 16:24:51 crc kubenswrapper[4730]: I0320 16:24:51.162896    4730 scope.go:117] "RemoveContainer" containerID="b54e146d59f285cc301fc0bd7589d1ed01f280f8af399226c6b1444fcde6d455"
Mar 20 16:24:51 crc kubenswrapper[4730]: E0320 16:24:51.163418    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b54e146d59f285cc301fc0bd7589d1ed01f280f8af399226c6b1444fcde6d455\": container with ID starting with b54e146d59f285cc301fc0bd7589d1ed01f280f8af399226c6b1444fcde6d455 not found: ID does not exist" containerID="b54e146d59f285cc301fc0bd7589d1ed01f280f8af399226c6b1444fcde6d455"
Mar 20 16:24:51 crc kubenswrapper[4730]: I0320 16:24:51.163450    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b54e146d59f285cc301fc0bd7589d1ed01f280f8af399226c6b1444fcde6d455"} err="failed to get container status \"b54e146d59f285cc301fc0bd7589d1ed01f280f8af399226c6b1444fcde6d455\": rpc error: code = NotFound desc = could not find container \"b54e146d59f285cc301fc0bd7589d1ed01f280f8af399226c6b1444fcde6d455\": container with ID starting with b54e146d59f285cc301fc0bd7589d1ed01f280f8af399226c6b1444fcde6d455 not found: ID does not exist"
Mar 20 16:24:51 crc kubenswrapper[4730]: I0320 16:24:51.167219    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99180a72-d38c-44e4-b866-691567a72781-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "99180a72-d38c-44e4-b866-691567a72781" (UID: "99180a72-d38c-44e4-b866-691567a72781"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:24:51 crc kubenswrapper[4730]: I0320 16:24:51.214704    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99180a72-d38c-44e4-b866-691567a72781-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 16:24:51 crc kubenswrapper[4730]: I0320 16:24:51.370646    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gkg85"]
Mar 20 16:24:51 crc kubenswrapper[4730]: I0320 16:24:51.380111    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gkg85"]
Mar 20 16:24:51 crc kubenswrapper[4730]: I0320 16:24:51.543993    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99180a72-d38c-44e4-b866-691567a72781" path="/var/lib/kubelet/pods/99180a72-d38c-44e4-b866-691567a72781/volumes"
Mar 20 16:24:53 crc kubenswrapper[4730]: I0320 16:24:53.167472    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7jb45"]
Mar 20 16:24:53 crc kubenswrapper[4730]: E0320 16:24:53.168061    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99180a72-d38c-44e4-b866-691567a72781" containerName="extract-utilities"
Mar 20 16:24:53 crc kubenswrapper[4730]: I0320 16:24:53.168079    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="99180a72-d38c-44e4-b866-691567a72781" containerName="extract-utilities"
Mar 20 16:24:53 crc kubenswrapper[4730]: E0320 16:24:53.168089    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99180a72-d38c-44e4-b866-691567a72781" containerName="extract-content"
Mar 20 16:24:53 crc kubenswrapper[4730]: I0320 16:24:53.168095    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="99180a72-d38c-44e4-b866-691567a72781" containerName="extract-content"
Mar 20 16:24:53 crc kubenswrapper[4730]: E0320 16:24:53.168109    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99180a72-d38c-44e4-b866-691567a72781" containerName="registry-server"
Mar 20 16:24:53 crc kubenswrapper[4730]: I0320 16:24:53.168116    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="99180a72-d38c-44e4-b866-691567a72781" containerName="registry-server"
Mar 20 16:24:53 crc kubenswrapper[4730]: I0320 16:24:53.168332    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="99180a72-d38c-44e4-b866-691567a72781" containerName="registry-server"
Mar 20 16:24:53 crc kubenswrapper[4730]: I0320 16:24:53.169949    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7jb45"
Mar 20 16:24:53 crc kubenswrapper[4730]: I0320 16:24:53.182310    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7jb45"]
Mar 20 16:24:53 crc kubenswrapper[4730]: I0320 16:24:53.356299    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74b6q\" (UniqueName: \"kubernetes.io/projected/a235cc35-7c77-483e-ac9b-966218dcd9a8-kube-api-access-74b6q\") pod \"community-operators-7jb45\" (UID: \"a235cc35-7c77-483e-ac9b-966218dcd9a8\") " pod="openshift-marketplace/community-operators-7jb45"
Mar 20 16:24:53 crc kubenswrapper[4730]: I0320 16:24:53.356645    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a235cc35-7c77-483e-ac9b-966218dcd9a8-utilities\") pod \"community-operators-7jb45\" (UID: \"a235cc35-7c77-483e-ac9b-966218dcd9a8\") " pod="openshift-marketplace/community-operators-7jb45"
Mar 20 16:24:53 crc kubenswrapper[4730]: I0320 16:24:53.356957    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a235cc35-7c77-483e-ac9b-966218dcd9a8-catalog-content\") pod \"community-operators-7jb45\" (UID: \"a235cc35-7c77-483e-ac9b-966218dcd9a8\") " pod="openshift-marketplace/community-operators-7jb45"
Mar 20 16:24:53 crc kubenswrapper[4730]: I0320 16:24:53.458334    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a235cc35-7c77-483e-ac9b-966218dcd9a8-utilities\") pod \"community-operators-7jb45\" (UID: \"a235cc35-7c77-483e-ac9b-966218dcd9a8\") " pod="openshift-marketplace/community-operators-7jb45"
Mar 20 16:24:53 crc kubenswrapper[4730]: I0320 16:24:53.458412    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a235cc35-7c77-483e-ac9b-966218dcd9a8-catalog-content\") pod \"community-operators-7jb45\" (UID: \"a235cc35-7c77-483e-ac9b-966218dcd9a8\") " pod="openshift-marketplace/community-operators-7jb45"
Mar 20 16:24:53 crc kubenswrapper[4730]: I0320 16:24:53.458471    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74b6q\" (UniqueName: \"kubernetes.io/projected/a235cc35-7c77-483e-ac9b-966218dcd9a8-kube-api-access-74b6q\") pod \"community-operators-7jb45\" (UID: \"a235cc35-7c77-483e-ac9b-966218dcd9a8\") " pod="openshift-marketplace/community-operators-7jb45"
Mar 20 16:24:53 crc kubenswrapper[4730]: I0320 16:24:53.458913    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a235cc35-7c77-483e-ac9b-966218dcd9a8-utilities\") pod \"community-operators-7jb45\" (UID: \"a235cc35-7c77-483e-ac9b-966218dcd9a8\") " pod="openshift-marketplace/community-operators-7jb45"
Mar 20 16:24:53 crc kubenswrapper[4730]: I0320 16:24:53.459123    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a235cc35-7c77-483e-ac9b-966218dcd9a8-catalog-content\") pod \"community-operators-7jb45\" (UID: \"a235cc35-7c77-483e-ac9b-966218dcd9a8\") " pod="openshift-marketplace/community-operators-7jb45"
Mar 20 16:24:53 crc kubenswrapper[4730]: I0320 16:24:53.491009    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74b6q\" (UniqueName: \"kubernetes.io/projected/a235cc35-7c77-483e-ac9b-966218dcd9a8-kube-api-access-74b6q\") pod \"community-operators-7jb45\" (UID: \"a235cc35-7c77-483e-ac9b-966218dcd9a8\") " pod="openshift-marketplace/community-operators-7jb45"
Mar 20 16:24:53 crc kubenswrapper[4730]: I0320 16:24:53.533521    4730 scope.go:117] "RemoveContainer" containerID="7c543dcd07b51b6540df53377dd2e3fd92636df6153928eaa86c229ad0e8e213"
Mar 20 16:24:53 crc kubenswrapper[4730]: E0320 16:24:53.533814    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:24:53 crc kubenswrapper[4730]: I0320 16:24:53.790718    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7jb45"
Mar 20 16:24:54 crc kubenswrapper[4730]: I0320 16:24:54.229140    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7jb45"]
Mar 20 16:24:54 crc kubenswrapper[4730]: W0320 16:24:54.236172    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda235cc35_7c77_483e_ac9b_966218dcd9a8.slice/crio-d42cb277b6d68c8617f1da3c597f77865b9d81575ad6f2fa8dd8530cc4a65717 WatchSource:0}: Error finding container d42cb277b6d68c8617f1da3c597f77865b9d81575ad6f2fa8dd8530cc4a65717: Status 404 returned error can't find the container with id d42cb277b6d68c8617f1da3c597f77865b9d81575ad6f2fa8dd8530cc4a65717
Mar 20 16:24:55 crc kubenswrapper[4730]: I0320 16:24:55.084411    4730 generic.go:334] "Generic (PLEG): container finished" podID="a235cc35-7c77-483e-ac9b-966218dcd9a8" containerID="2fc031312adb328146003fdca6895b2fc848dfb296665101401a3ee75c2bee08" exitCode=0
Mar 20 16:24:55 crc kubenswrapper[4730]: I0320 16:24:55.084481    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7jb45" event={"ID":"a235cc35-7c77-483e-ac9b-966218dcd9a8","Type":"ContainerDied","Data":"2fc031312adb328146003fdca6895b2fc848dfb296665101401a3ee75c2bee08"}
Mar 20 16:24:55 crc kubenswrapper[4730]: I0320 16:24:55.084807    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7jb45" event={"ID":"a235cc35-7c77-483e-ac9b-966218dcd9a8","Type":"ContainerStarted","Data":"d42cb277b6d68c8617f1da3c597f77865b9d81575ad6f2fa8dd8530cc4a65717"}
Mar 20 16:24:57 crc kubenswrapper[4730]: I0320 16:24:57.112322    4730 generic.go:334] "Generic (PLEG): container finished" podID="a235cc35-7c77-483e-ac9b-966218dcd9a8" containerID="d99d57bb87e0f3a1b3b5043e9291a7c3735d1bf0b933793d1e2984f582770d47" exitCode=0
Mar 20 16:24:57 crc kubenswrapper[4730]: I0320 16:24:57.112383    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7jb45" event={"ID":"a235cc35-7c77-483e-ac9b-966218dcd9a8","Type":"ContainerDied","Data":"d99d57bb87e0f3a1b3b5043e9291a7c3735d1bf0b933793d1e2984f582770d47"}
Mar 20 16:24:58 crc kubenswrapper[4730]: I0320 16:24:58.122355    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7jb45" event={"ID":"a235cc35-7c77-483e-ac9b-966218dcd9a8","Type":"ContainerStarted","Data":"8a9592624bd98932298ea29c114b5fa0638addc2801ff6eb39202018ee77f0df"}
Mar 20 16:24:58 crc kubenswrapper[4730]: I0320 16:24:58.145757    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7jb45" podStartSLOduration=2.592540357 podStartE2EDuration="5.145740246s" podCreationTimestamp="2026-03-20 16:24:53 +0000 UTC" firstStartedPulling="2026-03-20 16:24:55.086373418 +0000 UTC m=+2754.299744807" lastFinishedPulling="2026-03-20 16:24:57.639573297 +0000 UTC m=+2756.852944696" observedRunningTime="2026-03-20 16:24:58.140972561 +0000 UTC m=+2757.354343920" watchObservedRunningTime="2026-03-20 16:24:58.145740246 +0000 UTC m=+2757.359111615"
Mar 20 16:24:58 crc kubenswrapper[4730]: I0320 16:24:58.459377    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-95hgg"]
Mar 20 16:24:58 crc kubenswrapper[4730]: I0320 16:24:58.461759    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-95hgg"
Mar 20 16:24:58 crc kubenswrapper[4730]: I0320 16:24:58.475884    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-95hgg"]
Mar 20 16:24:58 crc kubenswrapper[4730]: I0320 16:24:58.557606    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qs8s\" (UniqueName: \"kubernetes.io/projected/e69761d7-fc5c-47a9-8f69-3f89dd557b65-kube-api-access-2qs8s\") pod \"redhat-marketplace-95hgg\" (UID: \"e69761d7-fc5c-47a9-8f69-3f89dd557b65\") " pod="openshift-marketplace/redhat-marketplace-95hgg"
Mar 20 16:24:58 crc kubenswrapper[4730]: I0320 16:24:58.557710    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e69761d7-fc5c-47a9-8f69-3f89dd557b65-catalog-content\") pod \"redhat-marketplace-95hgg\" (UID: \"e69761d7-fc5c-47a9-8f69-3f89dd557b65\") " pod="openshift-marketplace/redhat-marketplace-95hgg"
Mar 20 16:24:58 crc kubenswrapper[4730]: I0320 16:24:58.557734    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e69761d7-fc5c-47a9-8f69-3f89dd557b65-utilities\") pod \"redhat-marketplace-95hgg\" (UID: \"e69761d7-fc5c-47a9-8f69-3f89dd557b65\") " pod="openshift-marketplace/redhat-marketplace-95hgg"
Mar 20 16:24:58 crc kubenswrapper[4730]: I0320 16:24:58.659931    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qs8s\" (UniqueName: \"kubernetes.io/projected/e69761d7-fc5c-47a9-8f69-3f89dd557b65-kube-api-access-2qs8s\") pod \"redhat-marketplace-95hgg\" (UID: \"e69761d7-fc5c-47a9-8f69-3f89dd557b65\") " pod="openshift-marketplace/redhat-marketplace-95hgg"
Mar 20 16:24:58 crc kubenswrapper[4730]: I0320 16:24:58.660028    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e69761d7-fc5c-47a9-8f69-3f89dd557b65-catalog-content\") pod \"redhat-marketplace-95hgg\" (UID: \"e69761d7-fc5c-47a9-8f69-3f89dd557b65\") " pod="openshift-marketplace/redhat-marketplace-95hgg"
Mar 20 16:24:58 crc kubenswrapper[4730]: I0320 16:24:58.660048    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e69761d7-fc5c-47a9-8f69-3f89dd557b65-utilities\") pod \"redhat-marketplace-95hgg\" (UID: \"e69761d7-fc5c-47a9-8f69-3f89dd557b65\") " pod="openshift-marketplace/redhat-marketplace-95hgg"
Mar 20 16:24:58 crc kubenswrapper[4730]: I0320 16:24:58.660578    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e69761d7-fc5c-47a9-8f69-3f89dd557b65-utilities\") pod \"redhat-marketplace-95hgg\" (UID: \"e69761d7-fc5c-47a9-8f69-3f89dd557b65\") " pod="openshift-marketplace/redhat-marketplace-95hgg"
Mar 20 16:24:58 crc kubenswrapper[4730]: I0320 16:24:58.660983    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e69761d7-fc5c-47a9-8f69-3f89dd557b65-catalog-content\") pod \"redhat-marketplace-95hgg\" (UID: \"e69761d7-fc5c-47a9-8f69-3f89dd557b65\") " pod="openshift-marketplace/redhat-marketplace-95hgg"
Mar 20 16:24:58 crc kubenswrapper[4730]: I0320 16:24:58.695300    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qs8s\" (UniqueName: \"kubernetes.io/projected/e69761d7-fc5c-47a9-8f69-3f89dd557b65-kube-api-access-2qs8s\") pod \"redhat-marketplace-95hgg\" (UID: \"e69761d7-fc5c-47a9-8f69-3f89dd557b65\") " pod="openshift-marketplace/redhat-marketplace-95hgg"
Mar 20 16:24:58 crc kubenswrapper[4730]: I0320 16:24:58.777998    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-95hgg"
Mar 20 16:24:59 crc kubenswrapper[4730]: I0320 16:24:59.263664    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-95hgg"]
Mar 20 16:25:00 crc kubenswrapper[4730]: I0320 16:25:00.152093    4730 generic.go:334] "Generic (PLEG): container finished" podID="e69761d7-fc5c-47a9-8f69-3f89dd557b65" containerID="21206359c5fd0201b8b16a153309c18774b9c8d5deb6759f44c3804d3b10a4db" exitCode=0
Mar 20 16:25:00 crc kubenswrapper[4730]: I0320 16:25:00.152146    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-95hgg" event={"ID":"e69761d7-fc5c-47a9-8f69-3f89dd557b65","Type":"ContainerDied","Data":"21206359c5fd0201b8b16a153309c18774b9c8d5deb6759f44c3804d3b10a4db"}
Mar 20 16:25:00 crc kubenswrapper[4730]: I0320 16:25:00.152781    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-95hgg" event={"ID":"e69761d7-fc5c-47a9-8f69-3f89dd557b65","Type":"ContainerStarted","Data":"52a0abdeee686f55ac11d96180a5e7f5807128b11f0529b546142dd1ba88cd8b"}
Mar 20 16:25:01 crc kubenswrapper[4730]: I0320 16:25:01.164900    4730 generic.go:334] "Generic (PLEG): container finished" podID="e69761d7-fc5c-47a9-8f69-3f89dd557b65" containerID="aa8b3aec20a3b9706c932d80e020850878604b63ed5b242bfe7cae92232b5208" exitCode=0
Mar 20 16:25:01 crc kubenswrapper[4730]: I0320 16:25:01.165016    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-95hgg" event={"ID":"e69761d7-fc5c-47a9-8f69-3f89dd557b65","Type":"ContainerDied","Data":"aa8b3aec20a3b9706c932d80e020850878604b63ed5b242bfe7cae92232b5208"}
Mar 20 16:25:03 crc kubenswrapper[4730]: I0320 16:25:03.186136    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-95hgg" event={"ID":"e69761d7-fc5c-47a9-8f69-3f89dd557b65","Type":"ContainerStarted","Data":"da51b54a8c5fcbb4fa2046cfb5cf5456b08f534d75b4795aa1c5d672bb8dcf6c"}
Mar 20 16:25:03 crc kubenswrapper[4730]: I0320 16:25:03.214003    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-95hgg" podStartSLOduration=3.043675095 podStartE2EDuration="5.213981681s" podCreationTimestamp="2026-03-20 16:24:58 +0000 UTC" firstStartedPulling="2026-03-20 16:25:00.154330904 +0000 UTC m=+2759.367702273" lastFinishedPulling="2026-03-20 16:25:02.32463749 +0000 UTC m=+2761.538008859" observedRunningTime="2026-03-20 16:25:03.204239224 +0000 UTC m=+2762.417610583" watchObservedRunningTime="2026-03-20 16:25:03.213981681 +0000 UTC m=+2762.427353050"
Mar 20 16:25:03 crc kubenswrapper[4730]: I0320 16:25:03.791471    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7jb45"
Mar 20 16:25:03 crc kubenswrapper[4730]: I0320 16:25:03.791523    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7jb45"
Mar 20 16:25:03 crc kubenswrapper[4730]: I0320 16:25:03.857533    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7jb45"
Mar 20 16:25:04 crc kubenswrapper[4730]: I0320 16:25:04.248595    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7jb45"
Mar 20 16:25:05 crc kubenswrapper[4730]: I0320 16:25:05.044499    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7jb45"]
Mar 20 16:25:06 crc kubenswrapper[4730]: I0320 16:25:06.210645    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7jb45" podUID="a235cc35-7c77-483e-ac9b-966218dcd9a8" containerName="registry-server" containerID="cri-o://8a9592624bd98932298ea29c114b5fa0638addc2801ff6eb39202018ee77f0df" gracePeriod=2
Mar 20 16:25:07 crc kubenswrapper[4730]: I0320 16:25:07.155129    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7jb45"
Mar 20 16:25:07 crc kubenswrapper[4730]: I0320 16:25:07.184949    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74b6q\" (UniqueName: \"kubernetes.io/projected/a235cc35-7c77-483e-ac9b-966218dcd9a8-kube-api-access-74b6q\") pod \"a235cc35-7c77-483e-ac9b-966218dcd9a8\" (UID: \"a235cc35-7c77-483e-ac9b-966218dcd9a8\") "
Mar 20 16:25:07 crc kubenswrapper[4730]: I0320 16:25:07.185000    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a235cc35-7c77-483e-ac9b-966218dcd9a8-catalog-content\") pod \"a235cc35-7c77-483e-ac9b-966218dcd9a8\" (UID: \"a235cc35-7c77-483e-ac9b-966218dcd9a8\") "
Mar 20 16:25:07 crc kubenswrapper[4730]: I0320 16:25:07.185060    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a235cc35-7c77-483e-ac9b-966218dcd9a8-utilities\") pod \"a235cc35-7c77-483e-ac9b-966218dcd9a8\" (UID: \"a235cc35-7c77-483e-ac9b-966218dcd9a8\") "
Mar 20 16:25:07 crc kubenswrapper[4730]: I0320 16:25:07.186400    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a235cc35-7c77-483e-ac9b-966218dcd9a8-utilities" (OuterVolumeSpecName: "utilities") pod "a235cc35-7c77-483e-ac9b-966218dcd9a8" (UID: "a235cc35-7c77-483e-ac9b-966218dcd9a8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:25:07 crc kubenswrapper[4730]: I0320 16:25:07.192663    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a235cc35-7c77-483e-ac9b-966218dcd9a8-kube-api-access-74b6q" (OuterVolumeSpecName: "kube-api-access-74b6q") pod "a235cc35-7c77-483e-ac9b-966218dcd9a8" (UID: "a235cc35-7c77-483e-ac9b-966218dcd9a8"). InnerVolumeSpecName "kube-api-access-74b6q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:25:07 crc kubenswrapper[4730]: I0320 16:25:07.253724    4730 generic.go:334] "Generic (PLEG): container finished" podID="a235cc35-7c77-483e-ac9b-966218dcd9a8" containerID="8a9592624bd98932298ea29c114b5fa0638addc2801ff6eb39202018ee77f0df" exitCode=0
Mar 20 16:25:07 crc kubenswrapper[4730]: I0320 16:25:07.253782    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7jb45" event={"ID":"a235cc35-7c77-483e-ac9b-966218dcd9a8","Type":"ContainerDied","Data":"8a9592624bd98932298ea29c114b5fa0638addc2801ff6eb39202018ee77f0df"}
Mar 20 16:25:07 crc kubenswrapper[4730]: I0320 16:25:07.253814    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7jb45" event={"ID":"a235cc35-7c77-483e-ac9b-966218dcd9a8","Type":"ContainerDied","Data":"d42cb277b6d68c8617f1da3c597f77865b9d81575ad6f2fa8dd8530cc4a65717"}
Mar 20 16:25:07 crc kubenswrapper[4730]: I0320 16:25:07.253835    4730 scope.go:117] "RemoveContainer" containerID="8a9592624bd98932298ea29c114b5fa0638addc2801ff6eb39202018ee77f0df"
Mar 20 16:25:07 crc kubenswrapper[4730]: I0320 16:25:07.254034    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7jb45"
Mar 20 16:25:07 crc kubenswrapper[4730]: I0320 16:25:07.263368    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a235cc35-7c77-483e-ac9b-966218dcd9a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a235cc35-7c77-483e-ac9b-966218dcd9a8" (UID: "a235cc35-7c77-483e-ac9b-966218dcd9a8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:25:07 crc kubenswrapper[4730]: I0320 16:25:07.285112    4730 scope.go:117] "RemoveContainer" containerID="d99d57bb87e0f3a1b3b5043e9291a7c3735d1bf0b933793d1e2984f582770d47"
Mar 20 16:25:07 crc kubenswrapper[4730]: I0320 16:25:07.286747    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74b6q\" (UniqueName: \"kubernetes.io/projected/a235cc35-7c77-483e-ac9b-966218dcd9a8-kube-api-access-74b6q\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:07 crc kubenswrapper[4730]: I0320 16:25:07.286767    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a235cc35-7c77-483e-ac9b-966218dcd9a8-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:07 crc kubenswrapper[4730]: I0320 16:25:07.286776    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a235cc35-7c77-483e-ac9b-966218dcd9a8-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:07 crc kubenswrapper[4730]: I0320 16:25:07.311665    4730 scope.go:117] "RemoveContainer" containerID="2fc031312adb328146003fdca6895b2fc848dfb296665101401a3ee75c2bee08"
Mar 20 16:25:07 crc kubenswrapper[4730]: I0320 16:25:07.353972    4730 scope.go:117] "RemoveContainer" containerID="8a9592624bd98932298ea29c114b5fa0638addc2801ff6eb39202018ee77f0df"
Mar 20 16:25:07 crc kubenswrapper[4730]: E0320 16:25:07.354450    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a9592624bd98932298ea29c114b5fa0638addc2801ff6eb39202018ee77f0df\": container with ID starting with 8a9592624bd98932298ea29c114b5fa0638addc2801ff6eb39202018ee77f0df not found: ID does not exist" containerID="8a9592624bd98932298ea29c114b5fa0638addc2801ff6eb39202018ee77f0df"
Mar 20 16:25:07 crc kubenswrapper[4730]: I0320 16:25:07.354575    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a9592624bd98932298ea29c114b5fa0638addc2801ff6eb39202018ee77f0df"} err="failed to get container status \"8a9592624bd98932298ea29c114b5fa0638addc2801ff6eb39202018ee77f0df\": rpc error: code = NotFound desc = could not find container \"8a9592624bd98932298ea29c114b5fa0638addc2801ff6eb39202018ee77f0df\": container with ID starting with 8a9592624bd98932298ea29c114b5fa0638addc2801ff6eb39202018ee77f0df not found: ID does not exist"
Mar 20 16:25:07 crc kubenswrapper[4730]: I0320 16:25:07.354662    4730 scope.go:117] "RemoveContainer" containerID="d99d57bb87e0f3a1b3b5043e9291a7c3735d1bf0b933793d1e2984f582770d47"
Mar 20 16:25:07 crc kubenswrapper[4730]: E0320 16:25:07.355118    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d99d57bb87e0f3a1b3b5043e9291a7c3735d1bf0b933793d1e2984f582770d47\": container with ID starting with d99d57bb87e0f3a1b3b5043e9291a7c3735d1bf0b933793d1e2984f582770d47 not found: ID does not exist" containerID="d99d57bb87e0f3a1b3b5043e9291a7c3735d1bf0b933793d1e2984f582770d47"
Mar 20 16:25:07 crc kubenswrapper[4730]: I0320 16:25:07.355223    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d99d57bb87e0f3a1b3b5043e9291a7c3735d1bf0b933793d1e2984f582770d47"} err="failed to get container status \"d99d57bb87e0f3a1b3b5043e9291a7c3735d1bf0b933793d1e2984f582770d47\": rpc error: code = NotFound desc = could not find container \"d99d57bb87e0f3a1b3b5043e9291a7c3735d1bf0b933793d1e2984f582770d47\": container with ID starting with d99d57bb87e0f3a1b3b5043e9291a7c3735d1bf0b933793d1e2984f582770d47 not found: ID does not exist"
Mar 20 16:25:07 crc kubenswrapper[4730]: I0320 16:25:07.355314    4730 scope.go:117] "RemoveContainer" containerID="2fc031312adb328146003fdca6895b2fc848dfb296665101401a3ee75c2bee08"
Mar 20 16:25:07 crc kubenswrapper[4730]: E0320 16:25:07.355794    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fc031312adb328146003fdca6895b2fc848dfb296665101401a3ee75c2bee08\": container with ID starting with 2fc031312adb328146003fdca6895b2fc848dfb296665101401a3ee75c2bee08 not found: ID does not exist" containerID="2fc031312adb328146003fdca6895b2fc848dfb296665101401a3ee75c2bee08"
Mar 20 16:25:07 crc kubenswrapper[4730]: I0320 16:25:07.355849    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fc031312adb328146003fdca6895b2fc848dfb296665101401a3ee75c2bee08"} err="failed to get container status \"2fc031312adb328146003fdca6895b2fc848dfb296665101401a3ee75c2bee08\": rpc error: code = NotFound desc = could not find container \"2fc031312adb328146003fdca6895b2fc848dfb296665101401a3ee75c2bee08\": container with ID starting with 2fc031312adb328146003fdca6895b2fc848dfb296665101401a3ee75c2bee08 not found: ID does not exist"
Mar 20 16:25:07 crc kubenswrapper[4730]: I0320 16:25:07.582365    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7jb45"]
Mar 20 16:25:07 crc kubenswrapper[4730]: I0320 16:25:07.591231    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7jb45"]
Mar 20 16:25:08 crc kubenswrapper[4730]: I0320 16:25:08.533748    4730 scope.go:117] "RemoveContainer" containerID="7c543dcd07b51b6540df53377dd2e3fd92636df6153928eaa86c229ad0e8e213"
Mar 20 16:25:08 crc kubenswrapper[4730]: E0320 16:25:08.534209    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:25:08 crc kubenswrapper[4730]: I0320 16:25:08.778460    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-95hgg"
Mar 20 16:25:08 crc kubenswrapper[4730]: I0320 16:25:08.778537    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-95hgg"
Mar 20 16:25:08 crc kubenswrapper[4730]: I0320 16:25:08.832823    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-95hgg"
Mar 20 16:25:09 crc kubenswrapper[4730]: I0320 16:25:09.319610    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-95hgg"
Mar 20 16:25:09 crc kubenswrapper[4730]: I0320 16:25:09.545279    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a235cc35-7c77-483e-ac9b-966218dcd9a8" path="/var/lib/kubelet/pods/a235cc35-7c77-483e-ac9b-966218dcd9a8/volumes"
Mar 20 16:25:11 crc kubenswrapper[4730]: I0320 16:25:11.444357    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-95hgg"]
Mar 20 16:25:11 crc kubenswrapper[4730]: I0320 16:25:11.444576    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-95hgg" podUID="e69761d7-fc5c-47a9-8f69-3f89dd557b65" containerName="registry-server" containerID="cri-o://da51b54a8c5fcbb4fa2046cfb5cf5456b08f534d75b4795aa1c5d672bb8dcf6c" gracePeriod=2
Mar 20 16:25:11 crc kubenswrapper[4730]: I0320 16:25:11.919782    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-95hgg"
Mar 20 16:25:12 crc kubenswrapper[4730]: I0320 16:25:12.099411    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qs8s\" (UniqueName: \"kubernetes.io/projected/e69761d7-fc5c-47a9-8f69-3f89dd557b65-kube-api-access-2qs8s\") pod \"e69761d7-fc5c-47a9-8f69-3f89dd557b65\" (UID: \"e69761d7-fc5c-47a9-8f69-3f89dd557b65\") "
Mar 20 16:25:12 crc kubenswrapper[4730]: I0320 16:25:12.099479    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e69761d7-fc5c-47a9-8f69-3f89dd557b65-utilities\") pod \"e69761d7-fc5c-47a9-8f69-3f89dd557b65\" (UID: \"e69761d7-fc5c-47a9-8f69-3f89dd557b65\") "
Mar 20 16:25:12 crc kubenswrapper[4730]: I0320 16:25:12.099681    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e69761d7-fc5c-47a9-8f69-3f89dd557b65-catalog-content\") pod \"e69761d7-fc5c-47a9-8f69-3f89dd557b65\" (UID: \"e69761d7-fc5c-47a9-8f69-3f89dd557b65\") "
Mar 20 16:25:12 crc kubenswrapper[4730]: I0320 16:25:12.100484    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e69761d7-fc5c-47a9-8f69-3f89dd557b65-utilities" (OuterVolumeSpecName: "utilities") pod "e69761d7-fc5c-47a9-8f69-3f89dd557b65" (UID: "e69761d7-fc5c-47a9-8f69-3f89dd557b65"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:25:12 crc kubenswrapper[4730]: I0320 16:25:12.108435    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e69761d7-fc5c-47a9-8f69-3f89dd557b65-kube-api-access-2qs8s" (OuterVolumeSpecName: "kube-api-access-2qs8s") pod "e69761d7-fc5c-47a9-8f69-3f89dd557b65" (UID: "e69761d7-fc5c-47a9-8f69-3f89dd557b65"). InnerVolumeSpecName "kube-api-access-2qs8s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:25:12 crc kubenswrapper[4730]: I0320 16:25:12.126356    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e69761d7-fc5c-47a9-8f69-3f89dd557b65-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e69761d7-fc5c-47a9-8f69-3f89dd557b65" (UID: "e69761d7-fc5c-47a9-8f69-3f89dd557b65"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:25:12 crc kubenswrapper[4730]: I0320 16:25:12.201910    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e69761d7-fc5c-47a9-8f69-3f89dd557b65-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:12 crc kubenswrapper[4730]: I0320 16:25:12.201945    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qs8s\" (UniqueName: \"kubernetes.io/projected/e69761d7-fc5c-47a9-8f69-3f89dd557b65-kube-api-access-2qs8s\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:12 crc kubenswrapper[4730]: I0320 16:25:12.201959    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e69761d7-fc5c-47a9-8f69-3f89dd557b65-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:12 crc kubenswrapper[4730]: I0320 16:25:12.303303    4730 generic.go:334] "Generic (PLEG): container finished" podID="e69761d7-fc5c-47a9-8f69-3f89dd557b65" containerID="da51b54a8c5fcbb4fa2046cfb5cf5456b08f534d75b4795aa1c5d672bb8dcf6c" exitCode=0
Mar 20 16:25:12 crc kubenswrapper[4730]: I0320 16:25:12.303346    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-95hgg" event={"ID":"e69761d7-fc5c-47a9-8f69-3f89dd557b65","Type":"ContainerDied","Data":"da51b54a8c5fcbb4fa2046cfb5cf5456b08f534d75b4795aa1c5d672bb8dcf6c"}
Mar 20 16:25:12 crc kubenswrapper[4730]: I0320 16:25:12.303372    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-95hgg" event={"ID":"e69761d7-fc5c-47a9-8f69-3f89dd557b65","Type":"ContainerDied","Data":"52a0abdeee686f55ac11d96180a5e7f5807128b11f0529b546142dd1ba88cd8b"}
Mar 20 16:25:12 crc kubenswrapper[4730]: I0320 16:25:12.303375    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-95hgg"
Mar 20 16:25:12 crc kubenswrapper[4730]: I0320 16:25:12.303400    4730 scope.go:117] "RemoveContainer" containerID="da51b54a8c5fcbb4fa2046cfb5cf5456b08f534d75b4795aa1c5d672bb8dcf6c"
Mar 20 16:25:12 crc kubenswrapper[4730]: I0320 16:25:12.336589    4730 scope.go:117] "RemoveContainer" containerID="aa8b3aec20a3b9706c932d80e020850878604b63ed5b242bfe7cae92232b5208"
Mar 20 16:25:12 crc kubenswrapper[4730]: I0320 16:25:12.342951    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-95hgg"]
Mar 20 16:25:12 crc kubenswrapper[4730]: I0320 16:25:12.351430    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-95hgg"]
Mar 20 16:25:12 crc kubenswrapper[4730]: I0320 16:25:12.358310    4730 scope.go:117] "RemoveContainer" containerID="21206359c5fd0201b8b16a153309c18774b9c8d5deb6759f44c3804d3b10a4db"
Mar 20 16:25:12 crc kubenswrapper[4730]: I0320 16:25:12.403116    4730 scope.go:117] "RemoveContainer" containerID="da51b54a8c5fcbb4fa2046cfb5cf5456b08f534d75b4795aa1c5d672bb8dcf6c"
Mar 20 16:25:12 crc kubenswrapper[4730]: E0320 16:25:12.403662    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da51b54a8c5fcbb4fa2046cfb5cf5456b08f534d75b4795aa1c5d672bb8dcf6c\": container with ID starting with da51b54a8c5fcbb4fa2046cfb5cf5456b08f534d75b4795aa1c5d672bb8dcf6c not found: ID does not exist" containerID="da51b54a8c5fcbb4fa2046cfb5cf5456b08f534d75b4795aa1c5d672bb8dcf6c"
Mar 20 16:25:12 crc kubenswrapper[4730]: I0320 16:25:12.403703    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da51b54a8c5fcbb4fa2046cfb5cf5456b08f534d75b4795aa1c5d672bb8dcf6c"} err="failed to get container status \"da51b54a8c5fcbb4fa2046cfb5cf5456b08f534d75b4795aa1c5d672bb8dcf6c\": rpc error: code = NotFound desc = could not find container \"da51b54a8c5fcbb4fa2046cfb5cf5456b08f534d75b4795aa1c5d672bb8dcf6c\": container with ID starting with da51b54a8c5fcbb4fa2046cfb5cf5456b08f534d75b4795aa1c5d672bb8dcf6c not found: ID does not exist"
Mar 20 16:25:12 crc kubenswrapper[4730]: I0320 16:25:12.403729    4730 scope.go:117] "RemoveContainer" containerID="aa8b3aec20a3b9706c932d80e020850878604b63ed5b242bfe7cae92232b5208"
Mar 20 16:25:12 crc kubenswrapper[4730]: E0320 16:25:12.404117    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa8b3aec20a3b9706c932d80e020850878604b63ed5b242bfe7cae92232b5208\": container with ID starting with aa8b3aec20a3b9706c932d80e020850878604b63ed5b242bfe7cae92232b5208 not found: ID does not exist" containerID="aa8b3aec20a3b9706c932d80e020850878604b63ed5b242bfe7cae92232b5208"
Mar 20 16:25:12 crc kubenswrapper[4730]: I0320 16:25:12.404134    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa8b3aec20a3b9706c932d80e020850878604b63ed5b242bfe7cae92232b5208"} err="failed to get container status \"aa8b3aec20a3b9706c932d80e020850878604b63ed5b242bfe7cae92232b5208\": rpc error: code = NotFound desc = could not find container \"aa8b3aec20a3b9706c932d80e020850878604b63ed5b242bfe7cae92232b5208\": container with ID starting with aa8b3aec20a3b9706c932d80e020850878604b63ed5b242bfe7cae92232b5208 not found: ID does not exist"
Mar 20 16:25:12 crc kubenswrapper[4730]: I0320 16:25:12.404145    4730 scope.go:117] "RemoveContainer" containerID="21206359c5fd0201b8b16a153309c18774b9c8d5deb6759f44c3804d3b10a4db"
Mar 20 16:25:12 crc kubenswrapper[4730]: E0320 16:25:12.404423    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21206359c5fd0201b8b16a153309c18774b9c8d5deb6759f44c3804d3b10a4db\": container with ID starting with 21206359c5fd0201b8b16a153309c18774b9c8d5deb6759f44c3804d3b10a4db not found: ID does not exist" containerID="21206359c5fd0201b8b16a153309c18774b9c8d5deb6759f44c3804d3b10a4db"
Mar 20 16:25:12 crc kubenswrapper[4730]: I0320 16:25:12.404463    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21206359c5fd0201b8b16a153309c18774b9c8d5deb6759f44c3804d3b10a4db"} err="failed to get container status \"21206359c5fd0201b8b16a153309c18774b9c8d5deb6759f44c3804d3b10a4db\": rpc error: code = NotFound desc = could not find container \"21206359c5fd0201b8b16a153309c18774b9c8d5deb6759f44c3804d3b10a4db\": container with ID starting with 21206359c5fd0201b8b16a153309c18774b9c8d5deb6759f44c3804d3b10a4db not found: ID does not exist"
Mar 20 16:25:13 crc kubenswrapper[4730]: I0320 16:25:13.542954    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e69761d7-fc5c-47a9-8f69-3f89dd557b65" path="/var/lib/kubelet/pods/e69761d7-fc5c-47a9-8f69-3f89dd557b65/volumes"
Mar 20 16:25:20 crc kubenswrapper[4730]: I0320 16:25:20.533419    4730 scope.go:117] "RemoveContainer" containerID="7c543dcd07b51b6540df53377dd2e3fd92636df6153928eaa86c229ad0e8e213"
Mar 20 16:25:20 crc kubenswrapper[4730]: E0320 16:25:20.534334    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:25:33 crc kubenswrapper[4730]: I0320 16:25:33.533037    4730 scope.go:117] "RemoveContainer" containerID="7c543dcd07b51b6540df53377dd2e3fd92636df6153928eaa86c229ad0e8e213"
Mar 20 16:25:33 crc kubenswrapper[4730]: E0320 16:25:33.533740    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:25:34 crc kubenswrapper[4730]: I0320 16:25:34.504540    4730 generic.go:334] "Generic (PLEG): container finished" podID="884c2fa6-babb-44b8-b8e2-3e4fbce27153" containerID="4f23a2d6263e179fbb42b831d0fa59c0e1411d46b87becc3c39bd7883db89197" exitCode=0
Mar 20 16:25:34 crc kubenswrapper[4730]: I0320 16:25:34.504627    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq" event={"ID":"884c2fa6-babb-44b8-b8e2-3e4fbce27153","Type":"ContainerDied","Data":"4f23a2d6263e179fbb42b831d0fa59c0e1411d46b87becc3c39bd7883db89197"}
Mar 20 16:25:35 crc kubenswrapper[4730]: I0320 16:25:35.977942    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq"
Mar 20 16:25:36 crc kubenswrapper[4730]: I0320 16:25:36.052023    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-ssh-key-openstack-edpm-ipam\") pod \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\" (UID: \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\") "
Mar 20 16:25:36 crc kubenswrapper[4730]: I0320 16:25:36.052105    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-inventory\") pod \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\" (UID: \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\") "
Mar 20 16:25:36 crc kubenswrapper[4730]: I0320 16:25:36.052172    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-ceilometer-compute-config-data-1\") pod \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\" (UID: \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\") "
Mar 20 16:25:36 crc kubenswrapper[4730]: I0320 16:25:36.052222    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-ceilometer-compute-config-data-0\") pod \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\" (UID: \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\") "
Mar 20 16:25:36 crc kubenswrapper[4730]: I0320 16:25:36.052284    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-telemetry-combined-ca-bundle\") pod \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\" (UID: \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\") "
Mar 20 16:25:36 crc kubenswrapper[4730]: I0320 16:25:36.052324    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-ceilometer-compute-config-data-2\") pod \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\" (UID: \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\") "
Mar 20 16:25:36 crc kubenswrapper[4730]: I0320 16:25:36.052366    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vccl6\" (UniqueName: \"kubernetes.io/projected/884c2fa6-babb-44b8-b8e2-3e4fbce27153-kube-api-access-vccl6\") pod \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\" (UID: \"884c2fa6-babb-44b8-b8e2-3e4fbce27153\") "
Mar 20 16:25:36 crc kubenswrapper[4730]: I0320 16:25:36.071304    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "884c2fa6-babb-44b8-b8e2-3e4fbce27153" (UID: "884c2fa6-babb-44b8-b8e2-3e4fbce27153"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:25:36 crc kubenswrapper[4730]: I0320 16:25:36.071633    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/884c2fa6-babb-44b8-b8e2-3e4fbce27153-kube-api-access-vccl6" (OuterVolumeSpecName: "kube-api-access-vccl6") pod "884c2fa6-babb-44b8-b8e2-3e4fbce27153" (UID: "884c2fa6-babb-44b8-b8e2-3e4fbce27153"). InnerVolumeSpecName "kube-api-access-vccl6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:25:36 crc kubenswrapper[4730]: I0320 16:25:36.082423    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "884c2fa6-babb-44b8-b8e2-3e4fbce27153" (UID: "884c2fa6-babb-44b8-b8e2-3e4fbce27153"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:25:36 crc kubenswrapper[4730]: I0320 16:25:36.082487    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "884c2fa6-babb-44b8-b8e2-3e4fbce27153" (UID: "884c2fa6-babb-44b8-b8e2-3e4fbce27153"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:25:36 crc kubenswrapper[4730]: I0320 16:25:36.083725    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "884c2fa6-babb-44b8-b8e2-3e4fbce27153" (UID: "884c2fa6-babb-44b8-b8e2-3e4fbce27153"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:25:36 crc kubenswrapper[4730]: I0320 16:25:36.084187    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-inventory" (OuterVolumeSpecName: "inventory") pod "884c2fa6-babb-44b8-b8e2-3e4fbce27153" (UID: "884c2fa6-babb-44b8-b8e2-3e4fbce27153"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:25:36 crc kubenswrapper[4730]: I0320 16:25:36.093483    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "884c2fa6-babb-44b8-b8e2-3e4fbce27153" (UID: "884c2fa6-babb-44b8-b8e2-3e4fbce27153"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:25:36 crc kubenswrapper[4730]: I0320 16:25:36.154339    4730 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:36 crc kubenswrapper[4730]: I0320 16:25:36.154741    4730 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:36 crc kubenswrapper[4730]: I0320 16:25:36.154816    4730 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:36 crc kubenswrapper[4730]: I0320 16:25:36.154909    4730 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:36 crc kubenswrapper[4730]: I0320 16:25:36.154981    4730 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:36 crc kubenswrapper[4730]: I0320 16:25:36.155049    4730 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/884c2fa6-babb-44b8-b8e2-3e4fbce27153-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:36 crc kubenswrapper[4730]: I0320 16:25:36.155115    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vccl6\" (UniqueName: \"kubernetes.io/projected/884c2fa6-babb-44b8-b8e2-3e4fbce27153-kube-api-access-vccl6\") on node \"crc\" DevicePath \"\""
Mar 20 16:25:36 crc kubenswrapper[4730]: I0320 16:25:36.553821    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq" event={"ID":"884c2fa6-babb-44b8-b8e2-3e4fbce27153","Type":"ContainerDied","Data":"7ab2683493c806bbc0a72a38ad816fd90029738a822e7670b5034e10e4d79b42"}
Mar 20 16:25:36 crc kubenswrapper[4730]: I0320 16:25:36.553871    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ab2683493c806bbc0a72a38ad816fd90029738a822e7670b5034e10e4d79b42"
Mar 20 16:25:36 crc kubenswrapper[4730]: I0320 16:25:36.553925    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq"
Mar 20 16:25:48 crc kubenswrapper[4730]: I0320 16:25:48.532732    4730 scope.go:117] "RemoveContainer" containerID="7c543dcd07b51b6540df53377dd2e3fd92636df6153928eaa86c229ad0e8e213"
Mar 20 16:25:49 crc kubenswrapper[4730]: I0320 16:25:49.664097    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerStarted","Data":"f26b9418791737b63a44943ab58f9d5995a9697abde137f76404e07e867c6e5f"}
Mar 20 16:26:00 crc kubenswrapper[4730]: I0320 16:26:00.141678    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567066-7w6j9"]
Mar 20 16:26:00 crc kubenswrapper[4730]: E0320 16:26:00.143781    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a235cc35-7c77-483e-ac9b-966218dcd9a8" containerName="registry-server"
Mar 20 16:26:00 crc kubenswrapper[4730]: I0320 16:26:00.143878    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="a235cc35-7c77-483e-ac9b-966218dcd9a8" containerName="registry-server"
Mar 20 16:26:00 crc kubenswrapper[4730]: E0320 16:26:00.143959    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="884c2fa6-babb-44b8-b8e2-3e4fbce27153" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:26:00 crc kubenswrapper[4730]: I0320 16:26:00.144027    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="884c2fa6-babb-44b8-b8e2-3e4fbce27153" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:26:00 crc kubenswrapper[4730]: E0320 16:26:00.144136    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e69761d7-fc5c-47a9-8f69-3f89dd557b65" containerName="registry-server"
Mar 20 16:26:00 crc kubenswrapper[4730]: I0320 16:26:00.144207    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="e69761d7-fc5c-47a9-8f69-3f89dd557b65" containerName="registry-server"
Mar 20 16:26:00 crc kubenswrapper[4730]: E0320 16:26:00.144315    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e69761d7-fc5c-47a9-8f69-3f89dd557b65" containerName="extract-utilities"
Mar 20 16:26:00 crc kubenswrapper[4730]: I0320 16:26:00.144387    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="e69761d7-fc5c-47a9-8f69-3f89dd557b65" containerName="extract-utilities"
Mar 20 16:26:00 crc kubenswrapper[4730]: E0320 16:26:00.144460    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a235cc35-7c77-483e-ac9b-966218dcd9a8" containerName="extract-utilities"
Mar 20 16:26:00 crc kubenswrapper[4730]: I0320 16:26:00.144524    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="a235cc35-7c77-483e-ac9b-966218dcd9a8" containerName="extract-utilities"
Mar 20 16:26:00 crc kubenswrapper[4730]: E0320 16:26:00.144600    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a235cc35-7c77-483e-ac9b-966218dcd9a8" containerName="extract-content"
Mar 20 16:26:00 crc kubenswrapper[4730]: I0320 16:26:00.144666    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="a235cc35-7c77-483e-ac9b-966218dcd9a8" containerName="extract-content"
Mar 20 16:26:00 crc kubenswrapper[4730]: E0320 16:26:00.144738    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e69761d7-fc5c-47a9-8f69-3f89dd557b65" containerName="extract-content"
Mar 20 16:26:00 crc kubenswrapper[4730]: I0320 16:26:00.144805    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="e69761d7-fc5c-47a9-8f69-3f89dd557b65" containerName="extract-content"
Mar 20 16:26:00 crc kubenswrapper[4730]: I0320 16:26:00.145118    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="884c2fa6-babb-44b8-b8e2-3e4fbce27153" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Mar 20 16:26:00 crc kubenswrapper[4730]: I0320 16:26:00.145217    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="e69761d7-fc5c-47a9-8f69-3f89dd557b65" containerName="registry-server"
Mar 20 16:26:00 crc kubenswrapper[4730]: I0320 16:26:00.145312    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="a235cc35-7c77-483e-ac9b-966218dcd9a8" containerName="registry-server"
Mar 20 16:26:00 crc kubenswrapper[4730]: I0320 16:26:00.146209    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567066-7w6j9"
Mar 20 16:26:00 crc kubenswrapper[4730]: I0320 16:26:00.148638    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:26:00 crc kubenswrapper[4730]: I0320 16:26:00.148930    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 16:26:00 crc kubenswrapper[4730]: I0320 16:26:00.149044    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:26:00 crc kubenswrapper[4730]: I0320 16:26:00.149640    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567066-7w6j9"]
Mar 20 16:26:00 crc kubenswrapper[4730]: I0320 16:26:00.225815    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5ghs\" (UniqueName: \"kubernetes.io/projected/2996ff7e-454f-40d8-bc5c-894c45b7a58c-kube-api-access-b5ghs\") pod \"auto-csr-approver-29567066-7w6j9\" (UID: \"2996ff7e-454f-40d8-bc5c-894c45b7a58c\") " pod="openshift-infra/auto-csr-approver-29567066-7w6j9"
Mar 20 16:26:00 crc kubenswrapper[4730]: I0320 16:26:00.327849    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5ghs\" (UniqueName: \"kubernetes.io/projected/2996ff7e-454f-40d8-bc5c-894c45b7a58c-kube-api-access-b5ghs\") pod \"auto-csr-approver-29567066-7w6j9\" (UID: \"2996ff7e-454f-40d8-bc5c-894c45b7a58c\") " pod="openshift-infra/auto-csr-approver-29567066-7w6j9"
Mar 20 16:26:00 crc kubenswrapper[4730]: I0320 16:26:00.348329    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5ghs\" (UniqueName: \"kubernetes.io/projected/2996ff7e-454f-40d8-bc5c-894c45b7a58c-kube-api-access-b5ghs\") pod \"auto-csr-approver-29567066-7w6j9\" (UID: \"2996ff7e-454f-40d8-bc5c-894c45b7a58c\") " pod="openshift-infra/auto-csr-approver-29567066-7w6j9"
Mar 20 16:26:00 crc kubenswrapper[4730]: I0320 16:26:00.464808    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567066-7w6j9"
Mar 20 16:26:00 crc kubenswrapper[4730]: I0320 16:26:00.925043    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567066-7w6j9"]
Mar 20 16:26:01 crc kubenswrapper[4730]: I0320 16:26:01.781138    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567066-7w6j9" event={"ID":"2996ff7e-454f-40d8-bc5c-894c45b7a58c","Type":"ContainerStarted","Data":"0b348ada8776d11aa72321a4e5c9a7e6f704ae7f87da09179dd7eb2ae8dcbedb"}
Mar 20 16:26:02 crc kubenswrapper[4730]: I0320 16:26:02.790747    4730 generic.go:334] "Generic (PLEG): container finished" podID="2996ff7e-454f-40d8-bc5c-894c45b7a58c" containerID="1bc53c769c23ddf21b26ded07f20d42f23d89029cfc127b5683d217f18b840d6" exitCode=0
Mar 20 16:26:02 crc kubenswrapper[4730]: I0320 16:26:02.790924    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567066-7w6j9" event={"ID":"2996ff7e-454f-40d8-bc5c-894c45b7a58c","Type":"ContainerDied","Data":"1bc53c769c23ddf21b26ded07f20d42f23d89029cfc127b5683d217f18b840d6"}
Mar 20 16:26:04 crc kubenswrapper[4730]: I0320 16:26:04.327505    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567066-7w6j9"
Mar 20 16:26:04 crc kubenswrapper[4730]: I0320 16:26:04.411094    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5ghs\" (UniqueName: \"kubernetes.io/projected/2996ff7e-454f-40d8-bc5c-894c45b7a58c-kube-api-access-b5ghs\") pod \"2996ff7e-454f-40d8-bc5c-894c45b7a58c\" (UID: \"2996ff7e-454f-40d8-bc5c-894c45b7a58c\") "
Mar 20 16:26:04 crc kubenswrapper[4730]: I0320 16:26:04.415779    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2996ff7e-454f-40d8-bc5c-894c45b7a58c-kube-api-access-b5ghs" (OuterVolumeSpecName: "kube-api-access-b5ghs") pod "2996ff7e-454f-40d8-bc5c-894c45b7a58c" (UID: "2996ff7e-454f-40d8-bc5c-894c45b7a58c"). InnerVolumeSpecName "kube-api-access-b5ghs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:26:04 crc kubenswrapper[4730]: I0320 16:26:04.513689    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5ghs\" (UniqueName: \"kubernetes.io/projected/2996ff7e-454f-40d8-bc5c-894c45b7a58c-kube-api-access-b5ghs\") on node \"crc\" DevicePath \"\""
Mar 20 16:26:04 crc kubenswrapper[4730]: I0320 16:26:04.837858    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567066-7w6j9" event={"ID":"2996ff7e-454f-40d8-bc5c-894c45b7a58c","Type":"ContainerDied","Data":"0b348ada8776d11aa72321a4e5c9a7e6f704ae7f87da09179dd7eb2ae8dcbedb"}
Mar 20 16:26:04 crc kubenswrapper[4730]: I0320 16:26:04.837897    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b348ada8776d11aa72321a4e5c9a7e6f704ae7f87da09179dd7eb2ae8dcbedb"
Mar 20 16:26:04 crc kubenswrapper[4730]: I0320 16:26:04.837909    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567066-7w6j9"
Mar 20 16:26:05 crc kubenswrapper[4730]: I0320 16:26:05.420391    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567060-dbzpk"]
Mar 20 16:26:05 crc kubenswrapper[4730]: I0320 16:26:05.442270    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567060-dbzpk"]
Mar 20 16:26:05 crc kubenswrapper[4730]: I0320 16:26:05.545380    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95b03afb-153e-4f6b-a88a-e64d8a889b97" path="/var/lib/kubelet/pods/95b03afb-153e-4f6b-a88a-e64d8a889b97/volumes"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.050988    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"]
Mar 20 16:26:11 crc kubenswrapper[4730]: E0320 16:26:11.053145    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2996ff7e-454f-40d8-bc5c-894c45b7a58c" containerName="oc"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.053180    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="2996ff7e-454f-40d8-bc5c-894c45b7a58c" containerName="oc"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.053421    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="2996ff7e-454f-40d8-bc5c-894c45b7a58c" containerName="oc"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.054442    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.061082    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.091690    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"]
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.104881    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-nfs-0"]
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.107155    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.109092    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-nfs-config-data"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.133399    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bde1710-3861-42cb-8647-292785ee4392-scripts\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.133454    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/0bde1710-3861-42cb-8647-292785ee4392-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.133507    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bde1710-3861-42cb-8647-292785ee4392-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.133538    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0bde1710-3861-42cb-8647-292785ee4392-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.133584    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0bde1710-3861-42cb-8647-292785ee4392-lib-modules\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.133605    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0bde1710-3861-42cb-8647-292785ee4392-dev\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.133624    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bde1710-3861-42cb-8647-292785ee4392-config-data\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.133646    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0bde1710-3861-42cb-8647-292785ee4392-etc-nvme\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.133665    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0bde1710-3861-42cb-8647-292785ee4392-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.133707    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/0bde1710-3861-42cb-8647-292785ee4392-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.133735    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0bde1710-3861-42cb-8647-292785ee4392-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.133752    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0bde1710-3861-42cb-8647-292785ee4392-run\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.133777    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0bde1710-3861-42cb-8647-292785ee4392-sys\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.133801    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0bde1710-3861-42cb-8647-292785ee4392-config-data-custom\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.133822    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h764b\" (UniqueName: \"kubernetes.io/projected/0bde1710-3861-42cb-8647-292785ee4392-kube-api-access-h764b\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.140108    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-0"]
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.215562    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-nfs-2-0"]
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.217178    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.223625    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-nfs-2-config-data"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.228568    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-2-0"]
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.235100    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0bde1710-3861-42cb-8647-292785ee4392-dev\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.235155    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz5n7\" (UniqueName: \"kubernetes.io/projected/436a7a40-7823-4670-a107-ff5ca02da822-kube-api-access-vz5n7\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.235178    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bde1710-3861-42cb-8647-292785ee4392-config-data\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.235199    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/436a7a40-7823-4670-a107-ff5ca02da822-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.235217    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/436a7a40-7823-4670-a107-ff5ca02da822-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.235235    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0bde1710-3861-42cb-8647-292785ee4392-etc-nvme\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.235270    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0bde1710-3861-42cb-8647-292785ee4392-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.235307    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/436a7a40-7823-4670-a107-ff5ca02da822-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.235328    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/0bde1710-3861-42cb-8647-292785ee4392-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.235353    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/436a7a40-7823-4670-a107-ff5ca02da822-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.235373    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0bde1710-3861-42cb-8647-292785ee4392-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.235388    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0bde1710-3861-42cb-8647-292785ee4392-run\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.235409    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/436a7a40-7823-4670-a107-ff5ca02da822-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.235445    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/436a7a40-7823-4670-a107-ff5ca02da822-dev\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.235467    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0bde1710-3861-42cb-8647-292785ee4392-sys\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.235491    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0bde1710-3861-42cb-8647-292785ee4392-config-data-custom\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.235506    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/436a7a40-7823-4670-a107-ff5ca02da822-sys\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.235522    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h764b\" (UniqueName: \"kubernetes.io/projected/0bde1710-3861-42cb-8647-292785ee4392-kube-api-access-h764b\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.235542    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/436a7a40-7823-4670-a107-ff5ca02da822-run\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.235558    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/436a7a40-7823-4670-a107-ff5ca02da822-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.235578    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bde1710-3861-42cb-8647-292785ee4392-scripts\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.235595    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/436a7a40-7823-4670-a107-ff5ca02da822-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.235612    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/0bde1710-3861-42cb-8647-292785ee4392-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.235658    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bde1710-3861-42cb-8647-292785ee4392-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.235674    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/436a7a40-7823-4670-a107-ff5ca02da822-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.235694    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/436a7a40-7823-4670-a107-ff5ca02da822-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.235711    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0bde1710-3861-42cb-8647-292785ee4392-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.235733    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/436a7a40-7823-4670-a107-ff5ca02da822-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.235757    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/436a7a40-7823-4670-a107-ff5ca02da822-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.235773    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0bde1710-3861-42cb-8647-292785ee4392-lib-modules\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.235853    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0bde1710-3861-42cb-8647-292785ee4392-lib-modules\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.235887    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0bde1710-3861-42cb-8647-292785ee4392-dev\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.238798    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/0bde1710-3861-42cb-8647-292785ee4392-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.241859    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/0bde1710-3861-42cb-8647-292785ee4392-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.242155    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0bde1710-3861-42cb-8647-292785ee4392-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.242338    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0bde1710-3861-42cb-8647-292785ee4392-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.242382    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0bde1710-3861-42cb-8647-292785ee4392-run\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.242417    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0bde1710-3861-42cb-8647-292785ee4392-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.242471    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0bde1710-3861-42cb-8647-292785ee4392-sys\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.246339    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0bde1710-3861-42cb-8647-292785ee4392-etc-nvme\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.247081    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0bde1710-3861-42cb-8647-292785ee4392-config-data-custom\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.247317    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bde1710-3861-42cb-8647-292785ee4392-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.261402    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bde1710-3861-42cb-8647-292785ee4392-scripts\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.268237    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bde1710-3861-42cb-8647-292785ee4392-config-data\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.278017    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h764b\" (UniqueName: \"kubernetes.io/projected/0bde1710-3861-42cb-8647-292785ee4392-kube-api-access-h764b\") pod \"cinder-backup-0\" (UID: \"0bde1710-3861-42cb-8647-292785ee4392\") " pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.338164    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/20675030-52b7-4f1d-b087-d7703a59f5e1-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.338230    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/20675030-52b7-4f1d-b087-d7703a59f5e1-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.338285    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/436a7a40-7823-4670-a107-ff5ca02da822-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.338313    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/436a7a40-7823-4670-a107-ff5ca02da822-dev\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.338363    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4b6z\" (UniqueName: \"kubernetes.io/projected/20675030-52b7-4f1d-b087-d7703a59f5e1-kube-api-access-t4b6z\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.338387    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/436a7a40-7823-4670-a107-ff5ca02da822-sys\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.338420    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/436a7a40-7823-4670-a107-ff5ca02da822-run\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.338446    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/436a7a40-7823-4670-a107-ff5ca02da822-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.338474    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/20675030-52b7-4f1d-b087-d7703a59f5e1-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.338506    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20675030-52b7-4f1d-b087-d7703a59f5e1-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.338535    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/436a7a40-7823-4670-a107-ff5ca02da822-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.338580    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/20675030-52b7-4f1d-b087-d7703a59f5e1-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.338613    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/20675030-52b7-4f1d-b087-d7703a59f5e1-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.338637    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/436a7a40-7823-4670-a107-ff5ca02da822-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.338669    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/436a7a40-7823-4670-a107-ff5ca02da822-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.338697    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/20675030-52b7-4f1d-b087-d7703a59f5e1-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.338728    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/436a7a40-7823-4670-a107-ff5ca02da822-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.338756    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/20675030-52b7-4f1d-b087-d7703a59f5e1-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.338783    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20675030-52b7-4f1d-b087-d7703a59f5e1-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.338815    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/436a7a40-7823-4670-a107-ff5ca02da822-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.338840    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20675030-52b7-4f1d-b087-d7703a59f5e1-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.338861    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20675030-52b7-4f1d-b087-d7703a59f5e1-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.338900    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20675030-52b7-4f1d-b087-d7703a59f5e1-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.338922    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz5n7\" (UniqueName: \"kubernetes.io/projected/436a7a40-7823-4670-a107-ff5ca02da822-kube-api-access-vz5n7\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.338951    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/436a7a40-7823-4670-a107-ff5ca02da822-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.338973    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/436a7a40-7823-4670-a107-ff5ca02da822-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.339022    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/20675030-52b7-4f1d-b087-d7703a59f5e1-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.339054    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/20675030-52b7-4f1d-b087-d7703a59f5e1-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.339082    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/436a7a40-7823-4670-a107-ff5ca02da822-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.339135    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/436a7a40-7823-4670-a107-ff5ca02da822-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.342410    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/436a7a40-7823-4670-a107-ff5ca02da822-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.342509    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/436a7a40-7823-4670-a107-ff5ca02da822-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.342544    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/436a7a40-7823-4670-a107-ff5ca02da822-dev\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.342580    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/436a7a40-7823-4670-a107-ff5ca02da822-sys\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.342606    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/436a7a40-7823-4670-a107-ff5ca02da822-run\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.342652    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/436a7a40-7823-4670-a107-ff5ca02da822-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.342694    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/436a7a40-7823-4670-a107-ff5ca02da822-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.343490    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/436a7a40-7823-4670-a107-ff5ca02da822-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.343572    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/436a7a40-7823-4670-a107-ff5ca02da822-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.343825    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/436a7a40-7823-4670-a107-ff5ca02da822-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.346987    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/436a7a40-7823-4670-a107-ff5ca02da822-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.347416    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/436a7a40-7823-4670-a107-ff5ca02da822-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.354273    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/436a7a40-7823-4670-a107-ff5ca02da822-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.355035    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/436a7a40-7823-4670-a107-ff5ca02da822-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.361857    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz5n7\" (UniqueName: \"kubernetes.io/projected/436a7a40-7823-4670-a107-ff5ca02da822-kube-api-access-vz5n7\") pod \"cinder-volume-nfs-0\" (UID: \"436a7a40-7823-4670-a107-ff5ca02da822\") " pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.372057    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.423854    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.440759    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/20675030-52b7-4f1d-b087-d7703a59f5e1-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.440816    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/20675030-52b7-4f1d-b087-d7703a59f5e1-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.440871    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/20675030-52b7-4f1d-b087-d7703a59f5e1-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.440888    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20675030-52b7-4f1d-b087-d7703a59f5e1-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.440914    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20675030-52b7-4f1d-b087-d7703a59f5e1-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.440923    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/20675030-52b7-4f1d-b087-d7703a59f5e1-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.440976    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20675030-52b7-4f1d-b087-d7703a59f5e1-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.440980    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/20675030-52b7-4f1d-b087-d7703a59f5e1-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.440930    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20675030-52b7-4f1d-b087-d7703a59f5e1-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.441022    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/20675030-52b7-4f1d-b087-d7703a59f5e1-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.441067    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20675030-52b7-4f1d-b087-d7703a59f5e1-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.441266    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/20675030-52b7-4f1d-b087-d7703a59f5e1-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.441310    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/20675030-52b7-4f1d-b087-d7703a59f5e1-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.441403    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/20675030-52b7-4f1d-b087-d7703a59f5e1-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.441440    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/20675030-52b7-4f1d-b087-d7703a59f5e1-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.441513    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4b6z\" (UniqueName: \"kubernetes.io/projected/20675030-52b7-4f1d-b087-d7703a59f5e1-kube-api-access-t4b6z\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.441589    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/20675030-52b7-4f1d-b087-d7703a59f5e1-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.441629    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20675030-52b7-4f1d-b087-d7703a59f5e1-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.441643    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/20675030-52b7-4f1d-b087-d7703a59f5e1-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.441679    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/20675030-52b7-4f1d-b087-d7703a59f5e1-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.441702    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/20675030-52b7-4f1d-b087-d7703a59f5e1-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.441701    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/20675030-52b7-4f1d-b087-d7703a59f5e1-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.441825    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/20675030-52b7-4f1d-b087-d7703a59f5e1-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.441974    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/20675030-52b7-4f1d-b087-d7703a59f5e1-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.442012    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/20675030-52b7-4f1d-b087-d7703a59f5e1-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.445515    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20675030-52b7-4f1d-b087-d7703a59f5e1-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.446550    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20675030-52b7-4f1d-b087-d7703a59f5e1-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.450210    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20675030-52b7-4f1d-b087-d7703a59f5e1-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.451333    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20675030-52b7-4f1d-b087-d7703a59f5e1-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.461002    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4b6z\" (UniqueName: \"kubernetes.io/projected/20675030-52b7-4f1d-b087-d7703a59f5e1-kube-api-access-t4b6z\") pod \"cinder-volume-nfs-2-0\" (UID: \"20675030-52b7-4f1d-b087-d7703a59f5e1\") " pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:11 crc kubenswrapper[4730]: I0320 16:26:11.548893    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:12 crc kubenswrapper[4730]: I0320 16:26:12.075732    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"]
Mar 20 16:26:12 crc kubenswrapper[4730]: I0320 16:26:12.199495    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-0"]
Mar 20 16:26:12 crc kubenswrapper[4730]: I0320 16:26:12.317983    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-2-0"]
Mar 20 16:26:12 crc kubenswrapper[4730]: I0320 16:26:12.924304    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"0bde1710-3861-42cb-8647-292785ee4392","Type":"ContainerStarted","Data":"685c1d13a0db4e2ea0fc6d07cdce729cef52e4b3b6f558f92d6a241eb761edaf"}
Mar 20 16:26:12 crc kubenswrapper[4730]: I0320 16:26:12.925607    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"20675030-52b7-4f1d-b087-d7703a59f5e1","Type":"ContainerStarted","Data":"230f866f4bbf8aaf3b8b2e46381fba766c863e7ffc2853c343419bf0a04c7356"}
Mar 20 16:26:12 crc kubenswrapper[4730]: I0320 16:26:12.928528    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"436a7a40-7823-4670-a107-ff5ca02da822","Type":"ContainerStarted","Data":"515e994ed7ed84d55410946a62a82441d3159b32e8e0ba716c111cf102becdce"}
Mar 20 16:26:13 crc kubenswrapper[4730]: I0320 16:26:13.958687    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"20675030-52b7-4f1d-b087-d7703a59f5e1","Type":"ContainerStarted","Data":"244a1a9da01b80a2dec1f2aad86cbb80808c91e9b5141f3019c87685f9ee1f67"}
Mar 20 16:26:13 crc kubenswrapper[4730]: I0320 16:26:13.959391    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"20675030-52b7-4f1d-b087-d7703a59f5e1","Type":"ContainerStarted","Data":"98c4cc15882d4d9c4f39cab3aeaf386a68e358cfed7432474997137d22f43381"}
Mar 20 16:26:13 crc kubenswrapper[4730]: I0320 16:26:13.979094    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"436a7a40-7823-4670-a107-ff5ca02da822","Type":"ContainerStarted","Data":"8d7b2fd6a6a3f93db5fcfb585196006bb7bd8578570a1903fcc7ac89695494c6"}
Mar 20 16:26:13 crc kubenswrapper[4730]: I0320 16:26:13.979177    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"436a7a40-7823-4670-a107-ff5ca02da822","Type":"ContainerStarted","Data":"e94a36ea4cfee2e3e5488295127b81b7b45cd1eb7a47adee15169f50919eebf7"}
Mar 20 16:26:13 crc kubenswrapper[4730]: I0320 16:26:13.987501    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"0bde1710-3861-42cb-8647-292785ee4392","Type":"ContainerStarted","Data":"055fac303ea9ea6a10e34f284632763ac00812becd9a3d2658a8c31f1e8ac202"}
Mar 20 16:26:13 crc kubenswrapper[4730]: I0320 16:26:13.987558    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"0bde1710-3861-42cb-8647-292785ee4392","Type":"ContainerStarted","Data":"d813994f52ffcff4e1cd06d55a94248aab665c57861602d1352bd70e1511de1d"}
Mar 20 16:26:13 crc kubenswrapper[4730]: I0320 16:26:13.999173    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-nfs-2-0" podStartSLOduration=2.520349264 podStartE2EDuration="2.999146832s" podCreationTimestamp="2026-03-20 16:26:11 +0000 UTC" firstStartedPulling="2026-03-20 16:26:12.325493126 +0000 UTC m=+2831.538864505" lastFinishedPulling="2026-03-20 16:26:12.804290704 +0000 UTC m=+2832.017662073" observedRunningTime="2026-03-20 16:26:13.98958775 +0000 UTC m=+2833.202959119" watchObservedRunningTime="2026-03-20 16:26:13.999146832 +0000 UTC m=+2833.212518201"
Mar 20 16:26:14 crc kubenswrapper[4730]: I0320 16:26:14.027706    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-nfs-0" podStartSLOduration=2.471252346 podStartE2EDuration="3.027690204s" podCreationTimestamp="2026-03-20 16:26:11 +0000 UTC" firstStartedPulling="2026-03-20 16:26:12.247082504 +0000 UTC m=+2831.460453873" lastFinishedPulling="2026-03-20 16:26:12.803520362 +0000 UTC m=+2832.016891731" observedRunningTime="2026-03-20 16:26:14.024895655 +0000 UTC m=+2833.238267034" watchObservedRunningTime="2026-03-20 16:26:14.027690204 +0000 UTC m=+2833.241061573"
Mar 20 16:26:14 crc kubenswrapper[4730]: I0320 16:26:14.064163    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.420906774 podStartE2EDuration="3.064141902s" podCreationTimestamp="2026-03-20 16:26:11 +0000 UTC" firstStartedPulling="2026-03-20 16:26:12.081727638 +0000 UTC m=+2831.295099007" lastFinishedPulling="2026-03-20 16:26:12.724962746 +0000 UTC m=+2831.938334135" observedRunningTime="2026-03-20 16:26:14.051378758 +0000 UTC m=+2833.264750127" watchObservedRunningTime="2026-03-20 16:26:14.064141902 +0000 UTC m=+2833.277513271"
Mar 20 16:26:16 crc kubenswrapper[4730]: I0320 16:26:16.373206    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0"
Mar 20 16:26:16 crc kubenswrapper[4730]: I0320 16:26:16.425268    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:16 crc kubenswrapper[4730]: I0320 16:26:16.550191    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:21 crc kubenswrapper[4730]: I0320 16:26:21.570375    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0"
Mar 20 16:26:21 crc kubenswrapper[4730]: I0320 16:26:21.704261    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-nfs-0"
Mar 20 16:26:21 crc kubenswrapper[4730]: I0320 16:26:21.865192    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-nfs-2-0"
Mar 20 16:26:49 crc kubenswrapper[4730]: I0320 16:26:49.849962    4730 scope.go:117] "RemoveContainer" containerID="fea91f3e29a9829ef950c2b8d1b25f02c8cfb65d5e94d7557e912ffa33559bad"
Mar 20 16:27:21 crc kubenswrapper[4730]: I0320 16:27:21.455916    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 20 16:27:21 crc kubenswrapper[4730]: I0320 16:27:21.457075    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="b9474555-d03c-4f34-8914-15b7654ec76e" containerName="prometheus" containerID="cri-o://8be7e9c76c5955e48d7fb228ba7f64fc7c79c7a945b0f730c1eb3ac871d2f2ce" gracePeriod=600
Mar 20 16:27:21 crc kubenswrapper[4730]: I0320 16:27:21.457143    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="b9474555-d03c-4f34-8914-15b7654ec76e" containerName="thanos-sidecar" containerID="cri-o://bf089e92ae421a8920cefe87896cf3bb8f1ad22d0fc9bc224fb423d5346400e7" gracePeriod=600
Mar 20 16:27:21 crc kubenswrapper[4730]: I0320 16:27:21.457150    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="b9474555-d03c-4f34-8914-15b7654ec76e" containerName="config-reloader" containerID="cri-o://df8727da1c49b4db84c764524b2f4d737c0c03808dcebb6fb23caea5272a7aec" gracePeriod=600
Mar 20 16:27:21 crc kubenswrapper[4730]: I0320 16:27:21.627051    4730 generic.go:334] "Generic (PLEG): container finished" podID="b9474555-d03c-4f34-8914-15b7654ec76e" containerID="bf089e92ae421a8920cefe87896cf3bb8f1ad22d0fc9bc224fb423d5346400e7" exitCode=0
Mar 20 16:27:21 crc kubenswrapper[4730]: I0320 16:27:21.627081    4730 generic.go:334] "Generic (PLEG): container finished" podID="b9474555-d03c-4f34-8914-15b7654ec76e" containerID="8be7e9c76c5955e48d7fb228ba7f64fc7c79c7a945b0f730c1eb3ac871d2f2ce" exitCode=0
Mar 20 16:27:21 crc kubenswrapper[4730]: I0320 16:27:21.627100    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b9474555-d03c-4f34-8914-15b7654ec76e","Type":"ContainerDied","Data":"bf089e92ae421a8920cefe87896cf3bb8f1ad22d0fc9bc224fb423d5346400e7"}
Mar 20 16:27:21 crc kubenswrapper[4730]: I0320 16:27:21.627123    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b9474555-d03c-4f34-8914-15b7654ec76e","Type":"ContainerDied","Data":"8be7e9c76c5955e48d7fb228ba7f64fc7c79c7a945b0f730c1eb3ac871d2f2ce"}
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.577098    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.660455    4730 generic.go:334] "Generic (PLEG): container finished" podID="b9474555-d03c-4f34-8914-15b7654ec76e" containerID="df8727da1c49b4db84c764524b2f4d737c0c03808dcebb6fb23caea5272a7aec" exitCode=0
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.660506    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b9474555-d03c-4f34-8914-15b7654ec76e","Type":"ContainerDied","Data":"df8727da1c49b4db84c764524b2f4d737c0c03808dcebb6fb23caea5272a7aec"}
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.660538    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b9474555-d03c-4f34-8914-15b7654ec76e","Type":"ContainerDied","Data":"8f7081ac79f5f8ab5d11083740ef5a60bf4e5c0ec09313ba816c9290b7a2077b"}
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.660557    4730 scope.go:117] "RemoveContainer" containerID="bf089e92ae421a8920cefe87896cf3bb8f1ad22d0fc9bc224fb423d5346400e7"
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.660722    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.681168    4730 scope.go:117] "RemoveContainer" containerID="df8727da1c49b4db84c764524b2f4d737c0c03808dcebb6fb23caea5272a7aec"
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.700064    4730 scope.go:117] "RemoveContainer" containerID="8be7e9c76c5955e48d7fb228ba7f64fc7c79c7a945b0f730c1eb3ac871d2f2ce"
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.728770    4730 scope.go:117] "RemoveContainer" containerID="c6f36cf8613ae0c9fc8870f685a85cb84e12a16bcd52f187f659c8895c86bf85"
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.731823    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"b9474555-d03c-4f34-8914-15b7654ec76e\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") "
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.731881    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"b9474555-d03c-4f34-8914-15b7654ec76e\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") "
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.731971    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b9474555-d03c-4f34-8914-15b7654ec76e-config-out\") pod \"b9474555-d03c-4f34-8914-15b7654ec76e\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") "
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.732007    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-secret-combined-ca-bundle\") pod \"b9474555-d03c-4f34-8914-15b7654ec76e\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") "
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.732039    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-thanos-prometheus-http-client-file\") pod \"b9474555-d03c-4f34-8914-15b7654ec76e\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") "
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.732085    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-config\") pod \"b9474555-d03c-4f34-8914-15b7654ec76e\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") "
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.732226    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/b9474555-d03c-4f34-8914-15b7654ec76e-prometheus-metric-storage-rulefiles-2\") pod \"b9474555-d03c-4f34-8914-15b7654ec76e\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") "
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.732342    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b9474555-d03c-4f34-8914-15b7654ec76e-tls-assets\") pod \"b9474555-d03c-4f34-8914-15b7654ec76e\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") "
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.732367    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-web-config\") pod \"b9474555-d03c-4f34-8914-15b7654ec76e\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") "
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.732470    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/b9474555-d03c-4f34-8914-15b7654ec76e-prometheus-metric-storage-rulefiles-1\") pod \"b9474555-d03c-4f34-8914-15b7654ec76e\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") "
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.732501    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55vpw\" (UniqueName: \"kubernetes.io/projected/b9474555-d03c-4f34-8914-15b7654ec76e-kube-api-access-55vpw\") pod \"b9474555-d03c-4f34-8914-15b7654ec76e\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") "
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.732600    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\") pod \"b9474555-d03c-4f34-8914-15b7654ec76e\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") "
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.732645    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b9474555-d03c-4f34-8914-15b7654ec76e-prometheus-metric-storage-rulefiles-0\") pod \"b9474555-d03c-4f34-8914-15b7654ec76e\" (UID: \"b9474555-d03c-4f34-8914-15b7654ec76e\") "
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.733432    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9474555-d03c-4f34-8914-15b7654ec76e-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "b9474555-d03c-4f34-8914-15b7654ec76e" (UID: "b9474555-d03c-4f34-8914-15b7654ec76e"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.734375    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9474555-d03c-4f34-8914-15b7654ec76e-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "b9474555-d03c-4f34-8914-15b7654ec76e" (UID: "b9474555-d03c-4f34-8914-15b7654ec76e"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.734664    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9474555-d03c-4f34-8914-15b7654ec76e-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "b9474555-d03c-4f34-8914-15b7654ec76e" (UID: "b9474555-d03c-4f34-8914-15b7654ec76e"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.739130    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "b9474555-d03c-4f34-8914-15b7654ec76e" (UID: "b9474555-d03c-4f34-8914-15b7654ec76e"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.739546    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9474555-d03c-4f34-8914-15b7654ec76e-kube-api-access-55vpw" (OuterVolumeSpecName: "kube-api-access-55vpw") pod "b9474555-d03c-4f34-8914-15b7654ec76e" (UID: "b9474555-d03c-4f34-8914-15b7654ec76e"). InnerVolumeSpecName "kube-api-access-55vpw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.739693    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-config" (OuterVolumeSpecName: "config") pod "b9474555-d03c-4f34-8914-15b7654ec76e" (UID: "b9474555-d03c-4f34-8914-15b7654ec76e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.739720    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "b9474555-d03c-4f34-8914-15b7654ec76e" (UID: "b9474555-d03c-4f34-8914-15b7654ec76e"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.739992    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "b9474555-d03c-4f34-8914-15b7654ec76e" (UID: "b9474555-d03c-4f34-8914-15b7654ec76e"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.745525    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "b9474555-d03c-4f34-8914-15b7654ec76e" (UID: "b9474555-d03c-4f34-8914-15b7654ec76e"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.745695    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9474555-d03c-4f34-8914-15b7654ec76e-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "b9474555-d03c-4f34-8914-15b7654ec76e" (UID: "b9474555-d03c-4f34-8914-15b7654ec76e"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.752169    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9474555-d03c-4f34-8914-15b7654ec76e-config-out" (OuterVolumeSpecName: "config-out") pod "b9474555-d03c-4f34-8914-15b7654ec76e" (UID: "b9474555-d03c-4f34-8914-15b7654ec76e"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.758012    4730 scope.go:117] "RemoveContainer" containerID="bf089e92ae421a8920cefe87896cf3bb8f1ad22d0fc9bc224fb423d5346400e7"
Mar 20 16:27:22 crc kubenswrapper[4730]: E0320 16:27:22.758813    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf089e92ae421a8920cefe87896cf3bb8f1ad22d0fc9bc224fb423d5346400e7\": container with ID starting with bf089e92ae421a8920cefe87896cf3bb8f1ad22d0fc9bc224fb423d5346400e7 not found: ID does not exist" containerID="bf089e92ae421a8920cefe87896cf3bb8f1ad22d0fc9bc224fb423d5346400e7"
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.758850    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf089e92ae421a8920cefe87896cf3bb8f1ad22d0fc9bc224fb423d5346400e7"} err="failed to get container status \"bf089e92ae421a8920cefe87896cf3bb8f1ad22d0fc9bc224fb423d5346400e7\": rpc error: code = NotFound desc = could not find container \"bf089e92ae421a8920cefe87896cf3bb8f1ad22d0fc9bc224fb423d5346400e7\": container with ID starting with bf089e92ae421a8920cefe87896cf3bb8f1ad22d0fc9bc224fb423d5346400e7 not found: ID does not exist"
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.758875    4730 scope.go:117] "RemoveContainer" containerID="df8727da1c49b4db84c764524b2f4d737c0c03808dcebb6fb23caea5272a7aec"
Mar 20 16:27:22 crc kubenswrapper[4730]: E0320 16:27:22.759258    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df8727da1c49b4db84c764524b2f4d737c0c03808dcebb6fb23caea5272a7aec\": container with ID starting with df8727da1c49b4db84c764524b2f4d737c0c03808dcebb6fb23caea5272a7aec not found: ID does not exist" containerID="df8727da1c49b4db84c764524b2f4d737c0c03808dcebb6fb23caea5272a7aec"
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.759278    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df8727da1c49b4db84c764524b2f4d737c0c03808dcebb6fb23caea5272a7aec"} err="failed to get container status \"df8727da1c49b4db84c764524b2f4d737c0c03808dcebb6fb23caea5272a7aec\": rpc error: code = NotFound desc = could not find container \"df8727da1c49b4db84c764524b2f4d737c0c03808dcebb6fb23caea5272a7aec\": container with ID starting with df8727da1c49b4db84c764524b2f4d737c0c03808dcebb6fb23caea5272a7aec not found: ID does not exist"
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.759292    4730 scope.go:117] "RemoveContainer" containerID="8be7e9c76c5955e48d7fb228ba7f64fc7c79c7a945b0f730c1eb3ac871d2f2ce"
Mar 20 16:27:22 crc kubenswrapper[4730]: E0320 16:27:22.759525    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8be7e9c76c5955e48d7fb228ba7f64fc7c79c7a945b0f730c1eb3ac871d2f2ce\": container with ID starting with 8be7e9c76c5955e48d7fb228ba7f64fc7c79c7a945b0f730c1eb3ac871d2f2ce not found: ID does not exist" containerID="8be7e9c76c5955e48d7fb228ba7f64fc7c79c7a945b0f730c1eb3ac871d2f2ce"
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.759579    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8be7e9c76c5955e48d7fb228ba7f64fc7c79c7a945b0f730c1eb3ac871d2f2ce"} err="failed to get container status \"8be7e9c76c5955e48d7fb228ba7f64fc7c79c7a945b0f730c1eb3ac871d2f2ce\": rpc error: code = NotFound desc = could not find container \"8be7e9c76c5955e48d7fb228ba7f64fc7c79c7a945b0f730c1eb3ac871d2f2ce\": container with ID starting with 8be7e9c76c5955e48d7fb228ba7f64fc7c79c7a945b0f730c1eb3ac871d2f2ce not found: ID does not exist"
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.759599    4730 scope.go:117] "RemoveContainer" containerID="c6f36cf8613ae0c9fc8870f685a85cb84e12a16bcd52f187f659c8895c86bf85"
Mar 20 16:27:22 crc kubenswrapper[4730]: E0320 16:27:22.759816    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6f36cf8613ae0c9fc8870f685a85cb84e12a16bcd52f187f659c8895c86bf85\": container with ID starting with c6f36cf8613ae0c9fc8870f685a85cb84e12a16bcd52f187f659c8895c86bf85 not found: ID does not exist" containerID="c6f36cf8613ae0c9fc8870f685a85cb84e12a16bcd52f187f659c8895c86bf85"
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.759831    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6f36cf8613ae0c9fc8870f685a85cb84e12a16bcd52f187f659c8895c86bf85"} err="failed to get container status \"c6f36cf8613ae0c9fc8870f685a85cb84e12a16bcd52f187f659c8895c86bf85\": rpc error: code = NotFound desc = could not find container \"c6f36cf8613ae0c9fc8870f685a85cb84e12a16bcd52f187f659c8895c86bf85\": container with ID starting with c6f36cf8613ae0c9fc8870f685a85cb84e12a16bcd52f187f659c8895c86bf85 not found: ID does not exist"
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.761618    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "b9474555-d03c-4f34-8914-15b7654ec76e" (UID: "b9474555-d03c-4f34-8914-15b7654ec76e"). InnerVolumeSpecName "pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.826156    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-web-config" (OuterVolumeSpecName: "web-config") pod "b9474555-d03c-4f34-8914-15b7654ec76e" (UID: "b9474555-d03c-4f34-8914-15b7654ec76e"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.835795    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55vpw\" (UniqueName: \"kubernetes.io/projected/b9474555-d03c-4f34-8914-15b7654ec76e-kube-api-access-55vpw\") on node \"crc\" DevicePath \"\""
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.835855    4730 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\") on node \"crc\" "
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.835871    4730 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b9474555-d03c-4f34-8914-15b7654ec76e-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\""
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.835887    4730 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\""
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.835899    4730 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\""
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.835913    4730 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b9474555-d03c-4f34-8914-15b7654ec76e-config-out\") on node \"crc\" DevicePath \"\""
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.835924    4730 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.835935    4730 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\""
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.835946    4730 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.835957    4730 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/b9474555-d03c-4f34-8914-15b7654ec76e-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\""
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.835967    4730 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b9474555-d03c-4f34-8914-15b7654ec76e-tls-assets\") on node \"crc\" DevicePath \"\""
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.835976    4730 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b9474555-d03c-4f34-8914-15b7654ec76e-web-config\") on node \"crc\" DevicePath \"\""
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.835987    4730 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/b9474555-d03c-4f34-8914-15b7654ec76e-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\""
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.875144    4730 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.875300    4730 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d") on node "crc"
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.937397    4730 reconciler_common.go:293] "Volume detached for volume \"pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\") on node \"crc\" DevicePath \"\""
Mar 20 16:27:22 crc kubenswrapper[4730]: I0320 16:27:22.998466    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.015216    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.040413    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 20 16:27:23 crc kubenswrapper[4730]: E0320 16:27:23.040875    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9474555-d03c-4f34-8914-15b7654ec76e" containerName="init-config-reloader"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.040902    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9474555-d03c-4f34-8914-15b7654ec76e" containerName="init-config-reloader"
Mar 20 16:27:23 crc kubenswrapper[4730]: E0320 16:27:23.040936    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9474555-d03c-4f34-8914-15b7654ec76e" containerName="config-reloader"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.040945    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9474555-d03c-4f34-8914-15b7654ec76e" containerName="config-reloader"
Mar 20 16:27:23 crc kubenswrapper[4730]: E0320 16:27:23.040975    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9474555-d03c-4f34-8914-15b7654ec76e" containerName="prometheus"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.040983    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9474555-d03c-4f34-8914-15b7654ec76e" containerName="prometheus"
Mar 20 16:27:23 crc kubenswrapper[4730]: E0320 16:27:23.041000    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9474555-d03c-4f34-8914-15b7654ec76e" containerName="thanos-sidecar"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.041008    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9474555-d03c-4f34-8914-15b7654ec76e" containerName="thanos-sidecar"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.041267    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9474555-d03c-4f34-8914-15b7654ec76e" containerName="thanos-sidecar"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.041287    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9474555-d03c-4f34-8914-15b7654ec76e" containerName="prometheus"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.041299    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9474555-d03c-4f34-8914-15b7654ec76e" containerName="config-reloader"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.043613    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.045672    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.045805    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.046145    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.046323    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.046436    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.046939    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.047734    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-2q5k6"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.054332    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.061854    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.141091    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/457a736d-6c3f-486d-b8d1-fef19df33e26-config\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.141141    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/457a736d-6c3f-486d-b8d1-fef19df33e26-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.141175    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/457a736d-6c3f-486d-b8d1-fef19df33e26-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.141196    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/457a736d-6c3f-486d-b8d1-fef19df33e26-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.141233    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/457a736d-6c3f-486d-b8d1-fef19df33e26-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.141492    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/457a736d-6c3f-486d-b8d1-fef19df33e26-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.141559    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.141766    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/457a736d-6c3f-486d-b8d1-fef19df33e26-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.141845    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/457a736d-6c3f-486d-b8d1-fef19df33e26-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.141905    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/457a736d-6c3f-486d-b8d1-fef19df33e26-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.141947    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/457a736d-6c3f-486d-b8d1-fef19df33e26-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.141969    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2pth\" (UniqueName: \"kubernetes.io/projected/457a736d-6c3f-486d-b8d1-fef19df33e26-kube-api-access-s2pth\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.141995    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/457a736d-6c3f-486d-b8d1-fef19df33e26-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.243876    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2pth\" (UniqueName: \"kubernetes.io/projected/457a736d-6c3f-486d-b8d1-fef19df33e26-kube-api-access-s2pth\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.243928    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/457a736d-6c3f-486d-b8d1-fef19df33e26-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.243976    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/457a736d-6c3f-486d-b8d1-fef19df33e26-config\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.243997    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/457a736d-6c3f-486d-b8d1-fef19df33e26-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.244041    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/457a736d-6c3f-486d-b8d1-fef19df33e26-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.244056    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/457a736d-6c3f-486d-b8d1-fef19df33e26-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.244095    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/457a736d-6c3f-486d-b8d1-fef19df33e26-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.244138    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/457a736d-6c3f-486d-b8d1-fef19df33e26-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.244157    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.244219    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/457a736d-6c3f-486d-b8d1-fef19df33e26-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.244260    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/457a736d-6c3f-486d-b8d1-fef19df33e26-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.244297    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/457a736d-6c3f-486d-b8d1-fef19df33e26-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.244326    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/457a736d-6c3f-486d-b8d1-fef19df33e26-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.245283    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/457a736d-6c3f-486d-b8d1-fef19df33e26-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.245833    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/457a736d-6c3f-486d-b8d1-fef19df33e26-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.246097    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/457a736d-6c3f-486d-b8d1-fef19df33e26-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.248798    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/457a736d-6c3f-486d-b8d1-fef19df33e26-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.249442    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/457a736d-6c3f-486d-b8d1-fef19df33e26-config\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.249816    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/457a736d-6c3f-486d-b8d1-fef19df33e26-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.251104    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/457a736d-6c3f-486d-b8d1-fef19df33e26-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.251508    4730 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.251589    4730 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6c50c5c57c27fdb24da1fcbf3a7504c7bda45f4dc15a5678e0deb708aa433733/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.252485    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/457a736d-6c3f-486d-b8d1-fef19df33e26-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.255992    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/457a736d-6c3f-486d-b8d1-fef19df33e26-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.257002    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/457a736d-6c3f-486d-b8d1-fef19df33e26-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.264235    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2pth\" (UniqueName: \"kubernetes.io/projected/457a736d-6c3f-486d-b8d1-fef19df33e26-kube-api-access-s2pth\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.264397    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/457a736d-6c3f-486d-b8d1-fef19df33e26-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.297362    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-361883a8-e26d-4e93-ae2e-72ea2afef68d\") pod \"prometheus-metric-storage-0\" (UID: \"457a736d-6c3f-486d-b8d1-fef19df33e26\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.372783    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.547283    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9474555-d03c-4f34-8914-15b7654ec76e" path="/var/lib/kubelet/pods/b9474555-d03c-4f34-8914-15b7654ec76e/volumes"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.805210    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tbwsz"]
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.808461    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tbwsz"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.817768    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tbwsz"]
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.897168    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.963442    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kv8w\" (UniqueName: \"kubernetes.io/projected/d4b708fa-9cfd-4986-b8cf-829083a898dc-kube-api-access-6kv8w\") pod \"redhat-operators-tbwsz\" (UID: \"d4b708fa-9cfd-4986-b8cf-829083a898dc\") " pod="openshift-marketplace/redhat-operators-tbwsz"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.963515    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b708fa-9cfd-4986-b8cf-829083a898dc-catalog-content\") pod \"redhat-operators-tbwsz\" (UID: \"d4b708fa-9cfd-4986-b8cf-829083a898dc\") " pod="openshift-marketplace/redhat-operators-tbwsz"
Mar 20 16:27:23 crc kubenswrapper[4730]: I0320 16:27:23.963584    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b708fa-9cfd-4986-b8cf-829083a898dc-utilities\") pod \"redhat-operators-tbwsz\" (UID: \"d4b708fa-9cfd-4986-b8cf-829083a898dc\") " pod="openshift-marketplace/redhat-operators-tbwsz"
Mar 20 16:27:24 crc kubenswrapper[4730]: I0320 16:27:24.065987    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kv8w\" (UniqueName: \"kubernetes.io/projected/d4b708fa-9cfd-4986-b8cf-829083a898dc-kube-api-access-6kv8w\") pod \"redhat-operators-tbwsz\" (UID: \"d4b708fa-9cfd-4986-b8cf-829083a898dc\") " pod="openshift-marketplace/redhat-operators-tbwsz"
Mar 20 16:27:24 crc kubenswrapper[4730]: I0320 16:27:24.066384    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b708fa-9cfd-4986-b8cf-829083a898dc-catalog-content\") pod \"redhat-operators-tbwsz\" (UID: \"d4b708fa-9cfd-4986-b8cf-829083a898dc\") " pod="openshift-marketplace/redhat-operators-tbwsz"
Mar 20 16:27:24 crc kubenswrapper[4730]: I0320 16:27:24.066458    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b708fa-9cfd-4986-b8cf-829083a898dc-utilities\") pod \"redhat-operators-tbwsz\" (UID: \"d4b708fa-9cfd-4986-b8cf-829083a898dc\") " pod="openshift-marketplace/redhat-operators-tbwsz"
Mar 20 16:27:24 crc kubenswrapper[4730]: I0320 16:27:24.066981    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b708fa-9cfd-4986-b8cf-829083a898dc-catalog-content\") pod \"redhat-operators-tbwsz\" (UID: \"d4b708fa-9cfd-4986-b8cf-829083a898dc\") " pod="openshift-marketplace/redhat-operators-tbwsz"
Mar 20 16:27:24 crc kubenswrapper[4730]: I0320 16:27:24.067040    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b708fa-9cfd-4986-b8cf-829083a898dc-utilities\") pod \"redhat-operators-tbwsz\" (UID: \"d4b708fa-9cfd-4986-b8cf-829083a898dc\") " pod="openshift-marketplace/redhat-operators-tbwsz"
Mar 20 16:27:24 crc kubenswrapper[4730]: I0320 16:27:24.090842    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kv8w\" (UniqueName: \"kubernetes.io/projected/d4b708fa-9cfd-4986-b8cf-829083a898dc-kube-api-access-6kv8w\") pod \"redhat-operators-tbwsz\" (UID: \"d4b708fa-9cfd-4986-b8cf-829083a898dc\") " pod="openshift-marketplace/redhat-operators-tbwsz"
Mar 20 16:27:24 crc kubenswrapper[4730]: I0320 16:27:24.160001    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tbwsz"
Mar 20 16:27:24 crc kubenswrapper[4730]: I0320 16:27:24.673129    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tbwsz"]
Mar 20 16:27:24 crc kubenswrapper[4730]: I0320 16:27:24.685310    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"457a736d-6c3f-486d-b8d1-fef19df33e26","Type":"ContainerStarted","Data":"a08b055a42b0787d64934a875d37cdeff6d784dccb9876ce7c9cf3a5cf1d37c4"}
Mar 20 16:27:24 crc kubenswrapper[4730]: W0320 16:27:24.687427    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4b708fa_9cfd_4986_b8cf_829083a898dc.slice/crio-bd691c4e3989104a40a71572866dcc51a30677ea52b34eea58c7ab44f894bd33 WatchSource:0}: Error finding container bd691c4e3989104a40a71572866dcc51a30677ea52b34eea58c7ab44f894bd33: Status 404 returned error can't find the container with id bd691c4e3989104a40a71572866dcc51a30677ea52b34eea58c7ab44f894bd33
Mar 20 16:27:25 crc kubenswrapper[4730]: I0320 16:27:25.693706    4730 generic.go:334] "Generic (PLEG): container finished" podID="d4b708fa-9cfd-4986-b8cf-829083a898dc" containerID="3f7a8e2bcf60a0b2f78a1591703724fb4e894f9989953a89c4c5bdecd0d31e6b" exitCode=0
Mar 20 16:27:25 crc kubenswrapper[4730]: I0320 16:27:25.693804    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbwsz" event={"ID":"d4b708fa-9cfd-4986-b8cf-829083a898dc","Type":"ContainerDied","Data":"3f7a8e2bcf60a0b2f78a1591703724fb4e894f9989953a89c4c5bdecd0d31e6b"}
Mar 20 16:27:25 crc kubenswrapper[4730]: I0320 16:27:25.694363    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbwsz" event={"ID":"d4b708fa-9cfd-4986-b8cf-829083a898dc","Type":"ContainerStarted","Data":"bd691c4e3989104a40a71572866dcc51a30677ea52b34eea58c7ab44f894bd33"}
Mar 20 16:27:25 crc kubenswrapper[4730]: I0320 16:27:25.696417    4730 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 16:27:27 crc kubenswrapper[4730]: I0320 16:27:27.714701    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"457a736d-6c3f-486d-b8d1-fef19df33e26","Type":"ContainerStarted","Data":"8484dd0c7c8c11489d2bf88865470823c2eca98ee8d5c33a4ca245b45d333516"}
Mar 20 16:27:28 crc kubenswrapper[4730]: I0320 16:27:28.726750    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbwsz" event={"ID":"d4b708fa-9cfd-4986-b8cf-829083a898dc","Type":"ContainerStarted","Data":"2c7c685807feee902d4eb4b91c7981a5208919ad906e30a3ca544ea7bc705a9f"}
Mar 20 16:27:33 crc kubenswrapper[4730]: I0320 16:27:33.769293    4730 generic.go:334] "Generic (PLEG): container finished" podID="457a736d-6c3f-486d-b8d1-fef19df33e26" containerID="8484dd0c7c8c11489d2bf88865470823c2eca98ee8d5c33a4ca245b45d333516" exitCode=0
Mar 20 16:27:33 crc kubenswrapper[4730]: I0320 16:27:33.769374    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"457a736d-6c3f-486d-b8d1-fef19df33e26","Type":"ContainerDied","Data":"8484dd0c7c8c11489d2bf88865470823c2eca98ee8d5c33a4ca245b45d333516"}
Mar 20 16:27:34 crc kubenswrapper[4730]: I0320 16:27:34.781081    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"457a736d-6c3f-486d-b8d1-fef19df33e26","Type":"ContainerStarted","Data":"1d85169860587c3477dc1f6ba3a5983294271f2fa8771c2709c4cb28da928e51"}
Mar 20 16:27:34 crc kubenswrapper[4730]: I0320 16:27:34.783223    4730 generic.go:334] "Generic (PLEG): container finished" podID="d4b708fa-9cfd-4986-b8cf-829083a898dc" containerID="2c7c685807feee902d4eb4b91c7981a5208919ad906e30a3ca544ea7bc705a9f" exitCode=0
Mar 20 16:27:34 crc kubenswrapper[4730]: I0320 16:27:34.783365    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbwsz" event={"ID":"d4b708fa-9cfd-4986-b8cf-829083a898dc","Type":"ContainerDied","Data":"2c7c685807feee902d4eb4b91c7981a5208919ad906e30a3ca544ea7bc705a9f"}
Mar 20 16:27:35 crc kubenswrapper[4730]: I0320 16:27:35.794840    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbwsz" event={"ID":"d4b708fa-9cfd-4986-b8cf-829083a898dc","Type":"ContainerStarted","Data":"761a57e8015a72556b7fa86e30ae3e038c8a644cf3b8bde8484efaceede61fb9"}
Mar 20 16:27:35 crc kubenswrapper[4730]: I0320 16:27:35.823675    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tbwsz" podStartSLOduration=3.175471431 podStartE2EDuration="12.823658508s" podCreationTimestamp="2026-03-20 16:27:23 +0000 UTC" firstStartedPulling="2026-03-20 16:27:25.696040748 +0000 UTC m=+2904.909412117" lastFinishedPulling="2026-03-20 16:27:35.344227825 +0000 UTC m=+2914.557599194" observedRunningTime="2026-03-20 16:27:35.815442647 +0000 UTC m=+2915.028814016" watchObservedRunningTime="2026-03-20 16:27:35.823658508 +0000 UTC m=+2915.037029877"
Mar 20 16:27:37 crc kubenswrapper[4730]: I0320 16:27:37.815224    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"457a736d-6c3f-486d-b8d1-fef19df33e26","Type":"ContainerStarted","Data":"e9ab1e53399e9e0391e9713bfe1c8e9c80955463e25e0a964cade0038151bd75"}
Mar 20 16:27:37 crc kubenswrapper[4730]: I0320 16:27:37.815807    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"457a736d-6c3f-486d-b8d1-fef19df33e26","Type":"ContainerStarted","Data":"9c54daf456b96e86d6df7d4ffbf0e8ecc46f035e8d4101ea255939449f22bdd7"}
Mar 20 16:27:37 crc kubenswrapper[4730]: I0320 16:27:37.861642    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=14.861619206 podStartE2EDuration="14.861619206s" podCreationTimestamp="2026-03-20 16:27:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:27:37.854139176 +0000 UTC m=+2917.067510555" watchObservedRunningTime="2026-03-20 16:27:37.861619206 +0000 UTC m=+2917.074990575"
Mar 20 16:27:38 crc kubenswrapper[4730]: I0320 16:27:38.373095    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:38 crc kubenswrapper[4730]: I0320 16:27:38.373275    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:38 crc kubenswrapper[4730]: I0320 16:27:38.379992    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:38 crc kubenswrapper[4730]: I0320 16:27:38.830528    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Mar 20 16:27:44 crc kubenswrapper[4730]: I0320 16:27:44.160384    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tbwsz"
Mar 20 16:27:44 crc kubenswrapper[4730]: I0320 16:27:44.160950    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tbwsz"
Mar 20 16:27:45 crc kubenswrapper[4730]: I0320 16:27:45.231930    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tbwsz" podUID="d4b708fa-9cfd-4986-b8cf-829083a898dc" containerName="registry-server" probeResult="failure" output=<
Mar 20 16:27:45 crc kubenswrapper[4730]:         timeout: failed to connect service ":50051" within 1s
Mar 20 16:27:45 crc kubenswrapper[4730]:  >
Mar 20 16:27:55 crc kubenswrapper[4730]: I0320 16:27:55.239551    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tbwsz" podUID="d4b708fa-9cfd-4986-b8cf-829083a898dc" containerName="registry-server" probeResult="failure" output=<
Mar 20 16:27:55 crc kubenswrapper[4730]:         timeout: failed to connect service ":50051" within 1s
Mar 20 16:27:55 crc kubenswrapper[4730]:  >
Mar 20 16:28:00 crc kubenswrapper[4730]: I0320 16:28:00.145096    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567068-d884x"]
Mar 20 16:28:00 crc kubenswrapper[4730]: I0320 16:28:00.147110    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567068-d884x"
Mar 20 16:28:00 crc kubenswrapper[4730]: I0320 16:28:00.153496    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:28:00 crc kubenswrapper[4730]: I0320 16:28:00.153889    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:28:00 crc kubenswrapper[4730]: I0320 16:28:00.154054    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 16:28:00 crc kubenswrapper[4730]: I0320 16:28:00.155732    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567068-d884x"]
Mar 20 16:28:00 crc kubenswrapper[4730]: I0320 16:28:00.244980    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8sqp\" (UniqueName: \"kubernetes.io/projected/02177dc8-25be-4462-afba-d87fda4396c6-kube-api-access-k8sqp\") pod \"auto-csr-approver-29567068-d884x\" (UID: \"02177dc8-25be-4462-afba-d87fda4396c6\") " pod="openshift-infra/auto-csr-approver-29567068-d884x"
Mar 20 16:28:00 crc kubenswrapper[4730]: I0320 16:28:00.347309    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8sqp\" (UniqueName: \"kubernetes.io/projected/02177dc8-25be-4462-afba-d87fda4396c6-kube-api-access-k8sqp\") pod \"auto-csr-approver-29567068-d884x\" (UID: \"02177dc8-25be-4462-afba-d87fda4396c6\") " pod="openshift-infra/auto-csr-approver-29567068-d884x"
Mar 20 16:28:00 crc kubenswrapper[4730]: I0320 16:28:00.367320    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8sqp\" (UniqueName: \"kubernetes.io/projected/02177dc8-25be-4462-afba-d87fda4396c6-kube-api-access-k8sqp\") pod \"auto-csr-approver-29567068-d884x\" (UID: \"02177dc8-25be-4462-afba-d87fda4396c6\") " pod="openshift-infra/auto-csr-approver-29567068-d884x"
Mar 20 16:28:00 crc kubenswrapper[4730]: I0320 16:28:00.469339    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567068-d884x"
Mar 20 16:28:00 crc kubenswrapper[4730]: I0320 16:28:00.909504    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567068-d884x"]
Mar 20 16:28:01 crc kubenswrapper[4730]: I0320 16:28:01.210163    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567068-d884x" event={"ID":"02177dc8-25be-4462-afba-d87fda4396c6","Type":"ContainerStarted","Data":"a423e995336389e4a72bf13a9fbebe0ca777c1c9a74473ff5376c942e4c6d269"}
Mar 20 16:28:03 crc kubenswrapper[4730]: I0320 16:28:03.227767    4730 generic.go:334] "Generic (PLEG): container finished" podID="02177dc8-25be-4462-afba-d87fda4396c6" containerID="82eb38bbfad57d58cc830a2f14037d88d822b6b89fbb3a03b36b1b472f369ed1" exitCode=0
Mar 20 16:28:03 crc kubenswrapper[4730]: I0320 16:28:03.227824    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567068-d884x" event={"ID":"02177dc8-25be-4462-afba-d87fda4396c6","Type":"ContainerDied","Data":"82eb38bbfad57d58cc830a2f14037d88d822b6b89fbb3a03b36b1b472f369ed1"}
Mar 20 16:28:04 crc kubenswrapper[4730]: I0320 16:28:04.582712    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567068-d884x"
Mar 20 16:28:04 crc kubenswrapper[4730]: I0320 16:28:04.640537    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8sqp\" (UniqueName: \"kubernetes.io/projected/02177dc8-25be-4462-afba-d87fda4396c6-kube-api-access-k8sqp\") pod \"02177dc8-25be-4462-afba-d87fda4396c6\" (UID: \"02177dc8-25be-4462-afba-d87fda4396c6\") "
Mar 20 16:28:04 crc kubenswrapper[4730]: I0320 16:28:04.656295    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02177dc8-25be-4462-afba-d87fda4396c6-kube-api-access-k8sqp" (OuterVolumeSpecName: "kube-api-access-k8sqp") pod "02177dc8-25be-4462-afba-d87fda4396c6" (UID: "02177dc8-25be-4462-afba-d87fda4396c6"). InnerVolumeSpecName "kube-api-access-k8sqp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:28:04 crc kubenswrapper[4730]: I0320 16:28:04.743556    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8sqp\" (UniqueName: \"kubernetes.io/projected/02177dc8-25be-4462-afba-d87fda4396c6-kube-api-access-k8sqp\") on node \"crc\" DevicePath \"\""
Mar 20 16:28:05 crc kubenswrapper[4730]: I0320 16:28:05.203349    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tbwsz" podUID="d4b708fa-9cfd-4986-b8cf-829083a898dc" containerName="registry-server" probeResult="failure" output=<
Mar 20 16:28:05 crc kubenswrapper[4730]:         timeout: failed to connect service ":50051" within 1s
Mar 20 16:28:05 crc kubenswrapper[4730]:  >
Mar 20 16:28:05 crc kubenswrapper[4730]: I0320 16:28:05.250526    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567068-d884x" event={"ID":"02177dc8-25be-4462-afba-d87fda4396c6","Type":"ContainerDied","Data":"a423e995336389e4a72bf13a9fbebe0ca777c1c9a74473ff5376c942e4c6d269"}
Mar 20 16:28:05 crc kubenswrapper[4730]: I0320 16:28:05.250562    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a423e995336389e4a72bf13a9fbebe0ca777c1c9a74473ff5376c942e4c6d269"
Mar 20 16:28:05 crc kubenswrapper[4730]: I0320 16:28:05.250625    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567068-d884x"
Mar 20 16:28:05 crc kubenswrapper[4730]: I0320 16:28:05.665713    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567062-58ww8"]
Mar 20 16:28:05 crc kubenswrapper[4730]: I0320 16:28:05.674498    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567062-58ww8"]
Mar 20 16:28:07 crc kubenswrapper[4730]: I0320 16:28:07.550904    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e053a518-d8c9-42a1-9c8d-83d5fec8de8c" path="/var/lib/kubelet/pods/e053a518-d8c9-42a1-9c8d-83d5fec8de8c/volumes"
Mar 20 16:28:12 crc kubenswrapper[4730]: I0320 16:28:12.880800    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:28:12 crc kubenswrapper[4730]: I0320 16:28:12.881573    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:28:15 crc kubenswrapper[4730]: I0320 16:28:15.205992    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tbwsz" podUID="d4b708fa-9cfd-4986-b8cf-829083a898dc" containerName="registry-server" probeResult="failure" output=<
Mar 20 16:28:15 crc kubenswrapper[4730]:         timeout: failed to connect service ":50051" within 1s
Mar 20 16:28:15 crc kubenswrapper[4730]:  >
Mar 20 16:28:17 crc kubenswrapper[4730]: I0320 16:28:17.866232    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"]
Mar 20 16:28:17 crc kubenswrapper[4730]: E0320 16:28:17.867082    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02177dc8-25be-4462-afba-d87fda4396c6" containerName="oc"
Mar 20 16:28:17 crc kubenswrapper[4730]: I0320 16:28:17.867099    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="02177dc8-25be-4462-afba-d87fda4396c6" containerName="oc"
Mar 20 16:28:17 crc kubenswrapper[4730]: I0320 16:28:17.867341    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="02177dc8-25be-4462-afba-d87fda4396c6" containerName="oc"
Mar 20 16:28:17 crc kubenswrapper[4730]: I0320 16:28:17.868152    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Mar 20 16:28:17 crc kubenswrapper[4730]: I0320 16:28:17.871741    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key"
Mar 20 16:28:17 crc kubenswrapper[4730]: I0320 16:28:17.871917    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0"
Mar 20 16:28:17 crc kubenswrapper[4730]: I0320 16:28:17.872993    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Mar 20 16:28:17 crc kubenswrapper[4730]: I0320 16:28:17.873320    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gh48g"
Mar 20 16:28:17 crc kubenswrapper[4730]: I0320 16:28:17.882516    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Mar 20 16:28:17 crc kubenswrapper[4730]: I0320 16:28:17.926563    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75t8r\" (UniqueName: \"kubernetes.io/projected/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-kube-api-access-75t8r\") pod \"tempest-tests-tempest\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") " pod="openstack/tempest-tests-tempest"
Mar 20 16:28:17 crc kubenswrapper[4730]: I0320 16:28:17.926622    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") " pod="openstack/tempest-tests-tempest"
Mar 20 16:28:17 crc kubenswrapper[4730]: I0320 16:28:17.926684    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") " pod="openstack/tempest-tests-tempest"
Mar 20 16:28:17 crc kubenswrapper[4730]: I0320 16:28:17.926712    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") " pod="openstack/tempest-tests-tempest"
Mar 20 16:28:17 crc kubenswrapper[4730]: I0320 16:28:17.926750    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-config-data\") pod \"tempest-tests-tempest\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") " pod="openstack/tempest-tests-tempest"
Mar 20 16:28:17 crc kubenswrapper[4730]: I0320 16:28:17.926769    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") " pod="openstack/tempest-tests-tempest"
Mar 20 16:28:17 crc kubenswrapper[4730]: I0320 16:28:17.926796    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") " pod="openstack/tempest-tests-tempest"
Mar 20 16:28:17 crc kubenswrapper[4730]: I0320 16:28:17.926889    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") " pod="openstack/tempest-tests-tempest"
Mar 20 16:28:17 crc kubenswrapper[4730]: I0320 16:28:17.926923    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") " pod="openstack/tempest-tests-tempest"
Mar 20 16:28:18 crc kubenswrapper[4730]: I0320 16:28:18.029029    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") " pod="openstack/tempest-tests-tempest"
Mar 20 16:28:18 crc kubenswrapper[4730]: I0320 16:28:18.029101    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-config-data\") pod \"tempest-tests-tempest\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") " pod="openstack/tempest-tests-tempest"
Mar 20 16:28:18 crc kubenswrapper[4730]: I0320 16:28:18.029123    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") " pod="openstack/tempest-tests-tempest"
Mar 20 16:28:18 crc kubenswrapper[4730]: I0320 16:28:18.029150    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") " pod="openstack/tempest-tests-tempest"
Mar 20 16:28:18 crc kubenswrapper[4730]: I0320 16:28:18.029216    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") " pod="openstack/tempest-tests-tempest"
Mar 20 16:28:18 crc kubenswrapper[4730]: I0320 16:28:18.029263    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") " pod="openstack/tempest-tests-tempest"
Mar 20 16:28:18 crc kubenswrapper[4730]: I0320 16:28:18.029318    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75t8r\" (UniqueName: \"kubernetes.io/projected/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-kube-api-access-75t8r\") pod \"tempest-tests-tempest\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") " pod="openstack/tempest-tests-tempest"
Mar 20 16:28:18 crc kubenswrapper[4730]: I0320 16:28:18.029346    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") " pod="openstack/tempest-tests-tempest"
Mar 20 16:28:18 crc kubenswrapper[4730]: I0320 16:28:18.029381    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") " pod="openstack/tempest-tests-tempest"
Mar 20 16:28:18 crc kubenswrapper[4730]: I0320 16:28:18.029681    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") " pod="openstack/tempest-tests-tempest"
Mar 20 16:28:18 crc kubenswrapper[4730]: I0320 16:28:18.029749    4730 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/tempest-tests-tempest"
Mar 20 16:28:18 crc kubenswrapper[4730]: I0320 16:28:18.029778    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") " pod="openstack/tempest-tests-tempest"
Mar 20 16:28:18 crc kubenswrapper[4730]: I0320 16:28:18.030517    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-config-data\") pod \"tempest-tests-tempest\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") " pod="openstack/tempest-tests-tempest"
Mar 20 16:28:18 crc kubenswrapper[4730]: I0320 16:28:18.031152    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") " pod="openstack/tempest-tests-tempest"
Mar 20 16:28:18 crc kubenswrapper[4730]: I0320 16:28:18.035909    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") " pod="openstack/tempest-tests-tempest"
Mar 20 16:28:18 crc kubenswrapper[4730]: I0320 16:28:18.035920    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") " pod="openstack/tempest-tests-tempest"
Mar 20 16:28:18 crc kubenswrapper[4730]: I0320 16:28:18.041107    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") " pod="openstack/tempest-tests-tempest"
Mar 20 16:28:18 crc kubenswrapper[4730]: I0320 16:28:18.045931    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75t8r\" (UniqueName: \"kubernetes.io/projected/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-kube-api-access-75t8r\") pod \"tempest-tests-tempest\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") " pod="openstack/tempest-tests-tempest"
Mar 20 16:28:18 crc kubenswrapper[4730]: I0320 16:28:18.064907    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") " pod="openstack/tempest-tests-tempest"
Mar 20 16:28:18 crc kubenswrapper[4730]: I0320 16:28:18.208032    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Mar 20 16:28:18 crc kubenswrapper[4730]: I0320 16:28:18.689280    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Mar 20 16:28:19 crc kubenswrapper[4730]: I0320 16:28:19.385133    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"c69a80b5-69a7-48c5-8ad4-5063b6cb4676","Type":"ContainerStarted","Data":"3a3b9dc78f4221095ec1a260d19a50071b9bafd10f8f90a8b372cb1bb88e13e5"}
Mar 20 16:28:24 crc kubenswrapper[4730]: I0320 16:28:24.221431    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tbwsz"
Mar 20 16:28:24 crc kubenswrapper[4730]: I0320 16:28:24.289448    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tbwsz"
Mar 20 16:28:25 crc kubenswrapper[4730]: I0320 16:28:25.019987    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tbwsz"]
Mar 20 16:28:25 crc kubenswrapper[4730]: I0320 16:28:25.450870    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tbwsz" podUID="d4b708fa-9cfd-4986-b8cf-829083a898dc" containerName="registry-server" containerID="cri-o://761a57e8015a72556b7fa86e30ae3e038c8a644cf3b8bde8484efaceede61fb9" gracePeriod=2
Mar 20 16:28:26 crc kubenswrapper[4730]: I0320 16:28:26.460119    4730 generic.go:334] "Generic (PLEG): container finished" podID="d4b708fa-9cfd-4986-b8cf-829083a898dc" containerID="761a57e8015a72556b7fa86e30ae3e038c8a644cf3b8bde8484efaceede61fb9" exitCode=0
Mar 20 16:28:26 crc kubenswrapper[4730]: I0320 16:28:26.460190    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbwsz" event={"ID":"d4b708fa-9cfd-4986-b8cf-829083a898dc","Type":"ContainerDied","Data":"761a57e8015a72556b7fa86e30ae3e038c8a644cf3b8bde8484efaceede61fb9"}
Mar 20 16:28:28 crc kubenswrapper[4730]: I0320 16:28:28.293282    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tbwsz"
Mar 20 16:28:28 crc kubenswrapper[4730]: I0320 16:28:28.370765    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b708fa-9cfd-4986-b8cf-829083a898dc-utilities\") pod \"d4b708fa-9cfd-4986-b8cf-829083a898dc\" (UID: \"d4b708fa-9cfd-4986-b8cf-829083a898dc\") "
Mar 20 16:28:28 crc kubenswrapper[4730]: I0320 16:28:28.370969    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kv8w\" (UniqueName: \"kubernetes.io/projected/d4b708fa-9cfd-4986-b8cf-829083a898dc-kube-api-access-6kv8w\") pod \"d4b708fa-9cfd-4986-b8cf-829083a898dc\" (UID: \"d4b708fa-9cfd-4986-b8cf-829083a898dc\") "
Mar 20 16:28:28 crc kubenswrapper[4730]: I0320 16:28:28.371080    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b708fa-9cfd-4986-b8cf-829083a898dc-catalog-content\") pod \"d4b708fa-9cfd-4986-b8cf-829083a898dc\" (UID: \"d4b708fa-9cfd-4986-b8cf-829083a898dc\") "
Mar 20 16:28:28 crc kubenswrapper[4730]: I0320 16:28:28.375681    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4b708fa-9cfd-4986-b8cf-829083a898dc-kube-api-access-6kv8w" (OuterVolumeSpecName: "kube-api-access-6kv8w") pod "d4b708fa-9cfd-4986-b8cf-829083a898dc" (UID: "d4b708fa-9cfd-4986-b8cf-829083a898dc"). InnerVolumeSpecName "kube-api-access-6kv8w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:28:28 crc kubenswrapper[4730]: I0320 16:28:28.376463    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4b708fa-9cfd-4986-b8cf-829083a898dc-utilities" (OuterVolumeSpecName: "utilities") pod "d4b708fa-9cfd-4986-b8cf-829083a898dc" (UID: "d4b708fa-9cfd-4986-b8cf-829083a898dc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:28:28 crc kubenswrapper[4730]: I0320 16:28:28.473092    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b708fa-9cfd-4986-b8cf-829083a898dc-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:28:28 crc kubenswrapper[4730]: I0320 16:28:28.473140    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kv8w\" (UniqueName: \"kubernetes.io/projected/d4b708fa-9cfd-4986-b8cf-829083a898dc-kube-api-access-6kv8w\") on node \"crc\" DevicePath \"\""
Mar 20 16:28:28 crc kubenswrapper[4730]: I0320 16:28:28.476928    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4b708fa-9cfd-4986-b8cf-829083a898dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4b708fa-9cfd-4986-b8cf-829083a898dc" (UID: "d4b708fa-9cfd-4986-b8cf-829083a898dc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:28:28 crc kubenswrapper[4730]: I0320 16:28:28.484651    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbwsz" event={"ID":"d4b708fa-9cfd-4986-b8cf-829083a898dc","Type":"ContainerDied","Data":"bd691c4e3989104a40a71572866dcc51a30677ea52b34eea58c7ab44f894bd33"}
Mar 20 16:28:28 crc kubenswrapper[4730]: I0320 16:28:28.484712    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tbwsz"
Mar 20 16:28:28 crc kubenswrapper[4730]: I0320 16:28:28.484713    4730 scope.go:117] "RemoveContainer" containerID="761a57e8015a72556b7fa86e30ae3e038c8a644cf3b8bde8484efaceede61fb9"
Mar 20 16:28:28 crc kubenswrapper[4730]: I0320 16:28:28.515737    4730 scope.go:117] "RemoveContainer" containerID="2c7c685807feee902d4eb4b91c7981a5208919ad906e30a3ca544ea7bc705a9f"
Mar 20 16:28:28 crc kubenswrapper[4730]: I0320 16:28:28.530195    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tbwsz"]
Mar 20 16:28:28 crc kubenswrapper[4730]: I0320 16:28:28.541534    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tbwsz"]
Mar 20 16:28:28 crc kubenswrapper[4730]: I0320 16:28:28.550896    4730 scope.go:117] "RemoveContainer" containerID="3f7a8e2bcf60a0b2f78a1591703724fb4e894f9989953a89c4c5bdecd0d31e6b"
Mar 20 16:28:28 crc kubenswrapper[4730]: I0320 16:28:28.576365    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b708fa-9cfd-4986-b8cf-829083a898dc-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 16:28:29 crc kubenswrapper[4730]: I0320 16:28:29.495173    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"c69a80b5-69a7-48c5-8ad4-5063b6cb4676","Type":"ContainerStarted","Data":"7d54b7219e1263f587317fcdebaae6f3c46012a7941ad45c24813ffa14627f5b"}
Mar 20 16:28:29 crc kubenswrapper[4730]: I0320 16:28:29.521122    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.141357662 podStartE2EDuration="13.521103426s" podCreationTimestamp="2026-03-20 16:28:16 +0000 UTC" firstStartedPulling="2026-03-20 16:28:18.692596798 +0000 UTC m=+2957.905968167" lastFinishedPulling="2026-03-20 16:28:28.072342562 +0000 UTC m=+2967.285713931" observedRunningTime="2026-03-20 16:28:29.516539797 +0000 UTC m=+2968.729911166" watchObservedRunningTime="2026-03-20 16:28:29.521103426 +0000 UTC m=+2968.734474785"
Mar 20 16:28:29 crc kubenswrapper[4730]: I0320 16:28:29.546239    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4b708fa-9cfd-4986-b8cf-829083a898dc" path="/var/lib/kubelet/pods/d4b708fa-9cfd-4986-b8cf-829083a898dc/volumes"
Mar 20 16:28:42 crc kubenswrapper[4730]: I0320 16:28:42.880872    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:28:42 crc kubenswrapper[4730]: I0320 16:28:42.881541    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:28:49 crc kubenswrapper[4730]: I0320 16:28:49.971130    4730 scope.go:117] "RemoveContainer" containerID="0ccfc2a0baaac1ea53dda2b6020b62a6460f7d7641aebee239ddb879e9c99ccb"
Mar 20 16:29:12 crc kubenswrapper[4730]: I0320 16:29:12.880311    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:29:12 crc kubenswrapper[4730]: I0320 16:29:12.881158    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:29:12 crc kubenswrapper[4730]: I0320 16:29:12.882434    4730 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf"
Mar 20 16:29:12 crc kubenswrapper[4730]: I0320 16:29:12.883510    4730 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f26b9418791737b63a44943ab58f9d5995a9697abde137f76404e07e867c6e5f"} pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 16:29:12 crc kubenswrapper[4730]: I0320 16:29:12.883576    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" containerID="cri-o://f26b9418791737b63a44943ab58f9d5995a9697abde137f76404e07e867c6e5f" gracePeriod=600
Mar 20 16:29:13 crc kubenswrapper[4730]: I0320 16:29:13.931693    4730 generic.go:334] "Generic (PLEG): container finished" podID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerID="f26b9418791737b63a44943ab58f9d5995a9697abde137f76404e07e867c6e5f" exitCode=0
Mar 20 16:29:13 crc kubenswrapper[4730]: I0320 16:29:13.931980    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerDied","Data":"f26b9418791737b63a44943ab58f9d5995a9697abde137f76404e07e867c6e5f"}
Mar 20 16:29:13 crc kubenswrapper[4730]: I0320 16:29:13.932319    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerStarted","Data":"0b0c904fd86d0d3236cc303102f0b32669f34531125a255be5df201b0dd8ef34"}
Mar 20 16:29:13 crc kubenswrapper[4730]: I0320 16:29:13.932347    4730 scope.go:117] "RemoveContainer" containerID="7c543dcd07b51b6540df53377dd2e3fd92636df6153928eaa86c229ad0e8e213"
Mar 20 16:30:00 crc kubenswrapper[4730]: I0320 16:30:00.173002    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567070-h8qb5"]
Mar 20 16:30:00 crc kubenswrapper[4730]: E0320 16:30:00.174459    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b708fa-9cfd-4986-b8cf-829083a898dc" containerName="extract-content"
Mar 20 16:30:00 crc kubenswrapper[4730]: I0320 16:30:00.174553    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b708fa-9cfd-4986-b8cf-829083a898dc" containerName="extract-content"
Mar 20 16:30:00 crc kubenswrapper[4730]: E0320 16:30:00.174613    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b708fa-9cfd-4986-b8cf-829083a898dc" containerName="registry-server"
Mar 20 16:30:00 crc kubenswrapper[4730]: I0320 16:30:00.174652    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b708fa-9cfd-4986-b8cf-829083a898dc" containerName="registry-server"
Mar 20 16:30:00 crc kubenswrapper[4730]: E0320 16:30:00.174706    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b708fa-9cfd-4986-b8cf-829083a898dc" containerName="extract-utilities"
Mar 20 16:30:00 crc kubenswrapper[4730]: I0320 16:30:00.174718    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b708fa-9cfd-4986-b8cf-829083a898dc" containerName="extract-utilities"
Mar 20 16:30:00 crc kubenswrapper[4730]: I0320 16:30:00.175098    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4b708fa-9cfd-4986-b8cf-829083a898dc" containerName="registry-server"
Mar 20 16:30:00 crc kubenswrapper[4730]: I0320 16:30:00.176175    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567070-h8qb5"
Mar 20 16:30:00 crc kubenswrapper[4730]: I0320 16:30:00.181887    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:30:00 crc kubenswrapper[4730]: I0320 16:30:00.181991    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:30:00 crc kubenswrapper[4730]: I0320 16:30:00.182059    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567070-5g4gp"]
Mar 20 16:30:00 crc kubenswrapper[4730]: I0320 16:30:00.183811    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-5g4gp"
Mar 20 16:30:00 crc kubenswrapper[4730]: I0320 16:30:00.184624    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 16:30:00 crc kubenswrapper[4730]: I0320 16:30:00.185518    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 20 16:30:00 crc kubenswrapper[4730]: I0320 16:30:00.186220    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 20 16:30:00 crc kubenswrapper[4730]: I0320 16:30:00.191533    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567070-5g4gp"]
Mar 20 16:30:00 crc kubenswrapper[4730]: I0320 16:30:00.222604    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567070-h8qb5"]
Mar 20 16:30:00 crc kubenswrapper[4730]: I0320 16:30:00.266959    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c86d92cc-d42e-496f-b31c-d6c56fb441c7-secret-volume\") pod \"collect-profiles-29567070-5g4gp\" (UID: \"c86d92cc-d42e-496f-b31c-d6c56fb441c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-5g4gp"
Mar 20 16:30:00 crc kubenswrapper[4730]: I0320 16:30:00.266998    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c86d92cc-d42e-496f-b31c-d6c56fb441c7-config-volume\") pod \"collect-profiles-29567070-5g4gp\" (UID: \"c86d92cc-d42e-496f-b31c-d6c56fb441c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-5g4gp"
Mar 20 16:30:00 crc kubenswrapper[4730]: I0320 16:30:00.267172    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk5m2\" (UniqueName: \"kubernetes.io/projected/c86d92cc-d42e-496f-b31c-d6c56fb441c7-kube-api-access-wk5m2\") pod \"collect-profiles-29567070-5g4gp\" (UID: \"c86d92cc-d42e-496f-b31c-d6c56fb441c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-5g4gp"
Mar 20 16:30:00 crc kubenswrapper[4730]: I0320 16:30:00.267198    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d94nb\" (UniqueName: \"kubernetes.io/projected/f50e1094-eded-4000-b7f3-29722d8ba695-kube-api-access-d94nb\") pod \"auto-csr-approver-29567070-h8qb5\" (UID: \"f50e1094-eded-4000-b7f3-29722d8ba695\") " pod="openshift-infra/auto-csr-approver-29567070-h8qb5"
Mar 20 16:30:00 crc kubenswrapper[4730]: I0320 16:30:00.369622    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c86d92cc-d42e-496f-b31c-d6c56fb441c7-secret-volume\") pod \"collect-profiles-29567070-5g4gp\" (UID: \"c86d92cc-d42e-496f-b31c-d6c56fb441c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-5g4gp"
Mar 20 16:30:00 crc kubenswrapper[4730]: I0320 16:30:00.369691    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c86d92cc-d42e-496f-b31c-d6c56fb441c7-config-volume\") pod \"collect-profiles-29567070-5g4gp\" (UID: \"c86d92cc-d42e-496f-b31c-d6c56fb441c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-5g4gp"
Mar 20 16:30:00 crc kubenswrapper[4730]: I0320 16:30:00.369874    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk5m2\" (UniqueName: \"kubernetes.io/projected/c86d92cc-d42e-496f-b31c-d6c56fb441c7-kube-api-access-wk5m2\") pod \"collect-profiles-29567070-5g4gp\" (UID: \"c86d92cc-d42e-496f-b31c-d6c56fb441c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-5g4gp"
Mar 20 16:30:00 crc kubenswrapper[4730]: I0320 16:30:00.369904    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d94nb\" (UniqueName: \"kubernetes.io/projected/f50e1094-eded-4000-b7f3-29722d8ba695-kube-api-access-d94nb\") pod \"auto-csr-approver-29567070-h8qb5\" (UID: \"f50e1094-eded-4000-b7f3-29722d8ba695\") " pod="openshift-infra/auto-csr-approver-29567070-h8qb5"
Mar 20 16:30:00 crc kubenswrapper[4730]: I0320 16:30:00.370688    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c86d92cc-d42e-496f-b31c-d6c56fb441c7-config-volume\") pod \"collect-profiles-29567070-5g4gp\" (UID: \"c86d92cc-d42e-496f-b31c-d6c56fb441c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-5g4gp"
Mar 20 16:30:00 crc kubenswrapper[4730]: I0320 16:30:00.386810    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c86d92cc-d42e-496f-b31c-d6c56fb441c7-secret-volume\") pod \"collect-profiles-29567070-5g4gp\" (UID: \"c86d92cc-d42e-496f-b31c-d6c56fb441c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-5g4gp"
Mar 20 16:30:00 crc kubenswrapper[4730]: I0320 16:30:00.390393    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d94nb\" (UniqueName: \"kubernetes.io/projected/f50e1094-eded-4000-b7f3-29722d8ba695-kube-api-access-d94nb\") pod \"auto-csr-approver-29567070-h8qb5\" (UID: \"f50e1094-eded-4000-b7f3-29722d8ba695\") " pod="openshift-infra/auto-csr-approver-29567070-h8qb5"
Mar 20 16:30:00 crc kubenswrapper[4730]: I0320 16:30:00.391531    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk5m2\" (UniqueName: \"kubernetes.io/projected/c86d92cc-d42e-496f-b31c-d6c56fb441c7-kube-api-access-wk5m2\") pod \"collect-profiles-29567070-5g4gp\" (UID: \"c86d92cc-d42e-496f-b31c-d6c56fb441c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-5g4gp"
Mar 20 16:30:00 crc kubenswrapper[4730]: I0320 16:30:00.513030    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567070-h8qb5"
Mar 20 16:30:00 crc kubenswrapper[4730]: I0320 16:30:00.524728    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-5g4gp"
Mar 20 16:30:00 crc kubenswrapper[4730]: I0320 16:30:00.988551    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567070-h8qb5"]
Mar 20 16:30:00 crc kubenswrapper[4730]: W0320 16:30:00.993353    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc86d92cc_d42e_496f_b31c_d6c56fb441c7.slice/crio-8b7d672e1388cfbe49cf1dd56d2c3d2ea5b0d87030b7b41cd27cd89e599652cc WatchSource:0}: Error finding container 8b7d672e1388cfbe49cf1dd56d2c3d2ea5b0d87030b7b41cd27cd89e599652cc: Status 404 returned error can't find the container with id 8b7d672e1388cfbe49cf1dd56d2c3d2ea5b0d87030b7b41cd27cd89e599652cc
Mar 20 16:30:00 crc kubenswrapper[4730]: I0320 16:30:00.998667    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567070-5g4gp"]
Mar 20 16:30:01 crc kubenswrapper[4730]: I0320 16:30:01.366995    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-5g4gp" event={"ID":"c86d92cc-d42e-496f-b31c-d6c56fb441c7","Type":"ContainerStarted","Data":"dc2c07b3766f06e0423270d40c09a7a028e4cbca82d59a060deedb7b5661816a"}
Mar 20 16:30:01 crc kubenswrapper[4730]: I0320 16:30:01.367054    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-5g4gp" event={"ID":"c86d92cc-d42e-496f-b31c-d6c56fb441c7","Type":"ContainerStarted","Data":"8b7d672e1388cfbe49cf1dd56d2c3d2ea5b0d87030b7b41cd27cd89e599652cc"}
Mar 20 16:30:01 crc kubenswrapper[4730]: I0320 16:30:01.369820    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567070-h8qb5" event={"ID":"f50e1094-eded-4000-b7f3-29722d8ba695","Type":"ContainerStarted","Data":"b09d2473c6efd02c36230d95fa3de33d5cdb445c921f8ad156b8354f2a60aa3b"}
Mar 20 16:30:01 crc kubenswrapper[4730]: I0320 16:30:01.387591    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-5g4gp" podStartSLOduration=1.3875718049999999 podStartE2EDuration="1.387571805s" podCreationTimestamp="2026-03-20 16:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:30:01.384078059 +0000 UTC m=+3060.597449448" watchObservedRunningTime="2026-03-20 16:30:01.387571805 +0000 UTC m=+3060.600943174"
Mar 20 16:30:02 crc kubenswrapper[4730]: I0320 16:30:02.380409    4730 generic.go:334] "Generic (PLEG): container finished" podID="c86d92cc-d42e-496f-b31c-d6c56fb441c7" containerID="dc2c07b3766f06e0423270d40c09a7a028e4cbca82d59a060deedb7b5661816a" exitCode=0
Mar 20 16:30:02 crc kubenswrapper[4730]: I0320 16:30:02.380465    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-5g4gp" event={"ID":"c86d92cc-d42e-496f-b31c-d6c56fb441c7","Type":"ContainerDied","Data":"dc2c07b3766f06e0423270d40c09a7a028e4cbca82d59a060deedb7b5661816a"}
Mar 20 16:30:02 crc kubenswrapper[4730]: E0320 16:30:02.826724    4730 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc86d92cc_d42e_496f_b31c_d6c56fb441c7.slice/crio-dc2c07b3766f06e0423270d40c09a7a028e4cbca82d59a060deedb7b5661816a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc86d92cc_d42e_496f_b31c_d6c56fb441c7.slice/crio-conmon-dc2c07b3766f06e0423270d40c09a7a028e4cbca82d59a060deedb7b5661816a.scope\": RecentStats: unable to find data in memory cache]"
Mar 20 16:30:03 crc kubenswrapper[4730]: I0320 16:30:03.780655    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-5g4gp"
Mar 20 16:30:03 crc kubenswrapper[4730]: I0320 16:30:03.841526    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c86d92cc-d42e-496f-b31c-d6c56fb441c7-secret-volume\") pod \"c86d92cc-d42e-496f-b31c-d6c56fb441c7\" (UID: \"c86d92cc-d42e-496f-b31c-d6c56fb441c7\") "
Mar 20 16:30:03 crc kubenswrapper[4730]: I0320 16:30:03.841665    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c86d92cc-d42e-496f-b31c-d6c56fb441c7-config-volume\") pod \"c86d92cc-d42e-496f-b31c-d6c56fb441c7\" (UID: \"c86d92cc-d42e-496f-b31c-d6c56fb441c7\") "
Mar 20 16:30:03 crc kubenswrapper[4730]: I0320 16:30:03.841957    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk5m2\" (UniqueName: \"kubernetes.io/projected/c86d92cc-d42e-496f-b31c-d6c56fb441c7-kube-api-access-wk5m2\") pod \"c86d92cc-d42e-496f-b31c-d6c56fb441c7\" (UID: \"c86d92cc-d42e-496f-b31c-d6c56fb441c7\") "
Mar 20 16:30:03 crc kubenswrapper[4730]: I0320 16:30:03.844504    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c86d92cc-d42e-496f-b31c-d6c56fb441c7-config-volume" (OuterVolumeSpecName: "config-volume") pod "c86d92cc-d42e-496f-b31c-d6c56fb441c7" (UID: "c86d92cc-d42e-496f-b31c-d6c56fb441c7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:30:03 crc kubenswrapper[4730]: I0320 16:30:03.852162    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c86d92cc-d42e-496f-b31c-d6c56fb441c7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c86d92cc-d42e-496f-b31c-d6c56fb441c7" (UID: "c86d92cc-d42e-496f-b31c-d6c56fb441c7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:30:03 crc kubenswrapper[4730]: I0320 16:30:03.854371    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c86d92cc-d42e-496f-b31c-d6c56fb441c7-kube-api-access-wk5m2" (OuterVolumeSpecName: "kube-api-access-wk5m2") pod "c86d92cc-d42e-496f-b31c-d6c56fb441c7" (UID: "c86d92cc-d42e-496f-b31c-d6c56fb441c7"). InnerVolumeSpecName "kube-api-access-wk5m2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:30:03 crc kubenswrapper[4730]: I0320 16:30:03.944456    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk5m2\" (UniqueName: \"kubernetes.io/projected/c86d92cc-d42e-496f-b31c-d6c56fb441c7-kube-api-access-wk5m2\") on node \"crc\" DevicePath \"\""
Mar 20 16:30:03 crc kubenswrapper[4730]: I0320 16:30:03.944484    4730 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c86d92cc-d42e-496f-b31c-d6c56fb441c7-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 20 16:30:03 crc kubenswrapper[4730]: I0320 16:30:03.944528    4730 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c86d92cc-d42e-496f-b31c-d6c56fb441c7-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 16:30:04 crc kubenswrapper[4730]: I0320 16:30:04.416344    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-5g4gp" event={"ID":"c86d92cc-d42e-496f-b31c-d6c56fb441c7","Type":"ContainerDied","Data":"8b7d672e1388cfbe49cf1dd56d2c3d2ea5b0d87030b7b41cd27cd89e599652cc"}
Mar 20 16:30:04 crc kubenswrapper[4730]: I0320 16:30:04.416644    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b7d672e1388cfbe49cf1dd56d2c3d2ea5b0d87030b7b41cd27cd89e599652cc"
Mar 20 16:30:04 crc kubenswrapper[4730]: I0320 16:30:04.416713    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567070-5g4gp"
Mar 20 16:30:04 crc kubenswrapper[4730]: I0320 16:30:04.472932    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567025-sp8pk"]
Mar 20 16:30:04 crc kubenswrapper[4730]: I0320 16:30:04.484405    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567025-sp8pk"]
Mar 20 16:30:05 crc kubenswrapper[4730]: I0320 16:30:05.427853    4730 generic.go:334] "Generic (PLEG): container finished" podID="f50e1094-eded-4000-b7f3-29722d8ba695" containerID="20c367b1f4c39cf9e28d8318713966c9d59ce69c25b518b2e48b38d0f034fa5d" exitCode=0
Mar 20 16:30:05 crc kubenswrapper[4730]: I0320 16:30:05.427920    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567070-h8qb5" event={"ID":"f50e1094-eded-4000-b7f3-29722d8ba695","Type":"ContainerDied","Data":"20c367b1f4c39cf9e28d8318713966c9d59ce69c25b518b2e48b38d0f034fa5d"}
Mar 20 16:30:05 crc kubenswrapper[4730]: I0320 16:30:05.548849    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db3d4357-8143-45e9-ab45-e55f54735cbc" path="/var/lib/kubelet/pods/db3d4357-8143-45e9-ab45-e55f54735cbc/volumes"
Mar 20 16:30:06 crc kubenswrapper[4730]: I0320 16:30:06.818106    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567070-h8qb5"
Mar 20 16:30:06 crc kubenswrapper[4730]: I0320 16:30:06.910318    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d94nb\" (UniqueName: \"kubernetes.io/projected/f50e1094-eded-4000-b7f3-29722d8ba695-kube-api-access-d94nb\") pod \"f50e1094-eded-4000-b7f3-29722d8ba695\" (UID: \"f50e1094-eded-4000-b7f3-29722d8ba695\") "
Mar 20 16:30:06 crc kubenswrapper[4730]: I0320 16:30:06.921649    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f50e1094-eded-4000-b7f3-29722d8ba695-kube-api-access-d94nb" (OuterVolumeSpecName: "kube-api-access-d94nb") pod "f50e1094-eded-4000-b7f3-29722d8ba695" (UID: "f50e1094-eded-4000-b7f3-29722d8ba695"). InnerVolumeSpecName "kube-api-access-d94nb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:30:07 crc kubenswrapper[4730]: I0320 16:30:07.014199    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d94nb\" (UniqueName: \"kubernetes.io/projected/f50e1094-eded-4000-b7f3-29722d8ba695-kube-api-access-d94nb\") on node \"crc\" DevicePath \"\""
Mar 20 16:30:07 crc kubenswrapper[4730]: I0320 16:30:07.446435    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567070-h8qb5" event={"ID":"f50e1094-eded-4000-b7f3-29722d8ba695","Type":"ContainerDied","Data":"b09d2473c6efd02c36230d95fa3de33d5cdb445c921f8ad156b8354f2a60aa3b"}
Mar 20 16:30:07 crc kubenswrapper[4730]: I0320 16:30:07.446480    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b09d2473c6efd02c36230d95fa3de33d5cdb445c921f8ad156b8354f2a60aa3b"
Mar 20 16:30:07 crc kubenswrapper[4730]: I0320 16:30:07.446477    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567070-h8qb5"
Mar 20 16:30:07 crc kubenswrapper[4730]: I0320 16:30:07.885342    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567064-sklm5"]
Mar 20 16:30:07 crc kubenswrapper[4730]: I0320 16:30:07.894570    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567064-sklm5"]
Mar 20 16:30:09 crc kubenswrapper[4730]: I0320 16:30:09.545508    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a36e403d-410f-40cc-8441-66c444837d24" path="/var/lib/kubelet/pods/a36e403d-410f-40cc-8441-66c444837d24/volumes"
Mar 20 16:30:13 crc kubenswrapper[4730]: E0320 16:30:13.096985    4730 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc86d92cc_d42e_496f_b31c_d6c56fb441c7.slice/crio-conmon-dc2c07b3766f06e0423270d40c09a7a028e4cbca82d59a060deedb7b5661816a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc86d92cc_d42e_496f_b31c_d6c56fb441c7.slice/crio-dc2c07b3766f06e0423270d40c09a7a028e4cbca82d59a060deedb7b5661816a.scope\": RecentStats: unable to find data in memory cache]"
Mar 20 16:30:23 crc kubenswrapper[4730]: E0320 16:30:23.348924    4730 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc86d92cc_d42e_496f_b31c_d6c56fb441c7.slice/crio-conmon-dc2c07b3766f06e0423270d40c09a7a028e4cbca82d59a060deedb7b5661816a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc86d92cc_d42e_496f_b31c_d6c56fb441c7.slice/crio-dc2c07b3766f06e0423270d40c09a7a028e4cbca82d59a060deedb7b5661816a.scope\": RecentStats: unable to find data in memory cache]"
Mar 20 16:30:33 crc kubenswrapper[4730]: E0320 16:30:33.597135    4730 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc86d92cc_d42e_496f_b31c_d6c56fb441c7.slice/crio-conmon-dc2c07b3766f06e0423270d40c09a7a028e4cbca82d59a060deedb7b5661816a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc86d92cc_d42e_496f_b31c_d6c56fb441c7.slice/crio-dc2c07b3766f06e0423270d40c09a7a028e4cbca82d59a060deedb7b5661816a.scope\": RecentStats: unable to find data in memory cache]"
Mar 20 16:30:43 crc kubenswrapper[4730]: E0320 16:30:43.862098    4730 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc86d92cc_d42e_496f_b31c_d6c56fb441c7.slice/crio-conmon-dc2c07b3766f06e0423270d40c09a7a028e4cbca82d59a060deedb7b5661816a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc86d92cc_d42e_496f_b31c_d6c56fb441c7.slice/crio-dc2c07b3766f06e0423270d40c09a7a028e4cbca82d59a060deedb7b5661816a.scope\": RecentStats: unable to find data in memory cache]"
Mar 20 16:30:50 crc kubenswrapper[4730]: I0320 16:30:50.277717    4730 scope.go:117] "RemoveContainer" containerID="7961ca89ce2a460b127b00611370ac925492414c79b33a7aef5d34aaea8acb7f"
Mar 20 16:30:50 crc kubenswrapper[4730]: I0320 16:30:50.325332    4730 scope.go:117] "RemoveContainer" containerID="1718e89ccd737bfb9a3619c68d04fdd94aa68dd80d2bc675347f69c2cc40fd04"
Mar 20 16:30:54 crc kubenswrapper[4730]: E0320 16:30:54.102805    4730 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc86d92cc_d42e_496f_b31c_d6c56fb441c7.slice/crio-conmon-dc2c07b3766f06e0423270d40c09a7a028e4cbca82d59a060deedb7b5661816a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc86d92cc_d42e_496f_b31c_d6c56fb441c7.slice/crio-dc2c07b3766f06e0423270d40c09a7a028e4cbca82d59a060deedb7b5661816a.scope\": RecentStats: unable to find data in memory cache]"
Mar 20 16:31:42 crc kubenswrapper[4730]: I0320 16:31:42.880652    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:31:42 crc kubenswrapper[4730]: I0320 16:31:42.881089    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:32:00 crc kubenswrapper[4730]: I0320 16:32:00.161991    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567072-jzsw2"]
Mar 20 16:32:00 crc kubenswrapper[4730]: E0320 16:32:00.163062    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f50e1094-eded-4000-b7f3-29722d8ba695" containerName="oc"
Mar 20 16:32:00 crc kubenswrapper[4730]: I0320 16:32:00.163077    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="f50e1094-eded-4000-b7f3-29722d8ba695" containerName="oc"
Mar 20 16:32:00 crc kubenswrapper[4730]: E0320 16:32:00.163095    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c86d92cc-d42e-496f-b31c-d6c56fb441c7" containerName="collect-profiles"
Mar 20 16:32:00 crc kubenswrapper[4730]: I0320 16:32:00.163103    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="c86d92cc-d42e-496f-b31c-d6c56fb441c7" containerName="collect-profiles"
Mar 20 16:32:00 crc kubenswrapper[4730]: I0320 16:32:00.163416    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="f50e1094-eded-4000-b7f3-29722d8ba695" containerName="oc"
Mar 20 16:32:00 crc kubenswrapper[4730]: I0320 16:32:00.163439    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="c86d92cc-d42e-496f-b31c-d6c56fb441c7" containerName="collect-profiles"
Mar 20 16:32:00 crc kubenswrapper[4730]: I0320 16:32:00.164566    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567072-jzsw2"
Mar 20 16:32:00 crc kubenswrapper[4730]: I0320 16:32:00.167848    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:32:00 crc kubenswrapper[4730]: I0320 16:32:00.168174    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 16:32:00 crc kubenswrapper[4730]: I0320 16:32:00.169438    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:32:00 crc kubenswrapper[4730]: I0320 16:32:00.182169    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567072-jzsw2"]
Mar 20 16:32:00 crc kubenswrapper[4730]: I0320 16:32:00.284670    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjgcc\" (UniqueName: \"kubernetes.io/projected/cadf5c48-6db4-421c-977d-1216334a9383-kube-api-access-hjgcc\") pod \"auto-csr-approver-29567072-jzsw2\" (UID: \"cadf5c48-6db4-421c-977d-1216334a9383\") " pod="openshift-infra/auto-csr-approver-29567072-jzsw2"
Mar 20 16:32:00 crc kubenswrapper[4730]: I0320 16:32:00.386371    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjgcc\" (UniqueName: \"kubernetes.io/projected/cadf5c48-6db4-421c-977d-1216334a9383-kube-api-access-hjgcc\") pod \"auto-csr-approver-29567072-jzsw2\" (UID: \"cadf5c48-6db4-421c-977d-1216334a9383\") " pod="openshift-infra/auto-csr-approver-29567072-jzsw2"
Mar 20 16:32:00 crc kubenswrapper[4730]: I0320 16:32:00.405728    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjgcc\" (UniqueName: \"kubernetes.io/projected/cadf5c48-6db4-421c-977d-1216334a9383-kube-api-access-hjgcc\") pod \"auto-csr-approver-29567072-jzsw2\" (UID: \"cadf5c48-6db4-421c-977d-1216334a9383\") " pod="openshift-infra/auto-csr-approver-29567072-jzsw2"
Mar 20 16:32:00 crc kubenswrapper[4730]: I0320 16:32:00.484549    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567072-jzsw2"
Mar 20 16:32:00 crc kubenswrapper[4730]: I0320 16:32:00.964976    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567072-jzsw2"]
Mar 20 16:32:01 crc kubenswrapper[4730]: I0320 16:32:01.581343    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567072-jzsw2" event={"ID":"cadf5c48-6db4-421c-977d-1216334a9383","Type":"ContainerStarted","Data":"6f4c8959179fbc514db5c8ef46c1c92062e00de72c7204d982e7a00a0ed553a3"}
Mar 20 16:32:03 crc kubenswrapper[4730]: I0320 16:32:03.630391    4730 generic.go:334] "Generic (PLEG): container finished" podID="cadf5c48-6db4-421c-977d-1216334a9383" containerID="e43508074aa0c7c7c61cb53a8852f8061943211007b9394c89ac6a8a6c904123" exitCode=0
Mar 20 16:32:03 crc kubenswrapper[4730]: I0320 16:32:03.630455    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567072-jzsw2" event={"ID":"cadf5c48-6db4-421c-977d-1216334a9383","Type":"ContainerDied","Data":"e43508074aa0c7c7c61cb53a8852f8061943211007b9394c89ac6a8a6c904123"}
Mar 20 16:32:04 crc kubenswrapper[4730]: I0320 16:32:04.956402    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567072-jzsw2"
Mar 20 16:32:05 crc kubenswrapper[4730]: I0320 16:32:05.113964    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjgcc\" (UniqueName: \"kubernetes.io/projected/cadf5c48-6db4-421c-977d-1216334a9383-kube-api-access-hjgcc\") pod \"cadf5c48-6db4-421c-977d-1216334a9383\" (UID: \"cadf5c48-6db4-421c-977d-1216334a9383\") "
Mar 20 16:32:05 crc kubenswrapper[4730]: I0320 16:32:05.119659    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cadf5c48-6db4-421c-977d-1216334a9383-kube-api-access-hjgcc" (OuterVolumeSpecName: "kube-api-access-hjgcc") pod "cadf5c48-6db4-421c-977d-1216334a9383" (UID: "cadf5c48-6db4-421c-977d-1216334a9383"). InnerVolumeSpecName "kube-api-access-hjgcc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:32:05 crc kubenswrapper[4730]: I0320 16:32:05.217228    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjgcc\" (UniqueName: \"kubernetes.io/projected/cadf5c48-6db4-421c-977d-1216334a9383-kube-api-access-hjgcc\") on node \"crc\" DevicePath \"\""
Mar 20 16:32:05 crc kubenswrapper[4730]: I0320 16:32:05.656674    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567072-jzsw2" event={"ID":"cadf5c48-6db4-421c-977d-1216334a9383","Type":"ContainerDied","Data":"6f4c8959179fbc514db5c8ef46c1c92062e00de72c7204d982e7a00a0ed553a3"}
Mar 20 16:32:05 crc kubenswrapper[4730]: I0320 16:32:05.656939    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f4c8959179fbc514db5c8ef46c1c92062e00de72c7204d982e7a00a0ed553a3"
Mar 20 16:32:05 crc kubenswrapper[4730]: I0320 16:32:05.656726    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567072-jzsw2"
Mar 20 16:32:06 crc kubenswrapper[4730]: I0320 16:32:06.048834    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567066-7w6j9"]
Mar 20 16:32:06 crc kubenswrapper[4730]: I0320 16:32:06.058581    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567066-7w6j9"]
Mar 20 16:32:07 crc kubenswrapper[4730]: I0320 16:32:07.545203    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2996ff7e-454f-40d8-bc5c-894c45b7a58c" path="/var/lib/kubelet/pods/2996ff7e-454f-40d8-bc5c-894c45b7a58c/volumes"
Mar 20 16:32:12 crc kubenswrapper[4730]: I0320 16:32:12.882401    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:32:12 crc kubenswrapper[4730]: I0320 16:32:12.883084    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:32:42 crc kubenswrapper[4730]: I0320 16:32:42.880366    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:32:42 crc kubenswrapper[4730]: I0320 16:32:42.880783    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:32:42 crc kubenswrapper[4730]: I0320 16:32:42.880830    4730 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf"
Mar 20 16:32:42 crc kubenswrapper[4730]: I0320 16:32:42.881550    4730 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0b0c904fd86d0d3236cc303102f0b32669f34531125a255be5df201b0dd8ef34"} pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 16:32:42 crc kubenswrapper[4730]: I0320 16:32:42.881599    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" containerID="cri-o://0b0c904fd86d0d3236cc303102f0b32669f34531125a255be5df201b0dd8ef34" gracePeriod=600
Mar 20 16:32:43 crc kubenswrapper[4730]: E0320 16:32:43.004940    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:32:43 crc kubenswrapper[4730]: I0320 16:32:43.011537    4730 generic.go:334] "Generic (PLEG): container finished" podID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerID="0b0c904fd86d0d3236cc303102f0b32669f34531125a255be5df201b0dd8ef34" exitCode=0
Mar 20 16:32:43 crc kubenswrapper[4730]: I0320 16:32:43.011572    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerDied","Data":"0b0c904fd86d0d3236cc303102f0b32669f34531125a255be5df201b0dd8ef34"}
Mar 20 16:32:43 crc kubenswrapper[4730]: I0320 16:32:43.011601    4730 scope.go:117] "RemoveContainer" containerID="f26b9418791737b63a44943ab58f9d5995a9697abde137f76404e07e867c6e5f"
Mar 20 16:32:43 crc kubenswrapper[4730]: I0320 16:32:43.012539    4730 scope.go:117] "RemoveContainer" containerID="0b0c904fd86d0d3236cc303102f0b32669f34531125a255be5df201b0dd8ef34"
Mar 20 16:32:43 crc kubenswrapper[4730]: E0320 16:32:43.012917    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:32:50 crc kubenswrapper[4730]: I0320 16:32:50.446519    4730 scope.go:117] "RemoveContainer" containerID="1bc53c769c23ddf21b26ded07f20d42f23d89029cfc127b5683d217f18b840d6"
Mar 20 16:32:53 crc kubenswrapper[4730]: I0320 16:32:53.533428    4730 scope.go:117] "RemoveContainer" containerID="0b0c904fd86d0d3236cc303102f0b32669f34531125a255be5df201b0dd8ef34"
Mar 20 16:32:53 crc kubenswrapper[4730]: E0320 16:32:53.534127    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:33:06 crc kubenswrapper[4730]: I0320 16:33:06.534791    4730 scope.go:117] "RemoveContainer" containerID="0b0c904fd86d0d3236cc303102f0b32669f34531125a255be5df201b0dd8ef34"
Mar 20 16:33:06 crc kubenswrapper[4730]: E0320 16:33:06.535545    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:33:20 crc kubenswrapper[4730]: I0320 16:33:20.533053    4730 scope.go:117] "RemoveContainer" containerID="0b0c904fd86d0d3236cc303102f0b32669f34531125a255be5df201b0dd8ef34"
Mar 20 16:33:20 crc kubenswrapper[4730]: E0320 16:33:20.534444    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:33:33 crc kubenswrapper[4730]: I0320 16:33:33.532882    4730 scope.go:117] "RemoveContainer" containerID="0b0c904fd86d0d3236cc303102f0b32669f34531125a255be5df201b0dd8ef34"
Mar 20 16:33:33 crc kubenswrapper[4730]: E0320 16:33:33.533649    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:33:47 crc kubenswrapper[4730]: I0320 16:33:47.533921    4730 scope.go:117] "RemoveContainer" containerID="0b0c904fd86d0d3236cc303102f0b32669f34531125a255be5df201b0dd8ef34"
Mar 20 16:33:47 crc kubenswrapper[4730]: E0320 16:33:47.534832    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:34:00 crc kubenswrapper[4730]: I0320 16:34:00.171793    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567074-2xrgq"]
Mar 20 16:34:00 crc kubenswrapper[4730]: E0320 16:34:00.172985    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cadf5c48-6db4-421c-977d-1216334a9383" containerName="oc"
Mar 20 16:34:00 crc kubenswrapper[4730]: I0320 16:34:00.173003    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="cadf5c48-6db4-421c-977d-1216334a9383" containerName="oc"
Mar 20 16:34:00 crc kubenswrapper[4730]: I0320 16:34:00.173301    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="cadf5c48-6db4-421c-977d-1216334a9383" containerName="oc"
Mar 20 16:34:00 crc kubenswrapper[4730]: I0320 16:34:00.174154    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567074-2xrgq"
Mar 20 16:34:00 crc kubenswrapper[4730]: I0320 16:34:00.175873    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhfvb\" (UniqueName: \"kubernetes.io/projected/2a649324-b73a-44e0-94e5-2b8c54476367-kube-api-access-mhfvb\") pod \"auto-csr-approver-29567074-2xrgq\" (UID: \"2a649324-b73a-44e0-94e5-2b8c54476367\") " pod="openshift-infra/auto-csr-approver-29567074-2xrgq"
Mar 20 16:34:00 crc kubenswrapper[4730]: I0320 16:34:00.177663    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 16:34:00 crc kubenswrapper[4730]: I0320 16:34:00.177839    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:34:00 crc kubenswrapper[4730]: I0320 16:34:00.182700    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567074-2xrgq"]
Mar 20 16:34:00 crc kubenswrapper[4730]: I0320 16:34:00.183567    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:34:00 crc kubenswrapper[4730]: I0320 16:34:00.277785    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhfvb\" (UniqueName: \"kubernetes.io/projected/2a649324-b73a-44e0-94e5-2b8c54476367-kube-api-access-mhfvb\") pod \"auto-csr-approver-29567074-2xrgq\" (UID: \"2a649324-b73a-44e0-94e5-2b8c54476367\") " pod="openshift-infra/auto-csr-approver-29567074-2xrgq"
Mar 20 16:34:00 crc kubenswrapper[4730]: I0320 16:34:00.299290    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhfvb\" (UniqueName: \"kubernetes.io/projected/2a649324-b73a-44e0-94e5-2b8c54476367-kube-api-access-mhfvb\") pod \"auto-csr-approver-29567074-2xrgq\" (UID: \"2a649324-b73a-44e0-94e5-2b8c54476367\") " pod="openshift-infra/auto-csr-approver-29567074-2xrgq"
Mar 20 16:34:00 crc kubenswrapper[4730]: I0320 16:34:00.492944    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567074-2xrgq"
Mar 20 16:34:01 crc kubenswrapper[4730]: I0320 16:34:01.025229    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567074-2xrgq"]
Mar 20 16:34:01 crc kubenswrapper[4730]: I0320 16:34:01.028112    4730 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 16:34:01 crc kubenswrapper[4730]: I0320 16:34:01.545551    4730 scope.go:117] "RemoveContainer" containerID="0b0c904fd86d0d3236cc303102f0b32669f34531125a255be5df201b0dd8ef34"
Mar 20 16:34:01 crc kubenswrapper[4730]: E0320 16:34:01.546198    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:34:01 crc kubenswrapper[4730]: I0320 16:34:01.771238    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567074-2xrgq" event={"ID":"2a649324-b73a-44e0-94e5-2b8c54476367","Type":"ContainerStarted","Data":"2dbc3338f430f65a77101dc0cc6803709cb1f4c2df202c94dde2ff10ca47ffca"}
Mar 20 16:34:02 crc kubenswrapper[4730]: I0320 16:34:02.784753    4730 generic.go:334] "Generic (PLEG): container finished" podID="2a649324-b73a-44e0-94e5-2b8c54476367" containerID="2bb1a712fbfcbaa124ee788c9be392cdb5ddacf514828a35ba09574bc19839a4" exitCode=0
Mar 20 16:34:02 crc kubenswrapper[4730]: I0320 16:34:02.784974    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567074-2xrgq" event={"ID":"2a649324-b73a-44e0-94e5-2b8c54476367","Type":"ContainerDied","Data":"2bb1a712fbfcbaa124ee788c9be392cdb5ddacf514828a35ba09574bc19839a4"}
Mar 20 16:34:04 crc kubenswrapper[4730]: I0320 16:34:04.166744    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567074-2xrgq"
Mar 20 16:34:04 crc kubenswrapper[4730]: I0320 16:34:04.267395    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhfvb\" (UniqueName: \"kubernetes.io/projected/2a649324-b73a-44e0-94e5-2b8c54476367-kube-api-access-mhfvb\") pod \"2a649324-b73a-44e0-94e5-2b8c54476367\" (UID: \"2a649324-b73a-44e0-94e5-2b8c54476367\") "
Mar 20 16:34:04 crc kubenswrapper[4730]: I0320 16:34:04.273135    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a649324-b73a-44e0-94e5-2b8c54476367-kube-api-access-mhfvb" (OuterVolumeSpecName: "kube-api-access-mhfvb") pod "2a649324-b73a-44e0-94e5-2b8c54476367" (UID: "2a649324-b73a-44e0-94e5-2b8c54476367"). InnerVolumeSpecName "kube-api-access-mhfvb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:34:04 crc kubenswrapper[4730]: I0320 16:34:04.370577    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhfvb\" (UniqueName: \"kubernetes.io/projected/2a649324-b73a-44e0-94e5-2b8c54476367-kube-api-access-mhfvb\") on node \"crc\" DevicePath \"\""
Mar 20 16:34:04 crc kubenswrapper[4730]: I0320 16:34:04.804870    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567074-2xrgq" event={"ID":"2a649324-b73a-44e0-94e5-2b8c54476367","Type":"ContainerDied","Data":"2dbc3338f430f65a77101dc0cc6803709cb1f4c2df202c94dde2ff10ca47ffca"}
Mar 20 16:34:04 crc kubenswrapper[4730]: I0320 16:34:04.805213    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dbc3338f430f65a77101dc0cc6803709cb1f4c2df202c94dde2ff10ca47ffca"
Mar 20 16:34:04 crc kubenswrapper[4730]: I0320 16:34:04.804942    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567074-2xrgq"
Mar 20 16:34:05 crc kubenswrapper[4730]: I0320 16:34:05.247374    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567068-d884x"]
Mar 20 16:34:05 crc kubenswrapper[4730]: I0320 16:34:05.254318    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567068-d884x"]
Mar 20 16:34:05 crc kubenswrapper[4730]: I0320 16:34:05.544048    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02177dc8-25be-4462-afba-d87fda4396c6" path="/var/lib/kubelet/pods/02177dc8-25be-4462-afba-d87fda4396c6/volumes"
Mar 20 16:34:14 crc kubenswrapper[4730]: I0320 16:34:14.533857    4730 scope.go:117] "RemoveContainer" containerID="0b0c904fd86d0d3236cc303102f0b32669f34531125a255be5df201b0dd8ef34"
Mar 20 16:34:14 crc kubenswrapper[4730]: E0320 16:34:14.534788    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:34:26 crc kubenswrapper[4730]: I0320 16:34:26.533026    4730 scope.go:117] "RemoveContainer" containerID="0b0c904fd86d0d3236cc303102f0b32669f34531125a255be5df201b0dd8ef34"
Mar 20 16:34:26 crc kubenswrapper[4730]: E0320 16:34:26.534715    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:34:37 crc kubenswrapper[4730]: I0320 16:34:37.533802    4730 scope.go:117] "RemoveContainer" containerID="0b0c904fd86d0d3236cc303102f0b32669f34531125a255be5df201b0dd8ef34"
Mar 20 16:34:37 crc kubenswrapper[4730]: E0320 16:34:37.534687    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:34:50 crc kubenswrapper[4730]: I0320 16:34:50.540183    4730 scope.go:117] "RemoveContainer" containerID="82eb38bbfad57d58cc830a2f14037d88d822b6b89fbb3a03b36b1b472f369ed1"
Mar 20 16:34:51 crc kubenswrapper[4730]: I0320 16:34:51.545606    4730 scope.go:117] "RemoveContainer" containerID="0b0c904fd86d0d3236cc303102f0b32669f34531125a255be5df201b0dd8ef34"
Mar 20 16:34:51 crc kubenswrapper[4730]: E0320 16:34:51.546144    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:35:03 crc kubenswrapper[4730]: I0320 16:35:03.533324    4730 scope.go:117] "RemoveContainer" containerID="0b0c904fd86d0d3236cc303102f0b32669f34531125a255be5df201b0dd8ef34"
Mar 20 16:35:03 crc kubenswrapper[4730]: E0320 16:35:03.534193    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:35:05 crc kubenswrapper[4730]: I0320 16:35:05.557775    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q2kb5"]
Mar 20 16:35:05 crc kubenswrapper[4730]: E0320 16:35:05.558905    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a649324-b73a-44e0-94e5-2b8c54476367" containerName="oc"
Mar 20 16:35:05 crc kubenswrapper[4730]: I0320 16:35:05.558922    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a649324-b73a-44e0-94e5-2b8c54476367" containerName="oc"
Mar 20 16:35:05 crc kubenswrapper[4730]: I0320 16:35:05.559166    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a649324-b73a-44e0-94e5-2b8c54476367" containerName="oc"
Mar 20 16:35:05 crc kubenswrapper[4730]: I0320 16:35:05.561030    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q2kb5"
Mar 20 16:35:05 crc kubenswrapper[4730]: I0320 16:35:05.587155    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q2kb5"]
Mar 20 16:35:05 crc kubenswrapper[4730]: I0320 16:35:05.631171    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6063e942-1052-4d11-b5d3-22b8a54fac0b-utilities\") pod \"redhat-marketplace-q2kb5\" (UID: \"6063e942-1052-4d11-b5d3-22b8a54fac0b\") " pod="openshift-marketplace/redhat-marketplace-q2kb5"
Mar 20 16:35:05 crc kubenswrapper[4730]: I0320 16:35:05.631263    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnwmt\" (UniqueName: \"kubernetes.io/projected/6063e942-1052-4d11-b5d3-22b8a54fac0b-kube-api-access-tnwmt\") pod \"redhat-marketplace-q2kb5\" (UID: \"6063e942-1052-4d11-b5d3-22b8a54fac0b\") " pod="openshift-marketplace/redhat-marketplace-q2kb5"
Mar 20 16:35:05 crc kubenswrapper[4730]: I0320 16:35:05.631629    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6063e942-1052-4d11-b5d3-22b8a54fac0b-catalog-content\") pod \"redhat-marketplace-q2kb5\" (UID: \"6063e942-1052-4d11-b5d3-22b8a54fac0b\") " pod="openshift-marketplace/redhat-marketplace-q2kb5"
Mar 20 16:35:05 crc kubenswrapper[4730]: I0320 16:35:05.733597    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6063e942-1052-4d11-b5d3-22b8a54fac0b-utilities\") pod \"redhat-marketplace-q2kb5\" (UID: \"6063e942-1052-4d11-b5d3-22b8a54fac0b\") " pod="openshift-marketplace/redhat-marketplace-q2kb5"
Mar 20 16:35:05 crc kubenswrapper[4730]: I0320 16:35:05.733669    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnwmt\" (UniqueName: \"kubernetes.io/projected/6063e942-1052-4d11-b5d3-22b8a54fac0b-kube-api-access-tnwmt\") pod \"redhat-marketplace-q2kb5\" (UID: \"6063e942-1052-4d11-b5d3-22b8a54fac0b\") " pod="openshift-marketplace/redhat-marketplace-q2kb5"
Mar 20 16:35:05 crc kubenswrapper[4730]: I0320 16:35:05.733924    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6063e942-1052-4d11-b5d3-22b8a54fac0b-catalog-content\") pod \"redhat-marketplace-q2kb5\" (UID: \"6063e942-1052-4d11-b5d3-22b8a54fac0b\") " pod="openshift-marketplace/redhat-marketplace-q2kb5"
Mar 20 16:35:05 crc kubenswrapper[4730]: I0320 16:35:05.734116    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6063e942-1052-4d11-b5d3-22b8a54fac0b-utilities\") pod \"redhat-marketplace-q2kb5\" (UID: \"6063e942-1052-4d11-b5d3-22b8a54fac0b\") " pod="openshift-marketplace/redhat-marketplace-q2kb5"
Mar 20 16:35:05 crc kubenswrapper[4730]: I0320 16:35:05.734864    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6063e942-1052-4d11-b5d3-22b8a54fac0b-catalog-content\") pod \"redhat-marketplace-q2kb5\" (UID: \"6063e942-1052-4d11-b5d3-22b8a54fac0b\") " pod="openshift-marketplace/redhat-marketplace-q2kb5"
Mar 20 16:35:05 crc kubenswrapper[4730]: I0320 16:35:05.757712    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnwmt\" (UniqueName: \"kubernetes.io/projected/6063e942-1052-4d11-b5d3-22b8a54fac0b-kube-api-access-tnwmt\") pod \"redhat-marketplace-q2kb5\" (UID: \"6063e942-1052-4d11-b5d3-22b8a54fac0b\") " pod="openshift-marketplace/redhat-marketplace-q2kb5"
Mar 20 16:35:05 crc kubenswrapper[4730]: I0320 16:35:05.882808    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q2kb5"
Mar 20 16:35:06 crc kubenswrapper[4730]: I0320 16:35:06.396823    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q2kb5"]
Mar 20 16:35:06 crc kubenswrapper[4730]: I0320 16:35:06.413967    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2kb5" event={"ID":"6063e942-1052-4d11-b5d3-22b8a54fac0b","Type":"ContainerStarted","Data":"8dadef0cacc819297d7b74d8157d397280f72d917551fbf30a2a26925d8faa40"}
Mar 20 16:35:07 crc kubenswrapper[4730]: I0320 16:35:07.423498    4730 generic.go:334] "Generic (PLEG): container finished" podID="6063e942-1052-4d11-b5d3-22b8a54fac0b" containerID="d7d89caefd3eb9f14a2b8ca66fe393d83a87edc1c07d15102fe40ddac9e511db" exitCode=0
Mar 20 16:35:07 crc kubenswrapper[4730]: I0320 16:35:07.423592    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2kb5" event={"ID":"6063e942-1052-4d11-b5d3-22b8a54fac0b","Type":"ContainerDied","Data":"d7d89caefd3eb9f14a2b8ca66fe393d83a87edc1c07d15102fe40ddac9e511db"}
Mar 20 16:35:08 crc kubenswrapper[4730]: I0320 16:35:08.439507    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2kb5" event={"ID":"6063e942-1052-4d11-b5d3-22b8a54fac0b","Type":"ContainerStarted","Data":"fedebce627b5326d3479102ee71d13b0ef2a44bb20a21e8ed4485d42c7214a27"}
Mar 20 16:35:10 crc kubenswrapper[4730]: I0320 16:35:10.456804    4730 generic.go:334] "Generic (PLEG): container finished" podID="6063e942-1052-4d11-b5d3-22b8a54fac0b" containerID="fedebce627b5326d3479102ee71d13b0ef2a44bb20a21e8ed4485d42c7214a27" exitCode=0
Mar 20 16:35:10 crc kubenswrapper[4730]: I0320 16:35:10.457000    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2kb5" event={"ID":"6063e942-1052-4d11-b5d3-22b8a54fac0b","Type":"ContainerDied","Data":"fedebce627b5326d3479102ee71d13b0ef2a44bb20a21e8ed4485d42c7214a27"}
Mar 20 16:35:11 crc kubenswrapper[4730]: I0320 16:35:11.468678    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2kb5" event={"ID":"6063e942-1052-4d11-b5d3-22b8a54fac0b","Type":"ContainerStarted","Data":"22dc69ffe2787e82cc89a2d123fa9f97367487636c43039a924e4b35f0f2ad2a"}
Mar 20 16:35:11 crc kubenswrapper[4730]: I0320 16:35:11.492170    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q2kb5" podStartSLOduration=3.01077823 podStartE2EDuration="6.492149858s" podCreationTimestamp="2026-03-20 16:35:05 +0000 UTC" firstStartedPulling="2026-03-20 16:35:07.426758026 +0000 UTC m=+3366.640129395" lastFinishedPulling="2026-03-20 16:35:10.908129654 +0000 UTC m=+3370.121501023" observedRunningTime="2026-03-20 16:35:11.48692866 +0000 UTC m=+3370.700300029" watchObservedRunningTime="2026-03-20 16:35:11.492149858 +0000 UTC m=+3370.705521227"
Mar 20 16:35:15 crc kubenswrapper[4730]: I0320 16:35:15.883386    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q2kb5"
Mar 20 16:35:15 crc kubenswrapper[4730]: I0320 16:35:15.884735    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q2kb5"
Mar 20 16:35:15 crc kubenswrapper[4730]: I0320 16:35:15.941310    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q2kb5"
Mar 20 16:35:16 crc kubenswrapper[4730]: I0320 16:35:16.533771    4730 scope.go:117] "RemoveContainer" containerID="0b0c904fd86d0d3236cc303102f0b32669f34531125a255be5df201b0dd8ef34"
Mar 20 16:35:16 crc kubenswrapper[4730]: E0320 16:35:16.534184    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:35:16 crc kubenswrapper[4730]: I0320 16:35:16.587401    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q2kb5"
Mar 20 16:35:16 crc kubenswrapper[4730]: I0320 16:35:16.653027    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q2kb5"]
Mar 20 16:35:18 crc kubenswrapper[4730]: I0320 16:35:18.547752    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q2kb5" podUID="6063e942-1052-4d11-b5d3-22b8a54fac0b" containerName="registry-server" containerID="cri-o://22dc69ffe2787e82cc89a2d123fa9f97367487636c43039a924e4b35f0f2ad2a" gracePeriod=2
Mar 20 16:35:19 crc kubenswrapper[4730]: I0320 16:35:19.031169    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q2kb5"
Mar 20 16:35:19 crc kubenswrapper[4730]: I0320 16:35:19.135206    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6063e942-1052-4d11-b5d3-22b8a54fac0b-utilities\") pod \"6063e942-1052-4d11-b5d3-22b8a54fac0b\" (UID: \"6063e942-1052-4d11-b5d3-22b8a54fac0b\") "
Mar 20 16:35:19 crc kubenswrapper[4730]: I0320 16:35:19.135448    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnwmt\" (UniqueName: \"kubernetes.io/projected/6063e942-1052-4d11-b5d3-22b8a54fac0b-kube-api-access-tnwmt\") pod \"6063e942-1052-4d11-b5d3-22b8a54fac0b\" (UID: \"6063e942-1052-4d11-b5d3-22b8a54fac0b\") "
Mar 20 16:35:19 crc kubenswrapper[4730]: I0320 16:35:19.135468    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6063e942-1052-4d11-b5d3-22b8a54fac0b-catalog-content\") pod \"6063e942-1052-4d11-b5d3-22b8a54fac0b\" (UID: \"6063e942-1052-4d11-b5d3-22b8a54fac0b\") "
Mar 20 16:35:19 crc kubenswrapper[4730]: I0320 16:35:19.136081    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6063e942-1052-4d11-b5d3-22b8a54fac0b-utilities" (OuterVolumeSpecName: "utilities") pod "6063e942-1052-4d11-b5d3-22b8a54fac0b" (UID: "6063e942-1052-4d11-b5d3-22b8a54fac0b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:35:19 crc kubenswrapper[4730]: I0320 16:35:19.142625    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6063e942-1052-4d11-b5d3-22b8a54fac0b-kube-api-access-tnwmt" (OuterVolumeSpecName: "kube-api-access-tnwmt") pod "6063e942-1052-4d11-b5d3-22b8a54fac0b" (UID: "6063e942-1052-4d11-b5d3-22b8a54fac0b"). InnerVolumeSpecName "kube-api-access-tnwmt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:35:19 crc kubenswrapper[4730]: I0320 16:35:19.169935    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6063e942-1052-4d11-b5d3-22b8a54fac0b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6063e942-1052-4d11-b5d3-22b8a54fac0b" (UID: "6063e942-1052-4d11-b5d3-22b8a54fac0b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:35:19 crc kubenswrapper[4730]: I0320 16:35:19.238087    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6063e942-1052-4d11-b5d3-22b8a54fac0b-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:35:19 crc kubenswrapper[4730]: I0320 16:35:19.238148    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6063e942-1052-4d11-b5d3-22b8a54fac0b-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 16:35:19 crc kubenswrapper[4730]: I0320 16:35:19.238166    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnwmt\" (UniqueName: \"kubernetes.io/projected/6063e942-1052-4d11-b5d3-22b8a54fac0b-kube-api-access-tnwmt\") on node \"crc\" DevicePath \"\""
Mar 20 16:35:19 crc kubenswrapper[4730]: I0320 16:35:19.564573    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q2kb5"
Mar 20 16:35:19 crc kubenswrapper[4730]: I0320 16:35:19.564582    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2kb5" event={"ID":"6063e942-1052-4d11-b5d3-22b8a54fac0b","Type":"ContainerDied","Data":"22dc69ffe2787e82cc89a2d123fa9f97367487636c43039a924e4b35f0f2ad2a"}
Mar 20 16:35:19 crc kubenswrapper[4730]: I0320 16:35:19.564641    4730 scope.go:117] "RemoveContainer" containerID="22dc69ffe2787e82cc89a2d123fa9f97367487636c43039a924e4b35f0f2ad2a"
Mar 20 16:35:19 crc kubenswrapper[4730]: I0320 16:35:19.564434    4730 generic.go:334] "Generic (PLEG): container finished" podID="6063e942-1052-4d11-b5d3-22b8a54fac0b" containerID="22dc69ffe2787e82cc89a2d123fa9f97367487636c43039a924e4b35f0f2ad2a" exitCode=0
Mar 20 16:35:19 crc kubenswrapper[4730]: I0320 16:35:19.565322    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2kb5" event={"ID":"6063e942-1052-4d11-b5d3-22b8a54fac0b","Type":"ContainerDied","Data":"8dadef0cacc819297d7b74d8157d397280f72d917551fbf30a2a26925d8faa40"}
Mar 20 16:35:19 crc kubenswrapper[4730]: I0320 16:35:19.598350    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q2kb5"]
Mar 20 16:35:19 crc kubenswrapper[4730]: I0320 16:35:19.602509    4730 scope.go:117] "RemoveContainer" containerID="fedebce627b5326d3479102ee71d13b0ef2a44bb20a21e8ed4485d42c7214a27"
Mar 20 16:35:19 crc kubenswrapper[4730]: I0320 16:35:19.607453    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q2kb5"]
Mar 20 16:35:19 crc kubenswrapper[4730]: I0320 16:35:19.636570    4730 scope.go:117] "RemoveContainer" containerID="d7d89caefd3eb9f14a2b8ca66fe393d83a87edc1c07d15102fe40ddac9e511db"
Mar 20 16:35:19 crc kubenswrapper[4730]: I0320 16:35:19.671644    4730 scope.go:117] "RemoveContainer" containerID="22dc69ffe2787e82cc89a2d123fa9f97367487636c43039a924e4b35f0f2ad2a"
Mar 20 16:35:19 crc kubenswrapper[4730]: E0320 16:35:19.672110    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22dc69ffe2787e82cc89a2d123fa9f97367487636c43039a924e4b35f0f2ad2a\": container with ID starting with 22dc69ffe2787e82cc89a2d123fa9f97367487636c43039a924e4b35f0f2ad2a not found: ID does not exist" containerID="22dc69ffe2787e82cc89a2d123fa9f97367487636c43039a924e4b35f0f2ad2a"
Mar 20 16:35:19 crc kubenswrapper[4730]: I0320 16:35:19.672161    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22dc69ffe2787e82cc89a2d123fa9f97367487636c43039a924e4b35f0f2ad2a"} err="failed to get container status \"22dc69ffe2787e82cc89a2d123fa9f97367487636c43039a924e4b35f0f2ad2a\": rpc error: code = NotFound desc = could not find container \"22dc69ffe2787e82cc89a2d123fa9f97367487636c43039a924e4b35f0f2ad2a\": container with ID starting with 22dc69ffe2787e82cc89a2d123fa9f97367487636c43039a924e4b35f0f2ad2a not found: ID does not exist"
Mar 20 16:35:19 crc kubenswrapper[4730]: I0320 16:35:19.672190    4730 scope.go:117] "RemoveContainer" containerID="fedebce627b5326d3479102ee71d13b0ef2a44bb20a21e8ed4485d42c7214a27"
Mar 20 16:35:19 crc kubenswrapper[4730]: E0320 16:35:19.672638    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fedebce627b5326d3479102ee71d13b0ef2a44bb20a21e8ed4485d42c7214a27\": container with ID starting with fedebce627b5326d3479102ee71d13b0ef2a44bb20a21e8ed4485d42c7214a27 not found: ID does not exist" containerID="fedebce627b5326d3479102ee71d13b0ef2a44bb20a21e8ed4485d42c7214a27"
Mar 20 16:35:19 crc kubenswrapper[4730]: I0320 16:35:19.672670    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fedebce627b5326d3479102ee71d13b0ef2a44bb20a21e8ed4485d42c7214a27"} err="failed to get container status \"fedebce627b5326d3479102ee71d13b0ef2a44bb20a21e8ed4485d42c7214a27\": rpc error: code = NotFound desc = could not find container \"fedebce627b5326d3479102ee71d13b0ef2a44bb20a21e8ed4485d42c7214a27\": container with ID starting with fedebce627b5326d3479102ee71d13b0ef2a44bb20a21e8ed4485d42c7214a27 not found: ID does not exist"
Mar 20 16:35:19 crc kubenswrapper[4730]: I0320 16:35:19.672691    4730 scope.go:117] "RemoveContainer" containerID="d7d89caefd3eb9f14a2b8ca66fe393d83a87edc1c07d15102fe40ddac9e511db"
Mar 20 16:35:19 crc kubenswrapper[4730]: E0320 16:35:19.672904    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7d89caefd3eb9f14a2b8ca66fe393d83a87edc1c07d15102fe40ddac9e511db\": container with ID starting with d7d89caefd3eb9f14a2b8ca66fe393d83a87edc1c07d15102fe40ddac9e511db not found: ID does not exist" containerID="d7d89caefd3eb9f14a2b8ca66fe393d83a87edc1c07d15102fe40ddac9e511db"
Mar 20 16:35:19 crc kubenswrapper[4730]: I0320 16:35:19.672932    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7d89caefd3eb9f14a2b8ca66fe393d83a87edc1c07d15102fe40ddac9e511db"} err="failed to get container status \"d7d89caefd3eb9f14a2b8ca66fe393d83a87edc1c07d15102fe40ddac9e511db\": rpc error: code = NotFound desc = could not find container \"d7d89caefd3eb9f14a2b8ca66fe393d83a87edc1c07d15102fe40ddac9e511db\": container with ID starting with d7d89caefd3eb9f14a2b8ca66fe393d83a87edc1c07d15102fe40ddac9e511db not found: ID does not exist"
Mar 20 16:35:21 crc kubenswrapper[4730]: I0320 16:35:21.561658    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6063e942-1052-4d11-b5d3-22b8a54fac0b" path="/var/lib/kubelet/pods/6063e942-1052-4d11-b5d3-22b8a54fac0b/volumes"
Mar 20 16:35:31 crc kubenswrapper[4730]: I0320 16:35:31.540290    4730 scope.go:117] "RemoveContainer" containerID="0b0c904fd86d0d3236cc303102f0b32669f34531125a255be5df201b0dd8ef34"
Mar 20 16:35:31 crc kubenswrapper[4730]: E0320 16:35:31.541016    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:35:45 crc kubenswrapper[4730]: I0320 16:35:45.534544    4730 scope.go:117] "RemoveContainer" containerID="0b0c904fd86d0d3236cc303102f0b32669f34531125a255be5df201b0dd8ef34"
Mar 20 16:35:45 crc kubenswrapper[4730]: E0320 16:35:45.535719    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:35:51 crc kubenswrapper[4730]: I0320 16:35:51.253237    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qtb9x"]
Mar 20 16:35:51 crc kubenswrapper[4730]: E0320 16:35:51.254261    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6063e942-1052-4d11-b5d3-22b8a54fac0b" containerName="registry-server"
Mar 20 16:35:51 crc kubenswrapper[4730]: I0320 16:35:51.254277    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="6063e942-1052-4d11-b5d3-22b8a54fac0b" containerName="registry-server"
Mar 20 16:35:51 crc kubenswrapper[4730]: E0320 16:35:51.254324    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6063e942-1052-4d11-b5d3-22b8a54fac0b" containerName="extract-content"
Mar 20 16:35:51 crc kubenswrapper[4730]: I0320 16:35:51.254333    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="6063e942-1052-4d11-b5d3-22b8a54fac0b" containerName="extract-content"
Mar 20 16:35:51 crc kubenswrapper[4730]: E0320 16:35:51.254350    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6063e942-1052-4d11-b5d3-22b8a54fac0b" containerName="extract-utilities"
Mar 20 16:35:51 crc kubenswrapper[4730]: I0320 16:35:51.254359    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="6063e942-1052-4d11-b5d3-22b8a54fac0b" containerName="extract-utilities"
Mar 20 16:35:51 crc kubenswrapper[4730]: I0320 16:35:51.254600    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="6063e942-1052-4d11-b5d3-22b8a54fac0b" containerName="registry-server"
Mar 20 16:35:51 crc kubenswrapper[4730]: I0320 16:35:51.256428    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qtb9x"
Mar 20 16:35:51 crc kubenswrapper[4730]: I0320 16:35:51.277752    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qtb9x"]
Mar 20 16:35:51 crc kubenswrapper[4730]: I0320 16:35:51.318589    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccc2048d-115d-4872-a28e-0a34a552b5fc-utilities\") pod \"community-operators-qtb9x\" (UID: \"ccc2048d-115d-4872-a28e-0a34a552b5fc\") " pod="openshift-marketplace/community-operators-qtb9x"
Mar 20 16:35:51 crc kubenswrapper[4730]: I0320 16:35:51.318694    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsh5z\" (UniqueName: \"kubernetes.io/projected/ccc2048d-115d-4872-a28e-0a34a552b5fc-kube-api-access-lsh5z\") pod \"community-operators-qtb9x\" (UID: \"ccc2048d-115d-4872-a28e-0a34a552b5fc\") " pod="openshift-marketplace/community-operators-qtb9x"
Mar 20 16:35:51 crc kubenswrapper[4730]: I0320 16:35:51.318857    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccc2048d-115d-4872-a28e-0a34a552b5fc-catalog-content\") pod \"community-operators-qtb9x\" (UID: \"ccc2048d-115d-4872-a28e-0a34a552b5fc\") " pod="openshift-marketplace/community-operators-qtb9x"
Mar 20 16:35:51 crc kubenswrapper[4730]: I0320 16:35:51.420571    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccc2048d-115d-4872-a28e-0a34a552b5fc-catalog-content\") pod \"community-operators-qtb9x\" (UID: \"ccc2048d-115d-4872-a28e-0a34a552b5fc\") " pod="openshift-marketplace/community-operators-qtb9x"
Mar 20 16:35:51 crc kubenswrapper[4730]: I0320 16:35:51.420643    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccc2048d-115d-4872-a28e-0a34a552b5fc-utilities\") pod \"community-operators-qtb9x\" (UID: \"ccc2048d-115d-4872-a28e-0a34a552b5fc\") " pod="openshift-marketplace/community-operators-qtb9x"
Mar 20 16:35:51 crc kubenswrapper[4730]: I0320 16:35:51.420717    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsh5z\" (UniqueName: \"kubernetes.io/projected/ccc2048d-115d-4872-a28e-0a34a552b5fc-kube-api-access-lsh5z\") pod \"community-operators-qtb9x\" (UID: \"ccc2048d-115d-4872-a28e-0a34a552b5fc\") " pod="openshift-marketplace/community-operators-qtb9x"
Mar 20 16:35:51 crc kubenswrapper[4730]: I0320 16:35:51.421191    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccc2048d-115d-4872-a28e-0a34a552b5fc-catalog-content\") pod \"community-operators-qtb9x\" (UID: \"ccc2048d-115d-4872-a28e-0a34a552b5fc\") " pod="openshift-marketplace/community-operators-qtb9x"
Mar 20 16:35:51 crc kubenswrapper[4730]: I0320 16:35:51.421191    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccc2048d-115d-4872-a28e-0a34a552b5fc-utilities\") pod \"community-operators-qtb9x\" (UID: \"ccc2048d-115d-4872-a28e-0a34a552b5fc\") " pod="openshift-marketplace/community-operators-qtb9x"
Mar 20 16:35:51 crc kubenswrapper[4730]: I0320 16:35:51.453096    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsh5z\" (UniqueName: \"kubernetes.io/projected/ccc2048d-115d-4872-a28e-0a34a552b5fc-kube-api-access-lsh5z\") pod \"community-operators-qtb9x\" (UID: \"ccc2048d-115d-4872-a28e-0a34a552b5fc\") " pod="openshift-marketplace/community-operators-qtb9x"
Mar 20 16:35:51 crc kubenswrapper[4730]: I0320 16:35:51.602278    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qtb9x"
Mar 20 16:35:52 crc kubenswrapper[4730]: I0320 16:35:52.112583    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qtb9x"]
Mar 20 16:35:52 crc kubenswrapper[4730]: I0320 16:35:52.871212    4730 generic.go:334] "Generic (PLEG): container finished" podID="ccc2048d-115d-4872-a28e-0a34a552b5fc" containerID="0f0d2d1d59884c6abc7a55b255dabe9f1d0344a1ac94f3f4341815bd4338056b" exitCode=0
Mar 20 16:35:52 crc kubenswrapper[4730]: I0320 16:35:52.871535    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qtb9x" event={"ID":"ccc2048d-115d-4872-a28e-0a34a552b5fc","Type":"ContainerDied","Data":"0f0d2d1d59884c6abc7a55b255dabe9f1d0344a1ac94f3f4341815bd4338056b"}
Mar 20 16:35:52 crc kubenswrapper[4730]: I0320 16:35:52.872433    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qtb9x" event={"ID":"ccc2048d-115d-4872-a28e-0a34a552b5fc","Type":"ContainerStarted","Data":"eff11742f0e763d24178c93c7f26dcee77c92125f068d32fc5f5646326a8fc8b"}
Mar 20 16:35:54 crc kubenswrapper[4730]: I0320 16:35:54.894957    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qtb9x" event={"ID":"ccc2048d-115d-4872-a28e-0a34a552b5fc","Type":"ContainerStarted","Data":"288dd475a1bf708e43583f729e6d81820e0c88b93d5fa513bc9fc1acb9234d16"}
Mar 20 16:35:56 crc kubenswrapper[4730]: I0320 16:35:56.533942    4730 scope.go:117] "RemoveContainer" containerID="0b0c904fd86d0d3236cc303102f0b32669f34531125a255be5df201b0dd8ef34"
Mar 20 16:35:56 crc kubenswrapper[4730]: E0320 16:35:56.534554    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:35:56 crc kubenswrapper[4730]: I0320 16:35:56.913999    4730 generic.go:334] "Generic (PLEG): container finished" podID="ccc2048d-115d-4872-a28e-0a34a552b5fc" containerID="288dd475a1bf708e43583f729e6d81820e0c88b93d5fa513bc9fc1acb9234d16" exitCode=0
Mar 20 16:35:56 crc kubenswrapper[4730]: I0320 16:35:56.914049    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qtb9x" event={"ID":"ccc2048d-115d-4872-a28e-0a34a552b5fc","Type":"ContainerDied","Data":"288dd475a1bf708e43583f729e6d81820e0c88b93d5fa513bc9fc1acb9234d16"}
Mar 20 16:35:57 crc kubenswrapper[4730]: I0320 16:35:57.926142    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qtb9x" event={"ID":"ccc2048d-115d-4872-a28e-0a34a552b5fc","Type":"ContainerStarted","Data":"1ea60d900e9ef26ef6983c268a7de372b9ed4b12de5cec649b6cba7470812fe5"}
Mar 20 16:35:57 crc kubenswrapper[4730]: I0320 16:35:57.950801    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qtb9x" podStartSLOduration=2.212236401 podStartE2EDuration="6.950779153s" podCreationTimestamp="2026-03-20 16:35:51 +0000 UTC" firstStartedPulling="2026-03-20 16:35:52.875112506 +0000 UTC m=+3412.088483875" lastFinishedPulling="2026-03-20 16:35:57.613655258 +0000 UTC m=+3416.827026627" observedRunningTime="2026-03-20 16:35:57.943193428 +0000 UTC m=+3417.156564807" watchObservedRunningTime="2026-03-20 16:35:57.950779153 +0000 UTC m=+3417.164150522"
Mar 20 16:36:00 crc kubenswrapper[4730]: I0320 16:36:00.144141    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567076-5zjgq"]
Mar 20 16:36:00 crc kubenswrapper[4730]: I0320 16:36:00.146097    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567076-5zjgq"
Mar 20 16:36:00 crc kubenswrapper[4730]: I0320 16:36:00.148067    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 16:36:00 crc kubenswrapper[4730]: I0320 16:36:00.148298    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:36:00 crc kubenswrapper[4730]: I0320 16:36:00.150765    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:36:00 crc kubenswrapper[4730]: I0320 16:36:00.154959    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567076-5zjgq"]
Mar 20 16:36:00 crc kubenswrapper[4730]: I0320 16:36:00.315648    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xsjd\" (UniqueName: \"kubernetes.io/projected/bcb7d22d-4ad0-4a9c-bf00-e966d1abb051-kube-api-access-9xsjd\") pod \"auto-csr-approver-29567076-5zjgq\" (UID: \"bcb7d22d-4ad0-4a9c-bf00-e966d1abb051\") " pod="openshift-infra/auto-csr-approver-29567076-5zjgq"
Mar 20 16:36:00 crc kubenswrapper[4730]: I0320 16:36:00.418102    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xsjd\" (UniqueName: \"kubernetes.io/projected/bcb7d22d-4ad0-4a9c-bf00-e966d1abb051-kube-api-access-9xsjd\") pod \"auto-csr-approver-29567076-5zjgq\" (UID: \"bcb7d22d-4ad0-4a9c-bf00-e966d1abb051\") " pod="openshift-infra/auto-csr-approver-29567076-5zjgq"
Mar 20 16:36:00 crc kubenswrapper[4730]: I0320 16:36:00.443808    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xsjd\" (UniqueName: \"kubernetes.io/projected/bcb7d22d-4ad0-4a9c-bf00-e966d1abb051-kube-api-access-9xsjd\") pod \"auto-csr-approver-29567076-5zjgq\" (UID: \"bcb7d22d-4ad0-4a9c-bf00-e966d1abb051\") " pod="openshift-infra/auto-csr-approver-29567076-5zjgq"
Mar 20 16:36:00 crc kubenswrapper[4730]: I0320 16:36:00.468891    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567076-5zjgq"
Mar 20 16:36:00 crc kubenswrapper[4730]: I0320 16:36:00.943077    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567076-5zjgq"]
Mar 20 16:36:00 crc kubenswrapper[4730]: I0320 16:36:00.969704    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567076-5zjgq" event={"ID":"bcb7d22d-4ad0-4a9c-bf00-e966d1abb051","Type":"ContainerStarted","Data":"90c9a5f2489ea935ff20be9876decd344799b1f14ecda31d27a44ae5ad46b793"}
Mar 20 16:36:01 crc kubenswrapper[4730]: I0320 16:36:01.602585    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qtb9x"
Mar 20 16:36:01 crc kubenswrapper[4730]: I0320 16:36:01.602634    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qtb9x"
Mar 20 16:36:02 crc kubenswrapper[4730]: I0320 16:36:02.660994    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-qtb9x" podUID="ccc2048d-115d-4872-a28e-0a34a552b5fc" containerName="registry-server" probeResult="failure" output=<
Mar 20 16:36:02 crc kubenswrapper[4730]:         timeout: failed to connect service ":50051" within 1s
Mar 20 16:36:02 crc kubenswrapper[4730]:  >
Mar 20 16:36:02 crc kubenswrapper[4730]: I0320 16:36:02.996261    4730 generic.go:334] "Generic (PLEG): container finished" podID="bcb7d22d-4ad0-4a9c-bf00-e966d1abb051" containerID="7abe93567f97d011f8ae053e88185c6004136b63e3d5f72b19beb707014bf434" exitCode=0
Mar 20 16:36:02 crc kubenswrapper[4730]: I0320 16:36:02.996743    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567076-5zjgq" event={"ID":"bcb7d22d-4ad0-4a9c-bf00-e966d1abb051","Type":"ContainerDied","Data":"7abe93567f97d011f8ae053e88185c6004136b63e3d5f72b19beb707014bf434"}
Mar 20 16:36:04 crc kubenswrapper[4730]: I0320 16:36:04.426773    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567076-5zjgq"
Mar 20 16:36:04 crc kubenswrapper[4730]: I0320 16:36:04.530664    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xsjd\" (UniqueName: \"kubernetes.io/projected/bcb7d22d-4ad0-4a9c-bf00-e966d1abb051-kube-api-access-9xsjd\") pod \"bcb7d22d-4ad0-4a9c-bf00-e966d1abb051\" (UID: \"bcb7d22d-4ad0-4a9c-bf00-e966d1abb051\") "
Mar 20 16:36:04 crc kubenswrapper[4730]: I0320 16:36:04.541574    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcb7d22d-4ad0-4a9c-bf00-e966d1abb051-kube-api-access-9xsjd" (OuterVolumeSpecName: "kube-api-access-9xsjd") pod "bcb7d22d-4ad0-4a9c-bf00-e966d1abb051" (UID: "bcb7d22d-4ad0-4a9c-bf00-e966d1abb051"). InnerVolumeSpecName "kube-api-access-9xsjd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:36:04 crc kubenswrapper[4730]: I0320 16:36:04.634002    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xsjd\" (UniqueName: \"kubernetes.io/projected/bcb7d22d-4ad0-4a9c-bf00-e966d1abb051-kube-api-access-9xsjd\") on node \"crc\" DevicePath \"\""
Mar 20 16:36:05 crc kubenswrapper[4730]: I0320 16:36:05.024855    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567076-5zjgq" event={"ID":"bcb7d22d-4ad0-4a9c-bf00-e966d1abb051","Type":"ContainerDied","Data":"90c9a5f2489ea935ff20be9876decd344799b1f14ecda31d27a44ae5ad46b793"}
Mar 20 16:36:05 crc kubenswrapper[4730]: I0320 16:36:05.025097    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90c9a5f2489ea935ff20be9876decd344799b1f14ecda31d27a44ae5ad46b793"
Mar 20 16:36:05 crc kubenswrapper[4730]: I0320 16:36:05.024900    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567076-5zjgq"
Mar 20 16:36:05 crc kubenswrapper[4730]: I0320 16:36:05.531213    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567070-h8qb5"]
Mar 20 16:36:05 crc kubenswrapper[4730]: I0320 16:36:05.543926    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567070-h8qb5"]
Mar 20 16:36:07 crc kubenswrapper[4730]: I0320 16:36:07.544589    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f50e1094-eded-4000-b7f3-29722d8ba695" path="/var/lib/kubelet/pods/f50e1094-eded-4000-b7f3-29722d8ba695/volumes"
Mar 20 16:36:09 crc kubenswrapper[4730]: I0320 16:36:09.537323    4730 scope.go:117] "RemoveContainer" containerID="0b0c904fd86d0d3236cc303102f0b32669f34531125a255be5df201b0dd8ef34"
Mar 20 16:36:09 crc kubenswrapper[4730]: E0320 16:36:09.538310    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:36:11 crc kubenswrapper[4730]: I0320 16:36:11.651693    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qtb9x"
Mar 20 16:36:11 crc kubenswrapper[4730]: I0320 16:36:11.703790    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qtb9x"
Mar 20 16:36:11 crc kubenswrapper[4730]: I0320 16:36:11.887663    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qtb9x"]
Mar 20 16:36:13 crc kubenswrapper[4730]: I0320 16:36:13.137373    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qtb9x" podUID="ccc2048d-115d-4872-a28e-0a34a552b5fc" containerName="registry-server" containerID="cri-o://1ea60d900e9ef26ef6983c268a7de372b9ed4b12de5cec649b6cba7470812fe5" gracePeriod=2
Mar 20 16:36:13 crc kubenswrapper[4730]: I0320 16:36:13.692391    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qtb9x"
Mar 20 16:36:13 crc kubenswrapper[4730]: I0320 16:36:13.707789    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccc2048d-115d-4872-a28e-0a34a552b5fc-utilities\") pod \"ccc2048d-115d-4872-a28e-0a34a552b5fc\" (UID: \"ccc2048d-115d-4872-a28e-0a34a552b5fc\") "
Mar 20 16:36:13 crc kubenswrapper[4730]: I0320 16:36:13.709179    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccc2048d-115d-4872-a28e-0a34a552b5fc-utilities" (OuterVolumeSpecName: "utilities") pod "ccc2048d-115d-4872-a28e-0a34a552b5fc" (UID: "ccc2048d-115d-4872-a28e-0a34a552b5fc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:36:13 crc kubenswrapper[4730]: I0320 16:36:13.712113    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsh5z\" (UniqueName: \"kubernetes.io/projected/ccc2048d-115d-4872-a28e-0a34a552b5fc-kube-api-access-lsh5z\") pod \"ccc2048d-115d-4872-a28e-0a34a552b5fc\" (UID: \"ccc2048d-115d-4872-a28e-0a34a552b5fc\") "
Mar 20 16:36:13 crc kubenswrapper[4730]: I0320 16:36:13.712885    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccc2048d-115d-4872-a28e-0a34a552b5fc-catalog-content\") pod \"ccc2048d-115d-4872-a28e-0a34a552b5fc\" (UID: \"ccc2048d-115d-4872-a28e-0a34a552b5fc\") "
Mar 20 16:36:13 crc kubenswrapper[4730]: I0320 16:36:13.714485    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccc2048d-115d-4872-a28e-0a34a552b5fc-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:36:13 crc kubenswrapper[4730]: I0320 16:36:13.722102    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccc2048d-115d-4872-a28e-0a34a552b5fc-kube-api-access-lsh5z" (OuterVolumeSpecName: "kube-api-access-lsh5z") pod "ccc2048d-115d-4872-a28e-0a34a552b5fc" (UID: "ccc2048d-115d-4872-a28e-0a34a552b5fc"). InnerVolumeSpecName "kube-api-access-lsh5z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:36:13 crc kubenswrapper[4730]: I0320 16:36:13.778173    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccc2048d-115d-4872-a28e-0a34a552b5fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ccc2048d-115d-4872-a28e-0a34a552b5fc" (UID: "ccc2048d-115d-4872-a28e-0a34a552b5fc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:36:13 crc kubenswrapper[4730]: I0320 16:36:13.816142    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccc2048d-115d-4872-a28e-0a34a552b5fc-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 16:36:13 crc kubenswrapper[4730]: I0320 16:36:13.816191    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsh5z\" (UniqueName: \"kubernetes.io/projected/ccc2048d-115d-4872-a28e-0a34a552b5fc-kube-api-access-lsh5z\") on node \"crc\" DevicePath \"\""
Mar 20 16:36:14 crc kubenswrapper[4730]: I0320 16:36:14.159535    4730 generic.go:334] "Generic (PLEG): container finished" podID="ccc2048d-115d-4872-a28e-0a34a552b5fc" containerID="1ea60d900e9ef26ef6983c268a7de372b9ed4b12de5cec649b6cba7470812fe5" exitCode=0
Mar 20 16:36:14 crc kubenswrapper[4730]: I0320 16:36:14.159596    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qtb9x" event={"ID":"ccc2048d-115d-4872-a28e-0a34a552b5fc","Type":"ContainerDied","Data":"1ea60d900e9ef26ef6983c268a7de372b9ed4b12de5cec649b6cba7470812fe5"}
Mar 20 16:36:14 crc kubenswrapper[4730]: I0320 16:36:14.159628    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qtb9x" event={"ID":"ccc2048d-115d-4872-a28e-0a34a552b5fc","Type":"ContainerDied","Data":"eff11742f0e763d24178c93c7f26dcee77c92125f068d32fc5f5646326a8fc8b"}
Mar 20 16:36:14 crc kubenswrapper[4730]: I0320 16:36:14.159647    4730 scope.go:117] "RemoveContainer" containerID="1ea60d900e9ef26ef6983c268a7de372b9ed4b12de5cec649b6cba7470812fe5"
Mar 20 16:36:14 crc kubenswrapper[4730]: I0320 16:36:14.159817    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qtb9x"
Mar 20 16:36:14 crc kubenswrapper[4730]: I0320 16:36:14.198169    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qtb9x"]
Mar 20 16:36:14 crc kubenswrapper[4730]: I0320 16:36:14.200540    4730 scope.go:117] "RemoveContainer" containerID="288dd475a1bf708e43583f729e6d81820e0c88b93d5fa513bc9fc1acb9234d16"
Mar 20 16:36:14 crc kubenswrapper[4730]: I0320 16:36:14.207838    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qtb9x"]
Mar 20 16:36:14 crc kubenswrapper[4730]: I0320 16:36:14.228307    4730 scope.go:117] "RemoveContainer" containerID="0f0d2d1d59884c6abc7a55b255dabe9f1d0344a1ac94f3f4341815bd4338056b"
Mar 20 16:36:14 crc kubenswrapper[4730]: I0320 16:36:14.276657    4730 scope.go:117] "RemoveContainer" containerID="1ea60d900e9ef26ef6983c268a7de372b9ed4b12de5cec649b6cba7470812fe5"
Mar 20 16:36:14 crc kubenswrapper[4730]: E0320 16:36:14.277129    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ea60d900e9ef26ef6983c268a7de372b9ed4b12de5cec649b6cba7470812fe5\": container with ID starting with 1ea60d900e9ef26ef6983c268a7de372b9ed4b12de5cec649b6cba7470812fe5 not found: ID does not exist" containerID="1ea60d900e9ef26ef6983c268a7de372b9ed4b12de5cec649b6cba7470812fe5"
Mar 20 16:36:14 crc kubenswrapper[4730]: I0320 16:36:14.277171    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ea60d900e9ef26ef6983c268a7de372b9ed4b12de5cec649b6cba7470812fe5"} err="failed to get container status \"1ea60d900e9ef26ef6983c268a7de372b9ed4b12de5cec649b6cba7470812fe5\": rpc error: code = NotFound desc = could not find container \"1ea60d900e9ef26ef6983c268a7de372b9ed4b12de5cec649b6cba7470812fe5\": container with ID starting with 1ea60d900e9ef26ef6983c268a7de372b9ed4b12de5cec649b6cba7470812fe5 not found: ID does not exist"
Mar 20 16:36:14 crc kubenswrapper[4730]: I0320 16:36:14.277201    4730 scope.go:117] "RemoveContainer" containerID="288dd475a1bf708e43583f729e6d81820e0c88b93d5fa513bc9fc1acb9234d16"
Mar 20 16:36:14 crc kubenswrapper[4730]: E0320 16:36:14.277554    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"288dd475a1bf708e43583f729e6d81820e0c88b93d5fa513bc9fc1acb9234d16\": container with ID starting with 288dd475a1bf708e43583f729e6d81820e0c88b93d5fa513bc9fc1acb9234d16 not found: ID does not exist" containerID="288dd475a1bf708e43583f729e6d81820e0c88b93d5fa513bc9fc1acb9234d16"
Mar 20 16:36:14 crc kubenswrapper[4730]: I0320 16:36:14.277595    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"288dd475a1bf708e43583f729e6d81820e0c88b93d5fa513bc9fc1acb9234d16"} err="failed to get container status \"288dd475a1bf708e43583f729e6d81820e0c88b93d5fa513bc9fc1acb9234d16\": rpc error: code = NotFound desc = could not find container \"288dd475a1bf708e43583f729e6d81820e0c88b93d5fa513bc9fc1acb9234d16\": container with ID starting with 288dd475a1bf708e43583f729e6d81820e0c88b93d5fa513bc9fc1acb9234d16 not found: ID does not exist"
Mar 20 16:36:14 crc kubenswrapper[4730]: I0320 16:36:14.277628    4730 scope.go:117] "RemoveContainer" containerID="0f0d2d1d59884c6abc7a55b255dabe9f1d0344a1ac94f3f4341815bd4338056b"
Mar 20 16:36:14 crc kubenswrapper[4730]: E0320 16:36:14.277930    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f0d2d1d59884c6abc7a55b255dabe9f1d0344a1ac94f3f4341815bd4338056b\": container with ID starting with 0f0d2d1d59884c6abc7a55b255dabe9f1d0344a1ac94f3f4341815bd4338056b not found: ID does not exist" containerID="0f0d2d1d59884c6abc7a55b255dabe9f1d0344a1ac94f3f4341815bd4338056b"
Mar 20 16:36:14 crc kubenswrapper[4730]: I0320 16:36:14.277962    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f0d2d1d59884c6abc7a55b255dabe9f1d0344a1ac94f3f4341815bd4338056b"} err="failed to get container status \"0f0d2d1d59884c6abc7a55b255dabe9f1d0344a1ac94f3f4341815bd4338056b\": rpc error: code = NotFound desc = could not find container \"0f0d2d1d59884c6abc7a55b255dabe9f1d0344a1ac94f3f4341815bd4338056b\": container with ID starting with 0f0d2d1d59884c6abc7a55b255dabe9f1d0344a1ac94f3f4341815bd4338056b not found: ID does not exist"
Mar 20 16:36:15 crc kubenswrapper[4730]: I0320 16:36:15.544863    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccc2048d-115d-4872-a28e-0a34a552b5fc" path="/var/lib/kubelet/pods/ccc2048d-115d-4872-a28e-0a34a552b5fc/volumes"
Mar 20 16:36:24 crc kubenswrapper[4730]: I0320 16:36:24.533360    4730 scope.go:117] "RemoveContainer" containerID="0b0c904fd86d0d3236cc303102f0b32669f34531125a255be5df201b0dd8ef34"
Mar 20 16:36:24 crc kubenswrapper[4730]: E0320 16:36:24.534322    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:36:36 crc kubenswrapper[4730]: I0320 16:36:36.533029    4730 scope.go:117] "RemoveContainer" containerID="0b0c904fd86d0d3236cc303102f0b32669f34531125a255be5df201b0dd8ef34"
Mar 20 16:36:36 crc kubenswrapper[4730]: E0320 16:36:36.533843    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:36:48 crc kubenswrapper[4730]: I0320 16:36:48.532750    4730 scope.go:117] "RemoveContainer" containerID="0b0c904fd86d0d3236cc303102f0b32669f34531125a255be5df201b0dd8ef34"
Mar 20 16:36:48 crc kubenswrapper[4730]: E0320 16:36:48.533411    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:36:50 crc kubenswrapper[4730]: I0320 16:36:50.665595    4730 scope.go:117] "RemoveContainer" containerID="20c367b1f4c39cf9e28d8318713966c9d59ce69c25b518b2e48b38d0f034fa5d"
Mar 20 16:37:03 crc kubenswrapper[4730]: I0320 16:37:03.533885    4730 scope.go:117] "RemoveContainer" containerID="0b0c904fd86d0d3236cc303102f0b32669f34531125a255be5df201b0dd8ef34"
Mar 20 16:37:03 crc kubenswrapper[4730]: E0320 16:37:03.535147    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:37:15 crc kubenswrapper[4730]: I0320 16:37:15.533595    4730 scope.go:117] "RemoveContainer" containerID="0b0c904fd86d0d3236cc303102f0b32669f34531125a255be5df201b0dd8ef34"
Mar 20 16:37:15 crc kubenswrapper[4730]: E0320 16:37:15.534369    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:37:28 crc kubenswrapper[4730]: I0320 16:37:28.534127    4730 scope.go:117] "RemoveContainer" containerID="0b0c904fd86d0d3236cc303102f0b32669f34531125a255be5df201b0dd8ef34"
Mar 20 16:37:28 crc kubenswrapper[4730]: E0320 16:37:28.536017    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:37:40 crc kubenswrapper[4730]: I0320 16:37:40.533129    4730 scope.go:117] "RemoveContainer" containerID="0b0c904fd86d0d3236cc303102f0b32669f34531125a255be5df201b0dd8ef34"
Mar 20 16:37:40 crc kubenswrapper[4730]: E0320 16:37:40.533935    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:37:51 crc kubenswrapper[4730]: I0320 16:37:51.541810    4730 scope.go:117] "RemoveContainer" containerID="0b0c904fd86d0d3236cc303102f0b32669f34531125a255be5df201b0dd8ef34"
Mar 20 16:37:52 crc kubenswrapper[4730]: I0320 16:37:52.045349    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerStarted","Data":"941dc6e58516657b43df3ac7120bf6da060d2f3b3a6c41da62694a0c2e80f6c6"}
Mar 20 16:38:00 crc kubenswrapper[4730]: I0320 16:38:00.159887    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567078-bsnqc"]
Mar 20 16:38:00 crc kubenswrapper[4730]: E0320 16:38:00.161188    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcb7d22d-4ad0-4a9c-bf00-e966d1abb051" containerName="oc"
Mar 20 16:38:00 crc kubenswrapper[4730]: I0320 16:38:00.161211    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcb7d22d-4ad0-4a9c-bf00-e966d1abb051" containerName="oc"
Mar 20 16:38:00 crc kubenswrapper[4730]: E0320 16:38:00.161244    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccc2048d-115d-4872-a28e-0a34a552b5fc" containerName="extract-utilities"
Mar 20 16:38:00 crc kubenswrapper[4730]: I0320 16:38:00.161278    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccc2048d-115d-4872-a28e-0a34a552b5fc" containerName="extract-utilities"
Mar 20 16:38:00 crc kubenswrapper[4730]: E0320 16:38:00.161299    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccc2048d-115d-4872-a28e-0a34a552b5fc" containerName="registry-server"
Mar 20 16:38:00 crc kubenswrapper[4730]: I0320 16:38:00.161311    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccc2048d-115d-4872-a28e-0a34a552b5fc" containerName="registry-server"
Mar 20 16:38:00 crc kubenswrapper[4730]: E0320 16:38:00.161327    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccc2048d-115d-4872-a28e-0a34a552b5fc" containerName="extract-content"
Mar 20 16:38:00 crc kubenswrapper[4730]: I0320 16:38:00.161336    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccc2048d-115d-4872-a28e-0a34a552b5fc" containerName="extract-content"
Mar 20 16:38:00 crc kubenswrapper[4730]: I0320 16:38:00.161597    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcb7d22d-4ad0-4a9c-bf00-e966d1abb051" containerName="oc"
Mar 20 16:38:00 crc kubenswrapper[4730]: I0320 16:38:00.161622    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccc2048d-115d-4872-a28e-0a34a552b5fc" containerName="registry-server"
Mar 20 16:38:00 crc kubenswrapper[4730]: I0320 16:38:00.162554    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567078-bsnqc"
Mar 20 16:38:00 crc kubenswrapper[4730]: I0320 16:38:00.165582    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:38:00 crc kubenswrapper[4730]: I0320 16:38:00.172862    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:38:00 crc kubenswrapper[4730]: I0320 16:38:00.173091    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 16:38:00 crc kubenswrapper[4730]: I0320 16:38:00.175841    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567078-bsnqc"]
Mar 20 16:38:00 crc kubenswrapper[4730]: I0320 16:38:00.222140    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6kk4\" (UniqueName: \"kubernetes.io/projected/7178ffb9-4891-485e-b3d6-d7fcc8f22ef4-kube-api-access-m6kk4\") pod \"auto-csr-approver-29567078-bsnqc\" (UID: \"7178ffb9-4891-485e-b3d6-d7fcc8f22ef4\") " pod="openshift-infra/auto-csr-approver-29567078-bsnqc"
Mar 20 16:38:00 crc kubenswrapper[4730]: I0320 16:38:00.324853    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6kk4\" (UniqueName: \"kubernetes.io/projected/7178ffb9-4891-485e-b3d6-d7fcc8f22ef4-kube-api-access-m6kk4\") pod \"auto-csr-approver-29567078-bsnqc\" (UID: \"7178ffb9-4891-485e-b3d6-d7fcc8f22ef4\") " pod="openshift-infra/auto-csr-approver-29567078-bsnqc"
Mar 20 16:38:00 crc kubenswrapper[4730]: I0320 16:38:00.343532    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6kk4\" (UniqueName: \"kubernetes.io/projected/7178ffb9-4891-485e-b3d6-d7fcc8f22ef4-kube-api-access-m6kk4\") pod \"auto-csr-approver-29567078-bsnqc\" (UID: \"7178ffb9-4891-485e-b3d6-d7fcc8f22ef4\") " pod="openshift-infra/auto-csr-approver-29567078-bsnqc"
Mar 20 16:38:00 crc kubenswrapper[4730]: I0320 16:38:00.485191    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567078-bsnqc"
Mar 20 16:38:00 crc kubenswrapper[4730]: I0320 16:38:00.933408    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567078-bsnqc"]
Mar 20 16:38:00 crc kubenswrapper[4730]: W0320 16:38:00.935547    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7178ffb9_4891_485e_b3d6_d7fcc8f22ef4.slice/crio-a3a07fe20e20d11ec568a974c1ea93616cd7badad1aece21767ed895b9bed784 WatchSource:0}: Error finding container a3a07fe20e20d11ec568a974c1ea93616cd7badad1aece21767ed895b9bed784: Status 404 returned error can't find the container with id a3a07fe20e20d11ec568a974c1ea93616cd7badad1aece21767ed895b9bed784
Mar 20 16:38:01 crc kubenswrapper[4730]: I0320 16:38:01.128478    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567078-bsnqc" event={"ID":"7178ffb9-4891-485e-b3d6-d7fcc8f22ef4","Type":"ContainerStarted","Data":"a3a07fe20e20d11ec568a974c1ea93616cd7badad1aece21767ed895b9bed784"}
Mar 20 16:38:02 crc kubenswrapper[4730]: I0320 16:38:02.146044    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567078-bsnqc" event={"ID":"7178ffb9-4891-485e-b3d6-d7fcc8f22ef4","Type":"ContainerStarted","Data":"3b2c3b7b49826d995d5d767d6f87b9a288d68f48a6bee9638eee715a068be2d7"}
Mar 20 16:38:02 crc kubenswrapper[4730]: I0320 16:38:02.165717    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567078-bsnqc" podStartSLOduration=1.251741025 podStartE2EDuration="2.165698402s" podCreationTimestamp="2026-03-20 16:38:00 +0000 UTC" firstStartedPulling="2026-03-20 16:38:00.938334738 +0000 UTC m=+3540.151706107" lastFinishedPulling="2026-03-20 16:38:01.852292125 +0000 UTC m=+3541.065663484" observedRunningTime="2026-03-20 16:38:02.158963702 +0000 UTC m=+3541.372335081" watchObservedRunningTime="2026-03-20 16:38:02.165698402 +0000 UTC m=+3541.379069771"
Mar 20 16:38:03 crc kubenswrapper[4730]: I0320 16:38:03.158805    4730 generic.go:334] "Generic (PLEG): container finished" podID="7178ffb9-4891-485e-b3d6-d7fcc8f22ef4" containerID="3b2c3b7b49826d995d5d767d6f87b9a288d68f48a6bee9638eee715a068be2d7" exitCode=0
Mar 20 16:38:03 crc kubenswrapper[4730]: I0320 16:38:03.158866    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567078-bsnqc" event={"ID":"7178ffb9-4891-485e-b3d6-d7fcc8f22ef4","Type":"ContainerDied","Data":"3b2c3b7b49826d995d5d767d6f87b9a288d68f48a6bee9638eee715a068be2d7"}
Mar 20 16:38:04 crc kubenswrapper[4730]: I0320 16:38:04.562061    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567078-bsnqc"
Mar 20 16:38:04 crc kubenswrapper[4730]: I0320 16:38:04.614649    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6kk4\" (UniqueName: \"kubernetes.io/projected/7178ffb9-4891-485e-b3d6-d7fcc8f22ef4-kube-api-access-m6kk4\") pod \"7178ffb9-4891-485e-b3d6-d7fcc8f22ef4\" (UID: \"7178ffb9-4891-485e-b3d6-d7fcc8f22ef4\") "
Mar 20 16:38:04 crc kubenswrapper[4730]: I0320 16:38:04.621430    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7178ffb9-4891-485e-b3d6-d7fcc8f22ef4-kube-api-access-m6kk4" (OuterVolumeSpecName: "kube-api-access-m6kk4") pod "7178ffb9-4891-485e-b3d6-d7fcc8f22ef4" (UID: "7178ffb9-4891-485e-b3d6-d7fcc8f22ef4"). InnerVolumeSpecName "kube-api-access-m6kk4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:38:04 crc kubenswrapper[4730]: I0320 16:38:04.718239    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6kk4\" (UniqueName: \"kubernetes.io/projected/7178ffb9-4891-485e-b3d6-d7fcc8f22ef4-kube-api-access-m6kk4\") on node \"crc\" DevicePath \"\""
Mar 20 16:38:05 crc kubenswrapper[4730]: I0320 16:38:05.181798    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567078-bsnqc" event={"ID":"7178ffb9-4891-485e-b3d6-d7fcc8f22ef4","Type":"ContainerDied","Data":"a3a07fe20e20d11ec568a974c1ea93616cd7badad1aece21767ed895b9bed784"}
Mar 20 16:38:05 crc kubenswrapper[4730]: I0320 16:38:05.181838    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3a07fe20e20d11ec568a974c1ea93616cd7badad1aece21767ed895b9bed784"
Mar 20 16:38:05 crc kubenswrapper[4730]: I0320 16:38:05.181880    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567078-bsnqc"
Mar 20 16:38:05 crc kubenswrapper[4730]: I0320 16:38:05.650324    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567072-jzsw2"]
Mar 20 16:38:05 crc kubenswrapper[4730]: I0320 16:38:05.662270    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567072-jzsw2"]
Mar 20 16:38:07 crc kubenswrapper[4730]: I0320 16:38:07.603975    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cadf5c48-6db4-421c-977d-1216334a9383" path="/var/lib/kubelet/pods/cadf5c48-6db4-421c-977d-1216334a9383/volumes"
Mar 20 16:38:07 crc kubenswrapper[4730]: I0320 16:38:07.669849    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2ng72"]
Mar 20 16:38:07 crc kubenswrapper[4730]: E0320 16:38:07.670290    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7178ffb9-4891-485e-b3d6-d7fcc8f22ef4" containerName="oc"
Mar 20 16:38:07 crc kubenswrapper[4730]: I0320 16:38:07.670306    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="7178ffb9-4891-485e-b3d6-d7fcc8f22ef4" containerName="oc"
Mar 20 16:38:07 crc kubenswrapper[4730]: I0320 16:38:07.670495    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="7178ffb9-4891-485e-b3d6-d7fcc8f22ef4" containerName="oc"
Mar 20 16:38:07 crc kubenswrapper[4730]: I0320 16:38:07.679938    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2ng72"
Mar 20 16:38:07 crc kubenswrapper[4730]: I0320 16:38:07.693701    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2ng72"]
Mar 20 16:38:07 crc kubenswrapper[4730]: I0320 16:38:07.781637    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab-catalog-content\") pod \"redhat-operators-2ng72\" (UID: \"6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab\") " pod="openshift-marketplace/redhat-operators-2ng72"
Mar 20 16:38:07 crc kubenswrapper[4730]: I0320 16:38:07.781743    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcflj\" (UniqueName: \"kubernetes.io/projected/6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab-kube-api-access-mcflj\") pod \"redhat-operators-2ng72\" (UID: \"6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab\") " pod="openshift-marketplace/redhat-operators-2ng72"
Mar 20 16:38:07 crc kubenswrapper[4730]: I0320 16:38:07.781891    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab-utilities\") pod \"redhat-operators-2ng72\" (UID: \"6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab\") " pod="openshift-marketplace/redhat-operators-2ng72"
Mar 20 16:38:07 crc kubenswrapper[4730]: I0320 16:38:07.883774    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab-catalog-content\") pod \"redhat-operators-2ng72\" (UID: \"6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab\") " pod="openshift-marketplace/redhat-operators-2ng72"
Mar 20 16:38:07 crc kubenswrapper[4730]: I0320 16:38:07.883879    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcflj\" (UniqueName: \"kubernetes.io/projected/6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab-kube-api-access-mcflj\") pod \"redhat-operators-2ng72\" (UID: \"6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab\") " pod="openshift-marketplace/redhat-operators-2ng72"
Mar 20 16:38:07 crc kubenswrapper[4730]: I0320 16:38:07.884010    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab-utilities\") pod \"redhat-operators-2ng72\" (UID: \"6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab\") " pod="openshift-marketplace/redhat-operators-2ng72"
Mar 20 16:38:07 crc kubenswrapper[4730]: I0320 16:38:07.884377    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab-catalog-content\") pod \"redhat-operators-2ng72\" (UID: \"6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab\") " pod="openshift-marketplace/redhat-operators-2ng72"
Mar 20 16:38:07 crc kubenswrapper[4730]: I0320 16:38:07.884510    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab-utilities\") pod \"redhat-operators-2ng72\" (UID: \"6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab\") " pod="openshift-marketplace/redhat-operators-2ng72"
Mar 20 16:38:07 crc kubenswrapper[4730]: I0320 16:38:07.912545    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcflj\" (UniqueName: \"kubernetes.io/projected/6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab-kube-api-access-mcflj\") pod \"redhat-operators-2ng72\" (UID: \"6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab\") " pod="openshift-marketplace/redhat-operators-2ng72"
Mar 20 16:38:08 crc kubenswrapper[4730]: I0320 16:38:08.040458    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2ng72"
Mar 20 16:38:08 crc kubenswrapper[4730]: W0320 16:38:08.531973    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e2c22ab_f597_4ec1_b66d_a8b80ddae7ab.slice/crio-2a8b57ef0999e1507449e57f62b329f91fdc4c79646cc432d53ec5cf91fac4d1 WatchSource:0}: Error finding container 2a8b57ef0999e1507449e57f62b329f91fdc4c79646cc432d53ec5cf91fac4d1: Status 404 returned error can't find the container with id 2a8b57ef0999e1507449e57f62b329f91fdc4c79646cc432d53ec5cf91fac4d1
Mar 20 16:38:08 crc kubenswrapper[4730]: I0320 16:38:08.531986    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2ng72"]
Mar 20 16:38:09 crc kubenswrapper[4730]: I0320 16:38:09.219528    4730 generic.go:334] "Generic (PLEG): container finished" podID="6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab" containerID="41fb40d724ce9bcdac4ab8d8841562939f9e3bf8f3ffd4e8bc708e5977e01b6f" exitCode=0
Mar 20 16:38:09 crc kubenswrapper[4730]: I0320 16:38:09.219635    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2ng72" event={"ID":"6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab","Type":"ContainerDied","Data":"41fb40d724ce9bcdac4ab8d8841562939f9e3bf8f3ffd4e8bc708e5977e01b6f"}
Mar 20 16:38:09 crc kubenswrapper[4730]: I0320 16:38:09.220084    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2ng72" event={"ID":"6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab","Type":"ContainerStarted","Data":"2a8b57ef0999e1507449e57f62b329f91fdc4c79646cc432d53ec5cf91fac4d1"}
Mar 20 16:38:10 crc kubenswrapper[4730]: I0320 16:38:10.237147    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2ng72" event={"ID":"6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab","Type":"ContainerStarted","Data":"fa67c25c48f3868aa77adc40c5b76a87682169dff1c5d76306550f084ee3a7c7"}
Mar 20 16:38:16 crc kubenswrapper[4730]: I0320 16:38:16.295665    4730 generic.go:334] "Generic (PLEG): container finished" podID="6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab" containerID="fa67c25c48f3868aa77adc40c5b76a87682169dff1c5d76306550f084ee3a7c7" exitCode=0
Mar 20 16:38:16 crc kubenswrapper[4730]: I0320 16:38:16.295749    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2ng72" event={"ID":"6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab","Type":"ContainerDied","Data":"fa67c25c48f3868aa77adc40c5b76a87682169dff1c5d76306550f084ee3a7c7"}
Mar 20 16:38:17 crc kubenswrapper[4730]: I0320 16:38:17.308010    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2ng72" event={"ID":"6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab","Type":"ContainerStarted","Data":"1b4d30d81ff77e2ae0201a4ec1d98e674f270f1721ed98bd7f2f431853c8df6f"}
Mar 20 16:38:17 crc kubenswrapper[4730]: I0320 16:38:17.333679    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2ng72" podStartSLOduration=2.8249934740000002 podStartE2EDuration="10.333659933s" podCreationTimestamp="2026-03-20 16:38:07 +0000 UTC" firstStartedPulling="2026-03-20 16:38:09.221403347 +0000 UTC m=+3548.434774716" lastFinishedPulling="2026-03-20 16:38:16.730069796 +0000 UTC m=+3555.943441175" observedRunningTime="2026-03-20 16:38:17.328805106 +0000 UTC m=+3556.542176485" watchObservedRunningTime="2026-03-20 16:38:17.333659933 +0000 UTC m=+3556.547031302"
Mar 20 16:38:18 crc kubenswrapper[4730]: I0320 16:38:18.040684    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2ng72"
Mar 20 16:38:18 crc kubenswrapper[4730]: I0320 16:38:18.041095    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2ng72"
Mar 20 16:38:19 crc kubenswrapper[4730]: I0320 16:38:19.138714    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2ng72" podUID="6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab" containerName="registry-server" probeResult="failure" output=<
Mar 20 16:38:19 crc kubenswrapper[4730]:         timeout: failed to connect service ":50051" within 1s
Mar 20 16:38:19 crc kubenswrapper[4730]:  >
Mar 20 16:38:28 crc kubenswrapper[4730]: I0320 16:38:28.099644    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2ng72"
Mar 20 16:38:28 crc kubenswrapper[4730]: I0320 16:38:28.165465    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2ng72"
Mar 20 16:38:28 crc kubenswrapper[4730]: I0320 16:38:28.339117    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2ng72"]
Mar 20 16:38:29 crc kubenswrapper[4730]: I0320 16:38:29.423513    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2ng72" podUID="6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab" containerName="registry-server" containerID="cri-o://1b4d30d81ff77e2ae0201a4ec1d98e674f270f1721ed98bd7f2f431853c8df6f" gracePeriod=2
Mar 20 16:38:29 crc kubenswrapper[4730]: I0320 16:38:29.929613    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2ng72"
Mar 20 16:38:29 crc kubenswrapper[4730]: I0320 16:38:29.942967    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab-utilities\") pod \"6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab\" (UID: \"6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab\") "
Mar 20 16:38:29 crc kubenswrapper[4730]: I0320 16:38:29.943168    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcflj\" (UniqueName: \"kubernetes.io/projected/6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab-kube-api-access-mcflj\") pod \"6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab\" (UID: \"6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab\") "
Mar 20 16:38:29 crc kubenswrapper[4730]: I0320 16:38:29.943626    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab-catalog-content\") pod \"6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab\" (UID: \"6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab\") "
Mar 20 16:38:29 crc kubenswrapper[4730]: I0320 16:38:29.943933    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab-utilities" (OuterVolumeSpecName: "utilities") pod "6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab" (UID: "6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:38:29 crc kubenswrapper[4730]: I0320 16:38:29.952219    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab-kube-api-access-mcflj" (OuterVolumeSpecName: "kube-api-access-mcflj") pod "6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab" (UID: "6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab"). InnerVolumeSpecName "kube-api-access-mcflj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:38:29 crc kubenswrapper[4730]: I0320 16:38:29.972427    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:38:29 crc kubenswrapper[4730]: I0320 16:38:29.972486    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcflj\" (UniqueName: \"kubernetes.io/projected/6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab-kube-api-access-mcflj\") on node \"crc\" DevicePath \"\""
Mar 20 16:38:30 crc kubenswrapper[4730]: I0320 16:38:30.107701    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab" (UID: "6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:38:30 crc kubenswrapper[4730]: I0320 16:38:30.176852    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 16:38:30 crc kubenswrapper[4730]: I0320 16:38:30.438023    4730 generic.go:334] "Generic (PLEG): container finished" podID="6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab" containerID="1b4d30d81ff77e2ae0201a4ec1d98e674f270f1721ed98bd7f2f431853c8df6f" exitCode=0
Mar 20 16:38:30 crc kubenswrapper[4730]: I0320 16:38:30.438066    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2ng72" event={"ID":"6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab","Type":"ContainerDied","Data":"1b4d30d81ff77e2ae0201a4ec1d98e674f270f1721ed98bd7f2f431853c8df6f"}
Mar 20 16:38:30 crc kubenswrapper[4730]: I0320 16:38:30.438093    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2ng72" event={"ID":"6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab","Type":"ContainerDied","Data":"2a8b57ef0999e1507449e57f62b329f91fdc4c79646cc432d53ec5cf91fac4d1"}
Mar 20 16:38:30 crc kubenswrapper[4730]: I0320 16:38:30.438111    4730 scope.go:117] "RemoveContainer" containerID="1b4d30d81ff77e2ae0201a4ec1d98e674f270f1721ed98bd7f2f431853c8df6f"
Mar 20 16:38:30 crc kubenswrapper[4730]: I0320 16:38:30.438121    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2ng72"
Mar 20 16:38:30 crc kubenswrapper[4730]: I0320 16:38:30.462377    4730 scope.go:117] "RemoveContainer" containerID="fa67c25c48f3868aa77adc40c5b76a87682169dff1c5d76306550f084ee3a7c7"
Mar 20 16:38:30 crc kubenswrapper[4730]: I0320 16:38:30.484155    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2ng72"]
Mar 20 16:38:30 crc kubenswrapper[4730]: I0320 16:38:30.486163    4730 scope.go:117] "RemoveContainer" containerID="41fb40d724ce9bcdac4ab8d8841562939f9e3bf8f3ffd4e8bc708e5977e01b6f"
Mar 20 16:38:30 crc kubenswrapper[4730]: I0320 16:38:30.493377    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2ng72"]
Mar 20 16:38:30 crc kubenswrapper[4730]: I0320 16:38:30.528932    4730 scope.go:117] "RemoveContainer" containerID="1b4d30d81ff77e2ae0201a4ec1d98e674f270f1721ed98bd7f2f431853c8df6f"
Mar 20 16:38:30 crc kubenswrapper[4730]: E0320 16:38:30.529359    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b4d30d81ff77e2ae0201a4ec1d98e674f270f1721ed98bd7f2f431853c8df6f\": container with ID starting with 1b4d30d81ff77e2ae0201a4ec1d98e674f270f1721ed98bd7f2f431853c8df6f not found: ID does not exist" containerID="1b4d30d81ff77e2ae0201a4ec1d98e674f270f1721ed98bd7f2f431853c8df6f"
Mar 20 16:38:30 crc kubenswrapper[4730]: I0320 16:38:30.529422    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b4d30d81ff77e2ae0201a4ec1d98e674f270f1721ed98bd7f2f431853c8df6f"} err="failed to get container status \"1b4d30d81ff77e2ae0201a4ec1d98e674f270f1721ed98bd7f2f431853c8df6f\": rpc error: code = NotFound desc = could not find container \"1b4d30d81ff77e2ae0201a4ec1d98e674f270f1721ed98bd7f2f431853c8df6f\": container with ID starting with 1b4d30d81ff77e2ae0201a4ec1d98e674f270f1721ed98bd7f2f431853c8df6f not found: ID does not exist"
Mar 20 16:38:30 crc kubenswrapper[4730]: I0320 16:38:30.529461    4730 scope.go:117] "RemoveContainer" containerID="fa67c25c48f3868aa77adc40c5b76a87682169dff1c5d76306550f084ee3a7c7"
Mar 20 16:38:30 crc kubenswrapper[4730]: E0320 16:38:30.529773    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa67c25c48f3868aa77adc40c5b76a87682169dff1c5d76306550f084ee3a7c7\": container with ID starting with fa67c25c48f3868aa77adc40c5b76a87682169dff1c5d76306550f084ee3a7c7 not found: ID does not exist" containerID="fa67c25c48f3868aa77adc40c5b76a87682169dff1c5d76306550f084ee3a7c7"
Mar 20 16:38:30 crc kubenswrapper[4730]: I0320 16:38:30.529822    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa67c25c48f3868aa77adc40c5b76a87682169dff1c5d76306550f084ee3a7c7"} err="failed to get container status \"fa67c25c48f3868aa77adc40c5b76a87682169dff1c5d76306550f084ee3a7c7\": rpc error: code = NotFound desc = could not find container \"fa67c25c48f3868aa77adc40c5b76a87682169dff1c5d76306550f084ee3a7c7\": container with ID starting with fa67c25c48f3868aa77adc40c5b76a87682169dff1c5d76306550f084ee3a7c7 not found: ID does not exist"
Mar 20 16:38:30 crc kubenswrapper[4730]: I0320 16:38:30.529857    4730 scope.go:117] "RemoveContainer" containerID="41fb40d724ce9bcdac4ab8d8841562939f9e3bf8f3ffd4e8bc708e5977e01b6f"
Mar 20 16:38:30 crc kubenswrapper[4730]: E0320 16:38:30.530152    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41fb40d724ce9bcdac4ab8d8841562939f9e3bf8f3ffd4e8bc708e5977e01b6f\": container with ID starting with 41fb40d724ce9bcdac4ab8d8841562939f9e3bf8f3ffd4e8bc708e5977e01b6f not found: ID does not exist" containerID="41fb40d724ce9bcdac4ab8d8841562939f9e3bf8f3ffd4e8bc708e5977e01b6f"
Mar 20 16:38:30 crc kubenswrapper[4730]: I0320 16:38:30.530200    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41fb40d724ce9bcdac4ab8d8841562939f9e3bf8f3ffd4e8bc708e5977e01b6f"} err="failed to get container status \"41fb40d724ce9bcdac4ab8d8841562939f9e3bf8f3ffd4e8bc708e5977e01b6f\": rpc error: code = NotFound desc = could not find container \"41fb40d724ce9bcdac4ab8d8841562939f9e3bf8f3ffd4e8bc708e5977e01b6f\": container with ID starting with 41fb40d724ce9bcdac4ab8d8841562939f9e3bf8f3ffd4e8bc708e5977e01b6f not found: ID does not exist"
Mar 20 16:38:31 crc kubenswrapper[4730]: I0320 16:38:31.544396    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab" path="/var/lib/kubelet/pods/6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab/volumes"
Mar 20 16:38:33 crc kubenswrapper[4730]: I0320 16:38:33.549266    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sfrtb"]
Mar 20 16:38:33 crc kubenswrapper[4730]: E0320 16:38:33.549963    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab" containerName="extract-content"
Mar 20 16:38:33 crc kubenswrapper[4730]: I0320 16:38:33.549977    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab" containerName="extract-content"
Mar 20 16:38:33 crc kubenswrapper[4730]: E0320 16:38:33.549989    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab" containerName="registry-server"
Mar 20 16:38:33 crc kubenswrapper[4730]: I0320 16:38:33.549996    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab" containerName="registry-server"
Mar 20 16:38:33 crc kubenswrapper[4730]: E0320 16:38:33.550028    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab" containerName="extract-utilities"
Mar 20 16:38:33 crc kubenswrapper[4730]: I0320 16:38:33.550035    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab" containerName="extract-utilities"
Mar 20 16:38:33 crc kubenswrapper[4730]: I0320 16:38:33.550214    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e2c22ab-f597-4ec1-b66d-a8b80ddae7ab" containerName="registry-server"
Mar 20 16:38:33 crc kubenswrapper[4730]: I0320 16:38:33.554508    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sfrtb"
Mar 20 16:38:33 crc kubenswrapper[4730]: I0320 16:38:33.569177    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sfrtb"]
Mar 20 16:38:33 crc kubenswrapper[4730]: I0320 16:38:33.645837    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68e77c65-6357-459f-9bf2-fe45499cd296-utilities\") pod \"certified-operators-sfrtb\" (UID: \"68e77c65-6357-459f-9bf2-fe45499cd296\") " pod="openshift-marketplace/certified-operators-sfrtb"
Mar 20 16:38:33 crc kubenswrapper[4730]: I0320 16:38:33.645935    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68e77c65-6357-459f-9bf2-fe45499cd296-catalog-content\") pod \"certified-operators-sfrtb\" (UID: \"68e77c65-6357-459f-9bf2-fe45499cd296\") " pod="openshift-marketplace/certified-operators-sfrtb"
Mar 20 16:38:33 crc kubenswrapper[4730]: I0320 16:38:33.646107    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjswk\" (UniqueName: \"kubernetes.io/projected/68e77c65-6357-459f-9bf2-fe45499cd296-kube-api-access-cjswk\") pod \"certified-operators-sfrtb\" (UID: \"68e77c65-6357-459f-9bf2-fe45499cd296\") " pod="openshift-marketplace/certified-operators-sfrtb"
Mar 20 16:38:33 crc kubenswrapper[4730]: I0320 16:38:33.749033    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68e77c65-6357-459f-9bf2-fe45499cd296-catalog-content\") pod \"certified-operators-sfrtb\" (UID: \"68e77c65-6357-459f-9bf2-fe45499cd296\") " pod="openshift-marketplace/certified-operators-sfrtb"
Mar 20 16:38:33 crc kubenswrapper[4730]: I0320 16:38:33.749199    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjswk\" (UniqueName: \"kubernetes.io/projected/68e77c65-6357-459f-9bf2-fe45499cd296-kube-api-access-cjswk\") pod \"certified-operators-sfrtb\" (UID: \"68e77c65-6357-459f-9bf2-fe45499cd296\") " pod="openshift-marketplace/certified-operators-sfrtb"
Mar 20 16:38:33 crc kubenswrapper[4730]: I0320 16:38:33.749307    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68e77c65-6357-459f-9bf2-fe45499cd296-utilities\") pod \"certified-operators-sfrtb\" (UID: \"68e77c65-6357-459f-9bf2-fe45499cd296\") " pod="openshift-marketplace/certified-operators-sfrtb"
Mar 20 16:38:33 crc kubenswrapper[4730]: I0320 16:38:33.749642    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68e77c65-6357-459f-9bf2-fe45499cd296-catalog-content\") pod \"certified-operators-sfrtb\" (UID: \"68e77c65-6357-459f-9bf2-fe45499cd296\") " pod="openshift-marketplace/certified-operators-sfrtb"
Mar 20 16:38:33 crc kubenswrapper[4730]: I0320 16:38:33.749674    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68e77c65-6357-459f-9bf2-fe45499cd296-utilities\") pod \"certified-operators-sfrtb\" (UID: \"68e77c65-6357-459f-9bf2-fe45499cd296\") " pod="openshift-marketplace/certified-operators-sfrtb"
Mar 20 16:38:33 crc kubenswrapper[4730]: I0320 16:38:33.774138    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjswk\" (UniqueName: \"kubernetes.io/projected/68e77c65-6357-459f-9bf2-fe45499cd296-kube-api-access-cjswk\") pod \"certified-operators-sfrtb\" (UID: \"68e77c65-6357-459f-9bf2-fe45499cd296\") " pod="openshift-marketplace/certified-operators-sfrtb"
Mar 20 16:38:33 crc kubenswrapper[4730]: I0320 16:38:33.882548    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sfrtb"
Mar 20 16:38:34 crc kubenswrapper[4730]: I0320 16:38:34.442382    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sfrtb"]
Mar 20 16:38:34 crc kubenswrapper[4730]: I0320 16:38:34.476079    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfrtb" event={"ID":"68e77c65-6357-459f-9bf2-fe45499cd296","Type":"ContainerStarted","Data":"8326f9a3070f09a5c1a65a65c0238d53148283f463d30f8f66946ba3d27b4572"}
Mar 20 16:38:35 crc kubenswrapper[4730]: I0320 16:38:35.489699    4730 generic.go:334] "Generic (PLEG): container finished" podID="68e77c65-6357-459f-9bf2-fe45499cd296" containerID="2c0196867229753a5d5133a1c1a6d4ae5892cbd2d676b231f509ca4edf696469" exitCode=0
Mar 20 16:38:35 crc kubenswrapper[4730]: I0320 16:38:35.489756    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfrtb" event={"ID":"68e77c65-6357-459f-9bf2-fe45499cd296","Type":"ContainerDied","Data":"2c0196867229753a5d5133a1c1a6d4ae5892cbd2d676b231f509ca4edf696469"}
Mar 20 16:38:37 crc kubenswrapper[4730]: I0320 16:38:37.510301    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfrtb" event={"ID":"68e77c65-6357-459f-9bf2-fe45499cd296","Type":"ContainerStarted","Data":"d569825b7078b64e8055d1b820560032b6386e4536c814bef30f2c5e7dab2c36"}
Mar 20 16:38:38 crc kubenswrapper[4730]: I0320 16:38:38.520743    4730 generic.go:334] "Generic (PLEG): container finished" podID="68e77c65-6357-459f-9bf2-fe45499cd296" containerID="d569825b7078b64e8055d1b820560032b6386e4536c814bef30f2c5e7dab2c36" exitCode=0
Mar 20 16:38:38 crc kubenswrapper[4730]: I0320 16:38:38.520798    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfrtb" event={"ID":"68e77c65-6357-459f-9bf2-fe45499cd296","Type":"ContainerDied","Data":"d569825b7078b64e8055d1b820560032b6386e4536c814bef30f2c5e7dab2c36"}
Mar 20 16:38:39 crc kubenswrapper[4730]: I0320 16:38:39.543724    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfrtb" event={"ID":"68e77c65-6357-459f-9bf2-fe45499cd296","Type":"ContainerStarted","Data":"99b4b38b434690826e45d27a5d067e4e0e256417cb7003c55df7e43f31c807a8"}
Mar 20 16:38:39 crc kubenswrapper[4730]: I0320 16:38:39.554139    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sfrtb" podStartSLOduration=3.019910419 podStartE2EDuration="6.554115252s" podCreationTimestamp="2026-03-20 16:38:33 +0000 UTC" firstStartedPulling="2026-03-20 16:38:35.492200618 +0000 UTC m=+3574.705571987" lastFinishedPulling="2026-03-20 16:38:39.026405451 +0000 UTC m=+3578.239776820" observedRunningTime="2026-03-20 16:38:39.549842421 +0000 UTC m=+3578.763213790" watchObservedRunningTime="2026-03-20 16:38:39.554115252 +0000 UTC m=+3578.767486621"
Mar 20 16:38:43 crc kubenswrapper[4730]: I0320 16:38:43.883413    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sfrtb"
Mar 20 16:38:43 crc kubenswrapper[4730]: I0320 16:38:43.884016    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sfrtb"
Mar 20 16:38:43 crc kubenswrapper[4730]: I0320 16:38:43.929938    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sfrtb"
Mar 20 16:38:44 crc kubenswrapper[4730]: I0320 16:38:44.661807    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sfrtb"
Mar 20 16:38:44 crc kubenswrapper[4730]: I0320 16:38:44.715911    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sfrtb"]
Mar 20 16:38:46 crc kubenswrapper[4730]: I0320 16:38:46.627724    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sfrtb" podUID="68e77c65-6357-459f-9bf2-fe45499cd296" containerName="registry-server" containerID="cri-o://99b4b38b434690826e45d27a5d067e4e0e256417cb7003c55df7e43f31c807a8" gracePeriod=2
Mar 20 16:38:47 crc kubenswrapper[4730]: I0320 16:38:47.193288    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sfrtb"
Mar 20 16:38:47 crc kubenswrapper[4730]: I0320 16:38:47.245222    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjswk\" (UniqueName: \"kubernetes.io/projected/68e77c65-6357-459f-9bf2-fe45499cd296-kube-api-access-cjswk\") pod \"68e77c65-6357-459f-9bf2-fe45499cd296\" (UID: \"68e77c65-6357-459f-9bf2-fe45499cd296\") "
Mar 20 16:38:47 crc kubenswrapper[4730]: I0320 16:38:47.245366    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68e77c65-6357-459f-9bf2-fe45499cd296-utilities\") pod \"68e77c65-6357-459f-9bf2-fe45499cd296\" (UID: \"68e77c65-6357-459f-9bf2-fe45499cd296\") "
Mar 20 16:38:47 crc kubenswrapper[4730]: I0320 16:38:47.245424    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68e77c65-6357-459f-9bf2-fe45499cd296-catalog-content\") pod \"68e77c65-6357-459f-9bf2-fe45499cd296\" (UID: \"68e77c65-6357-459f-9bf2-fe45499cd296\") "
Mar 20 16:38:47 crc kubenswrapper[4730]: I0320 16:38:47.247744    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68e77c65-6357-459f-9bf2-fe45499cd296-utilities" (OuterVolumeSpecName: "utilities") pod "68e77c65-6357-459f-9bf2-fe45499cd296" (UID: "68e77c65-6357-459f-9bf2-fe45499cd296"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:38:47 crc kubenswrapper[4730]: I0320 16:38:47.268450    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68e77c65-6357-459f-9bf2-fe45499cd296-kube-api-access-cjswk" (OuterVolumeSpecName: "kube-api-access-cjswk") pod "68e77c65-6357-459f-9bf2-fe45499cd296" (UID: "68e77c65-6357-459f-9bf2-fe45499cd296"). InnerVolumeSpecName "kube-api-access-cjswk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:38:47 crc kubenswrapper[4730]: I0320 16:38:47.346873    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjswk\" (UniqueName: \"kubernetes.io/projected/68e77c65-6357-459f-9bf2-fe45499cd296-kube-api-access-cjswk\") on node \"crc\" DevicePath \"\""
Mar 20 16:38:47 crc kubenswrapper[4730]: I0320 16:38:47.346935    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68e77c65-6357-459f-9bf2-fe45499cd296-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:38:47 crc kubenswrapper[4730]: I0320 16:38:47.644628    4730 generic.go:334] "Generic (PLEG): container finished" podID="68e77c65-6357-459f-9bf2-fe45499cd296" containerID="99b4b38b434690826e45d27a5d067e4e0e256417cb7003c55df7e43f31c807a8" exitCode=0
Mar 20 16:38:47 crc kubenswrapper[4730]: I0320 16:38:47.644708    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfrtb" event={"ID":"68e77c65-6357-459f-9bf2-fe45499cd296","Type":"ContainerDied","Data":"99b4b38b434690826e45d27a5d067e4e0e256417cb7003c55df7e43f31c807a8"}
Mar 20 16:38:47 crc kubenswrapper[4730]: I0320 16:38:47.644757    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfrtb" event={"ID":"68e77c65-6357-459f-9bf2-fe45499cd296","Type":"ContainerDied","Data":"8326f9a3070f09a5c1a65a65c0238d53148283f463d30f8f66946ba3d27b4572"}
Mar 20 16:38:47 crc kubenswrapper[4730]: I0320 16:38:47.644787    4730 scope.go:117] "RemoveContainer" containerID="99b4b38b434690826e45d27a5d067e4e0e256417cb7003c55df7e43f31c807a8"
Mar 20 16:38:47 crc kubenswrapper[4730]: I0320 16:38:47.645118    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sfrtb"
Mar 20 16:38:47 crc kubenswrapper[4730]: I0320 16:38:47.668895    4730 scope.go:117] "RemoveContainer" containerID="d569825b7078b64e8055d1b820560032b6386e4536c814bef30f2c5e7dab2c36"
Mar 20 16:38:47 crc kubenswrapper[4730]: I0320 16:38:47.695044    4730 scope.go:117] "RemoveContainer" containerID="2c0196867229753a5d5133a1c1a6d4ae5892cbd2d676b231f509ca4edf696469"
Mar 20 16:38:47 crc kubenswrapper[4730]: I0320 16:38:47.762177    4730 scope.go:117] "RemoveContainer" containerID="99b4b38b434690826e45d27a5d067e4e0e256417cb7003c55df7e43f31c807a8"
Mar 20 16:38:47 crc kubenswrapper[4730]: E0320 16:38:47.762605    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99b4b38b434690826e45d27a5d067e4e0e256417cb7003c55df7e43f31c807a8\": container with ID starting with 99b4b38b434690826e45d27a5d067e4e0e256417cb7003c55df7e43f31c807a8 not found: ID does not exist" containerID="99b4b38b434690826e45d27a5d067e4e0e256417cb7003c55df7e43f31c807a8"
Mar 20 16:38:47 crc kubenswrapper[4730]: I0320 16:38:47.762636    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99b4b38b434690826e45d27a5d067e4e0e256417cb7003c55df7e43f31c807a8"} err="failed to get container status \"99b4b38b434690826e45d27a5d067e4e0e256417cb7003c55df7e43f31c807a8\": rpc error: code = NotFound desc = could not find container \"99b4b38b434690826e45d27a5d067e4e0e256417cb7003c55df7e43f31c807a8\": container with ID starting with 99b4b38b434690826e45d27a5d067e4e0e256417cb7003c55df7e43f31c807a8 not found: ID does not exist"
Mar 20 16:38:47 crc kubenswrapper[4730]: I0320 16:38:47.762658    4730 scope.go:117] "RemoveContainer" containerID="d569825b7078b64e8055d1b820560032b6386e4536c814bef30f2c5e7dab2c36"
Mar 20 16:38:47 crc kubenswrapper[4730]: E0320 16:38:47.763067    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d569825b7078b64e8055d1b820560032b6386e4536c814bef30f2c5e7dab2c36\": container with ID starting with d569825b7078b64e8055d1b820560032b6386e4536c814bef30f2c5e7dab2c36 not found: ID does not exist" containerID="d569825b7078b64e8055d1b820560032b6386e4536c814bef30f2c5e7dab2c36"
Mar 20 16:38:47 crc kubenswrapper[4730]: I0320 16:38:47.763090    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d569825b7078b64e8055d1b820560032b6386e4536c814bef30f2c5e7dab2c36"} err="failed to get container status \"d569825b7078b64e8055d1b820560032b6386e4536c814bef30f2c5e7dab2c36\": rpc error: code = NotFound desc = could not find container \"d569825b7078b64e8055d1b820560032b6386e4536c814bef30f2c5e7dab2c36\": container with ID starting with d569825b7078b64e8055d1b820560032b6386e4536c814bef30f2c5e7dab2c36 not found: ID does not exist"
Mar 20 16:38:47 crc kubenswrapper[4730]: I0320 16:38:47.763107    4730 scope.go:117] "RemoveContainer" containerID="2c0196867229753a5d5133a1c1a6d4ae5892cbd2d676b231f509ca4edf696469"
Mar 20 16:38:47 crc kubenswrapper[4730]: E0320 16:38:47.763498    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c0196867229753a5d5133a1c1a6d4ae5892cbd2d676b231f509ca4edf696469\": container with ID starting with 2c0196867229753a5d5133a1c1a6d4ae5892cbd2d676b231f509ca4edf696469 not found: ID does not exist" containerID="2c0196867229753a5d5133a1c1a6d4ae5892cbd2d676b231f509ca4edf696469"
Mar 20 16:38:47 crc kubenswrapper[4730]: I0320 16:38:47.763544    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c0196867229753a5d5133a1c1a6d4ae5892cbd2d676b231f509ca4edf696469"} err="failed to get container status \"2c0196867229753a5d5133a1c1a6d4ae5892cbd2d676b231f509ca4edf696469\": rpc error: code = NotFound desc = could not find container \"2c0196867229753a5d5133a1c1a6d4ae5892cbd2d676b231f509ca4edf696469\": container with ID starting with 2c0196867229753a5d5133a1c1a6d4ae5892cbd2d676b231f509ca4edf696469 not found: ID does not exist"
Mar 20 16:38:47 crc kubenswrapper[4730]: I0320 16:38:47.825965    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68e77c65-6357-459f-9bf2-fe45499cd296-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "68e77c65-6357-459f-9bf2-fe45499cd296" (UID: "68e77c65-6357-459f-9bf2-fe45499cd296"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:38:47 crc kubenswrapper[4730]: I0320 16:38:47.853225    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68e77c65-6357-459f-9bf2-fe45499cd296-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 16:38:47 crc kubenswrapper[4730]: I0320 16:38:47.987740    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sfrtb"]
Mar 20 16:38:48 crc kubenswrapper[4730]: I0320 16:38:48.001033    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sfrtb"]
Mar 20 16:38:49 crc kubenswrapper[4730]: I0320 16:38:49.550544    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68e77c65-6357-459f-9bf2-fe45499cd296" path="/var/lib/kubelet/pods/68e77c65-6357-459f-9bf2-fe45499cd296/volumes"
Mar 20 16:38:50 crc kubenswrapper[4730]: I0320 16:38:50.785180    4730 scope.go:117] "RemoveContainer" containerID="e43508074aa0c7c7c61cb53a8852f8061943211007b9394c89ac6a8a6c904123"
Mar 20 16:40:00 crc kubenswrapper[4730]: I0320 16:40:00.161513    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567080-445j8"]
Mar 20 16:40:00 crc kubenswrapper[4730]: E0320 16:40:00.162441    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68e77c65-6357-459f-9bf2-fe45499cd296" containerName="extract-content"
Mar 20 16:40:00 crc kubenswrapper[4730]: I0320 16:40:00.162455    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="68e77c65-6357-459f-9bf2-fe45499cd296" containerName="extract-content"
Mar 20 16:40:00 crc kubenswrapper[4730]: E0320 16:40:00.162475    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68e77c65-6357-459f-9bf2-fe45499cd296" containerName="registry-server"
Mar 20 16:40:00 crc kubenswrapper[4730]: I0320 16:40:00.162481    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="68e77c65-6357-459f-9bf2-fe45499cd296" containerName="registry-server"
Mar 20 16:40:00 crc kubenswrapper[4730]: E0320 16:40:00.162529    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68e77c65-6357-459f-9bf2-fe45499cd296" containerName="extract-utilities"
Mar 20 16:40:00 crc kubenswrapper[4730]: I0320 16:40:00.162536    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="68e77c65-6357-459f-9bf2-fe45499cd296" containerName="extract-utilities"
Mar 20 16:40:00 crc kubenswrapper[4730]: I0320 16:40:00.162719    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="68e77c65-6357-459f-9bf2-fe45499cd296" containerName="registry-server"
Mar 20 16:40:00 crc kubenswrapper[4730]: I0320 16:40:00.163402    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567080-445j8"
Mar 20 16:40:00 crc kubenswrapper[4730]: I0320 16:40:00.166566    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:40:00 crc kubenswrapper[4730]: I0320 16:40:00.166604    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:40:00 crc kubenswrapper[4730]: I0320 16:40:00.166958    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 16:40:00 crc kubenswrapper[4730]: I0320 16:40:00.171402    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567080-445j8"]
Mar 20 16:40:00 crc kubenswrapper[4730]: I0320 16:40:00.280387    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgq9g\" (UniqueName: \"kubernetes.io/projected/9b8e3816-7082-4505-847c-880b40d33930-kube-api-access-vgq9g\") pod \"auto-csr-approver-29567080-445j8\" (UID: \"9b8e3816-7082-4505-847c-880b40d33930\") " pod="openshift-infra/auto-csr-approver-29567080-445j8"
Mar 20 16:40:00 crc kubenswrapper[4730]: I0320 16:40:00.382088    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgq9g\" (UniqueName: \"kubernetes.io/projected/9b8e3816-7082-4505-847c-880b40d33930-kube-api-access-vgq9g\") pod \"auto-csr-approver-29567080-445j8\" (UID: \"9b8e3816-7082-4505-847c-880b40d33930\") " pod="openshift-infra/auto-csr-approver-29567080-445j8"
Mar 20 16:40:00 crc kubenswrapper[4730]: I0320 16:40:00.406836    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgq9g\" (UniqueName: \"kubernetes.io/projected/9b8e3816-7082-4505-847c-880b40d33930-kube-api-access-vgq9g\") pod \"auto-csr-approver-29567080-445j8\" (UID: \"9b8e3816-7082-4505-847c-880b40d33930\") " pod="openshift-infra/auto-csr-approver-29567080-445j8"
Mar 20 16:40:00 crc kubenswrapper[4730]: I0320 16:40:00.484624    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567080-445j8"
Mar 20 16:40:00 crc kubenswrapper[4730]: I0320 16:40:00.977451    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567080-445j8"]
Mar 20 16:40:00 crc kubenswrapper[4730]: I0320 16:40:00.977979    4730 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 16:40:01 crc kubenswrapper[4730]: I0320 16:40:01.311610    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567080-445j8" event={"ID":"9b8e3816-7082-4505-847c-880b40d33930","Type":"ContainerStarted","Data":"3936d873f1da84ce2f7ff062b900f2b0100ea3cfeea943d681e1395f94bb332b"}
Mar 20 16:40:02 crc kubenswrapper[4730]: I0320 16:40:02.321358    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567080-445j8" event={"ID":"9b8e3816-7082-4505-847c-880b40d33930","Type":"ContainerStarted","Data":"86dd9cb2df6336d37948551b33d5a151e10e12f60435fec8a924c6900e110929"}
Mar 20 16:40:02 crc kubenswrapper[4730]: I0320 16:40:02.338294    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567080-445j8" podStartSLOduration=1.3194325980000001 podStartE2EDuration="2.338273392s" podCreationTimestamp="2026-03-20 16:40:00 +0000 UTC" firstStartedPulling="2026-03-20 16:40:00.977771747 +0000 UTC m=+3660.191143116" lastFinishedPulling="2026-03-20 16:40:01.996612541 +0000 UTC m=+3661.209983910" observedRunningTime="2026-03-20 16:40:02.335289198 +0000 UTC m=+3661.548660587" watchObservedRunningTime="2026-03-20 16:40:02.338273392 +0000 UTC m=+3661.551644761"
Mar 20 16:40:03 crc kubenswrapper[4730]: I0320 16:40:03.330746    4730 generic.go:334] "Generic (PLEG): container finished" podID="9b8e3816-7082-4505-847c-880b40d33930" containerID="86dd9cb2df6336d37948551b33d5a151e10e12f60435fec8a924c6900e110929" exitCode=0
Mar 20 16:40:03 crc kubenswrapper[4730]: I0320 16:40:03.330848    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567080-445j8" event={"ID":"9b8e3816-7082-4505-847c-880b40d33930","Type":"ContainerDied","Data":"86dd9cb2df6336d37948551b33d5a151e10e12f60435fec8a924c6900e110929"}
Mar 20 16:40:04 crc kubenswrapper[4730]: I0320 16:40:04.718401    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567080-445j8"
Mar 20 16:40:04 crc kubenswrapper[4730]: I0320 16:40:04.790060    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgq9g\" (UniqueName: \"kubernetes.io/projected/9b8e3816-7082-4505-847c-880b40d33930-kube-api-access-vgq9g\") pod \"9b8e3816-7082-4505-847c-880b40d33930\" (UID: \"9b8e3816-7082-4505-847c-880b40d33930\") "
Mar 20 16:40:04 crc kubenswrapper[4730]: I0320 16:40:04.798766    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b8e3816-7082-4505-847c-880b40d33930-kube-api-access-vgq9g" (OuterVolumeSpecName: "kube-api-access-vgq9g") pod "9b8e3816-7082-4505-847c-880b40d33930" (UID: "9b8e3816-7082-4505-847c-880b40d33930"). InnerVolumeSpecName "kube-api-access-vgq9g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:40:04 crc kubenswrapper[4730]: I0320 16:40:04.894033    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgq9g\" (UniqueName: \"kubernetes.io/projected/9b8e3816-7082-4505-847c-880b40d33930-kube-api-access-vgq9g\") on node \"crc\" DevicePath \"\""
Mar 20 16:40:05 crc kubenswrapper[4730]: I0320 16:40:05.350828    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567080-445j8" event={"ID":"9b8e3816-7082-4505-847c-880b40d33930","Type":"ContainerDied","Data":"3936d873f1da84ce2f7ff062b900f2b0100ea3cfeea943d681e1395f94bb332b"}
Mar 20 16:40:05 crc kubenswrapper[4730]: I0320 16:40:05.350863    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3936d873f1da84ce2f7ff062b900f2b0100ea3cfeea943d681e1395f94bb332b"
Mar 20 16:40:05 crc kubenswrapper[4730]: I0320 16:40:05.350887    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567080-445j8"
Mar 20 16:40:05 crc kubenswrapper[4730]: I0320 16:40:05.788200    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567074-2xrgq"]
Mar 20 16:40:05 crc kubenswrapper[4730]: I0320 16:40:05.801652    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567074-2xrgq"]
Mar 20 16:40:07 crc kubenswrapper[4730]: I0320 16:40:07.542732    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a649324-b73a-44e0-94e5-2b8c54476367" path="/var/lib/kubelet/pods/2a649324-b73a-44e0-94e5-2b8c54476367/volumes"
Mar 20 16:40:12 crc kubenswrapper[4730]: I0320 16:40:12.879786    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:40:12 crc kubenswrapper[4730]: I0320 16:40:12.880457    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:40:42 crc kubenswrapper[4730]: I0320 16:40:42.880376    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:40:42 crc kubenswrapper[4730]: I0320 16:40:42.881943    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:40:50 crc kubenswrapper[4730]: I0320 16:40:50.919497    4730 scope.go:117] "RemoveContainer" containerID="2bb1a712fbfcbaa124ee788c9be392cdb5ddacf514828a35ba09574bc19839a4"
Mar 20 16:41:12 crc kubenswrapper[4730]: I0320 16:41:12.880244    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:41:12 crc kubenswrapper[4730]: I0320 16:41:12.880744    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:41:12 crc kubenswrapper[4730]: I0320 16:41:12.880797    4730 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf"
Mar 20 16:41:12 crc kubenswrapper[4730]: I0320 16:41:12.881720    4730 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"941dc6e58516657b43df3ac7120bf6da060d2f3b3a6c41da62694a0c2e80f6c6"} pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 16:41:12 crc kubenswrapper[4730]: I0320 16:41:12.881793    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" containerID="cri-o://941dc6e58516657b43df3ac7120bf6da060d2f3b3a6c41da62694a0c2e80f6c6" gracePeriod=600
Mar 20 16:41:14 crc kubenswrapper[4730]: I0320 16:41:14.024275    4730 generic.go:334] "Generic (PLEG): container finished" podID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerID="941dc6e58516657b43df3ac7120bf6da060d2f3b3a6c41da62694a0c2e80f6c6" exitCode=0
Mar 20 16:41:14 crc kubenswrapper[4730]: I0320 16:41:14.024328    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerDied","Data":"941dc6e58516657b43df3ac7120bf6da060d2f3b3a6c41da62694a0c2e80f6c6"}
Mar 20 16:41:14 crc kubenswrapper[4730]: I0320 16:41:14.024930    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerStarted","Data":"ee13c0427129be645bbda82e57ffd84564f844ed3cec3a2748df09a4a24fe610"}
Mar 20 16:41:14 crc kubenswrapper[4730]: I0320 16:41:14.024959    4730 scope.go:117] "RemoveContainer" containerID="0b0c904fd86d0d3236cc303102f0b32669f34531125a255be5df201b0dd8ef34"
Mar 20 16:42:00 crc kubenswrapper[4730]: I0320 16:42:00.149995    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567082-597m4"]
Mar 20 16:42:00 crc kubenswrapper[4730]: E0320 16:42:00.151068    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b8e3816-7082-4505-847c-880b40d33930" containerName="oc"
Mar 20 16:42:00 crc kubenswrapper[4730]: I0320 16:42:00.151085    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b8e3816-7082-4505-847c-880b40d33930" containerName="oc"
Mar 20 16:42:00 crc kubenswrapper[4730]: I0320 16:42:00.151377    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b8e3816-7082-4505-847c-880b40d33930" containerName="oc"
Mar 20 16:42:00 crc kubenswrapper[4730]: I0320 16:42:00.152324    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567082-597m4"
Mar 20 16:42:00 crc kubenswrapper[4730]: I0320 16:42:00.156098    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:42:00 crc kubenswrapper[4730]: I0320 16:42:00.156104    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:42:00 crc kubenswrapper[4730]: I0320 16:42:00.156914    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 16:42:00 crc kubenswrapper[4730]: I0320 16:42:00.160234    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567082-597m4"]
Mar 20 16:42:00 crc kubenswrapper[4730]: I0320 16:42:00.225924    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjsc8\" (UniqueName: \"kubernetes.io/projected/713df8c0-cae2-4cfd-9ecf-66856a78c066-kube-api-access-mjsc8\") pod \"auto-csr-approver-29567082-597m4\" (UID: \"713df8c0-cae2-4cfd-9ecf-66856a78c066\") " pod="openshift-infra/auto-csr-approver-29567082-597m4"
Mar 20 16:42:00 crc kubenswrapper[4730]: I0320 16:42:00.327657    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjsc8\" (UniqueName: \"kubernetes.io/projected/713df8c0-cae2-4cfd-9ecf-66856a78c066-kube-api-access-mjsc8\") pod \"auto-csr-approver-29567082-597m4\" (UID: \"713df8c0-cae2-4cfd-9ecf-66856a78c066\") " pod="openshift-infra/auto-csr-approver-29567082-597m4"
Mar 20 16:42:00 crc kubenswrapper[4730]: I0320 16:42:00.352808    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjsc8\" (UniqueName: \"kubernetes.io/projected/713df8c0-cae2-4cfd-9ecf-66856a78c066-kube-api-access-mjsc8\") pod \"auto-csr-approver-29567082-597m4\" (UID: \"713df8c0-cae2-4cfd-9ecf-66856a78c066\") " pod="openshift-infra/auto-csr-approver-29567082-597m4"
Mar 20 16:42:00 crc kubenswrapper[4730]: I0320 16:42:00.473898    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567082-597m4"
Mar 20 16:42:00 crc kubenswrapper[4730]: I0320 16:42:00.952823    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567082-597m4"]
Mar 20 16:42:01 crc kubenswrapper[4730]: I0320 16:42:01.464082    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567082-597m4" event={"ID":"713df8c0-cae2-4cfd-9ecf-66856a78c066","Type":"ContainerStarted","Data":"dbd6b9fa7b06d76ffe7fc201a8e7d674e8d261c5618fc8a84d4c08a89cda9426"}
Mar 20 16:42:02 crc kubenswrapper[4730]: I0320 16:42:02.477137    4730 generic.go:334] "Generic (PLEG): container finished" podID="713df8c0-cae2-4cfd-9ecf-66856a78c066" containerID="44f54f3fc7434586ebe0f8d3b305da77181f85e1a148a72f852acf4f69b33aae" exitCode=0
Mar 20 16:42:02 crc kubenswrapper[4730]: I0320 16:42:02.477261    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567082-597m4" event={"ID":"713df8c0-cae2-4cfd-9ecf-66856a78c066","Type":"ContainerDied","Data":"44f54f3fc7434586ebe0f8d3b305da77181f85e1a148a72f852acf4f69b33aae"}
Mar 20 16:42:03 crc kubenswrapper[4730]: I0320 16:42:03.827795    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567082-597m4"
Mar 20 16:42:03 crc kubenswrapper[4730]: I0320 16:42:03.934143    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjsc8\" (UniqueName: \"kubernetes.io/projected/713df8c0-cae2-4cfd-9ecf-66856a78c066-kube-api-access-mjsc8\") pod \"713df8c0-cae2-4cfd-9ecf-66856a78c066\" (UID: \"713df8c0-cae2-4cfd-9ecf-66856a78c066\") "
Mar 20 16:42:03 crc kubenswrapper[4730]: I0320 16:42:03.943442    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/713df8c0-cae2-4cfd-9ecf-66856a78c066-kube-api-access-mjsc8" (OuterVolumeSpecName: "kube-api-access-mjsc8") pod "713df8c0-cae2-4cfd-9ecf-66856a78c066" (UID: "713df8c0-cae2-4cfd-9ecf-66856a78c066"). InnerVolumeSpecName "kube-api-access-mjsc8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:42:04 crc kubenswrapper[4730]: I0320 16:42:04.037954    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjsc8\" (UniqueName: \"kubernetes.io/projected/713df8c0-cae2-4cfd-9ecf-66856a78c066-kube-api-access-mjsc8\") on node \"crc\" DevicePath \"\""
Mar 20 16:42:04 crc kubenswrapper[4730]: I0320 16:42:04.494685    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567082-597m4" event={"ID":"713df8c0-cae2-4cfd-9ecf-66856a78c066","Type":"ContainerDied","Data":"dbd6b9fa7b06d76ffe7fc201a8e7d674e8d261c5618fc8a84d4c08a89cda9426"}
Mar 20 16:42:04 crc kubenswrapper[4730]: I0320 16:42:04.495039    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbd6b9fa7b06d76ffe7fc201a8e7d674e8d261c5618fc8a84d4c08a89cda9426"
Mar 20 16:42:04 crc kubenswrapper[4730]: I0320 16:42:04.495139    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567082-597m4"
Mar 20 16:42:04 crc kubenswrapper[4730]: I0320 16:42:04.899388    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567076-5zjgq"]
Mar 20 16:42:04 crc kubenswrapper[4730]: I0320 16:42:04.909455    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567076-5zjgq"]
Mar 20 16:42:05 crc kubenswrapper[4730]: I0320 16:42:05.550252    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcb7d22d-4ad0-4a9c-bf00-e966d1abb051" path="/var/lib/kubelet/pods/bcb7d22d-4ad0-4a9c-bf00-e966d1abb051/volumes"
Mar 20 16:42:51 crc kubenswrapper[4730]: I0320 16:42:51.012095    4730 scope.go:117] "RemoveContainer" containerID="7abe93567f97d011f8ae053e88185c6004136b63e3d5f72b19beb707014bf434"
Mar 20 16:43:42 crc kubenswrapper[4730]: I0320 16:43:42.880002    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:43:42 crc kubenswrapper[4730]: I0320 16:43:42.880655    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:44:00 crc kubenswrapper[4730]: I0320 16:44:00.164491    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567084-c4mpd"]
Mar 20 16:44:00 crc kubenswrapper[4730]: E0320 16:44:00.165799    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="713df8c0-cae2-4cfd-9ecf-66856a78c066" containerName="oc"
Mar 20 16:44:00 crc kubenswrapper[4730]: I0320 16:44:00.165820    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="713df8c0-cae2-4cfd-9ecf-66856a78c066" containerName="oc"
Mar 20 16:44:00 crc kubenswrapper[4730]: I0320 16:44:00.166168    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="713df8c0-cae2-4cfd-9ecf-66856a78c066" containerName="oc"
Mar 20 16:44:00 crc kubenswrapper[4730]: I0320 16:44:00.167543    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567084-c4mpd"
Mar 20 16:44:00 crc kubenswrapper[4730]: I0320 16:44:00.172203    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:44:00 crc kubenswrapper[4730]: I0320 16:44:00.175682    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:44:00 crc kubenswrapper[4730]: I0320 16:44:00.176072    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 16:44:00 crc kubenswrapper[4730]: I0320 16:44:00.180854    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567084-c4mpd"]
Mar 20 16:44:00 crc kubenswrapper[4730]: I0320 16:44:00.303360    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sr4x\" (UniqueName: \"kubernetes.io/projected/51901671-4d27-46c5-9a9d-baf51b2b9c01-kube-api-access-6sr4x\") pod \"auto-csr-approver-29567084-c4mpd\" (UID: \"51901671-4d27-46c5-9a9d-baf51b2b9c01\") " pod="openshift-infra/auto-csr-approver-29567084-c4mpd"
Mar 20 16:44:00 crc kubenswrapper[4730]: I0320 16:44:00.405576    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sr4x\" (UniqueName: \"kubernetes.io/projected/51901671-4d27-46c5-9a9d-baf51b2b9c01-kube-api-access-6sr4x\") pod \"auto-csr-approver-29567084-c4mpd\" (UID: \"51901671-4d27-46c5-9a9d-baf51b2b9c01\") " pod="openshift-infra/auto-csr-approver-29567084-c4mpd"
Mar 20 16:44:00 crc kubenswrapper[4730]: I0320 16:44:00.901304    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sr4x\" (UniqueName: \"kubernetes.io/projected/51901671-4d27-46c5-9a9d-baf51b2b9c01-kube-api-access-6sr4x\") pod \"auto-csr-approver-29567084-c4mpd\" (UID: \"51901671-4d27-46c5-9a9d-baf51b2b9c01\") " pod="openshift-infra/auto-csr-approver-29567084-c4mpd"
Mar 20 16:44:01 crc kubenswrapper[4730]: I0320 16:44:01.093481    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567084-c4mpd"
Mar 20 16:44:01 crc kubenswrapper[4730]: I0320 16:44:01.658512    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567084-c4mpd"]
Mar 20 16:44:01 crc kubenswrapper[4730]: I0320 16:44:01.729903    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567084-c4mpd" event={"ID":"51901671-4d27-46c5-9a9d-baf51b2b9c01","Type":"ContainerStarted","Data":"84df18edf9b53088b22d7c63f76d7a13c8c7254f5c750e4137ce33ff56783214"}
Mar 20 16:44:03 crc kubenswrapper[4730]: I0320 16:44:03.750911    4730 generic.go:334] "Generic (PLEG): container finished" podID="51901671-4d27-46c5-9a9d-baf51b2b9c01" containerID="cbdd90e3d11772056ef45ec365a19533f01de2f8c0583c5498c86a843612b56d" exitCode=0
Mar 20 16:44:03 crc kubenswrapper[4730]: I0320 16:44:03.750967    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567084-c4mpd" event={"ID":"51901671-4d27-46c5-9a9d-baf51b2b9c01","Type":"ContainerDied","Data":"cbdd90e3d11772056ef45ec365a19533f01de2f8c0583c5498c86a843612b56d"}
Mar 20 16:44:05 crc kubenswrapper[4730]: I0320 16:44:05.154887    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567084-c4mpd"
Mar 20 16:44:05 crc kubenswrapper[4730]: I0320 16:44:05.308896    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sr4x\" (UniqueName: \"kubernetes.io/projected/51901671-4d27-46c5-9a9d-baf51b2b9c01-kube-api-access-6sr4x\") pod \"51901671-4d27-46c5-9a9d-baf51b2b9c01\" (UID: \"51901671-4d27-46c5-9a9d-baf51b2b9c01\") "
Mar 20 16:44:05 crc kubenswrapper[4730]: I0320 16:44:05.324596    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51901671-4d27-46c5-9a9d-baf51b2b9c01-kube-api-access-6sr4x" (OuterVolumeSpecName: "kube-api-access-6sr4x") pod "51901671-4d27-46c5-9a9d-baf51b2b9c01" (UID: "51901671-4d27-46c5-9a9d-baf51b2b9c01"). InnerVolumeSpecName "kube-api-access-6sr4x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:44:05 crc kubenswrapper[4730]: I0320 16:44:05.412078    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sr4x\" (UniqueName: \"kubernetes.io/projected/51901671-4d27-46c5-9a9d-baf51b2b9c01-kube-api-access-6sr4x\") on node \"crc\" DevicePath \"\""
Mar 20 16:44:05 crc kubenswrapper[4730]: I0320 16:44:05.779937    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567084-c4mpd" event={"ID":"51901671-4d27-46c5-9a9d-baf51b2b9c01","Type":"ContainerDied","Data":"84df18edf9b53088b22d7c63f76d7a13c8c7254f5c750e4137ce33ff56783214"}
Mar 20 16:44:05 crc kubenswrapper[4730]: I0320 16:44:05.780274    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84df18edf9b53088b22d7c63f76d7a13c8c7254f5c750e4137ce33ff56783214"
Mar 20 16:44:05 crc kubenswrapper[4730]: I0320 16:44:05.780019    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567084-c4mpd"
Mar 20 16:44:06 crc kubenswrapper[4730]: I0320 16:44:06.239313    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567078-bsnqc"]
Mar 20 16:44:06 crc kubenswrapper[4730]: I0320 16:44:06.252901    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567078-bsnqc"]
Mar 20 16:44:07 crc kubenswrapper[4730]: I0320 16:44:07.548571    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7178ffb9-4891-485e-b3d6-d7fcc8f22ef4" path="/var/lib/kubelet/pods/7178ffb9-4891-485e-b3d6-d7fcc8f22ef4/volumes"
Mar 20 16:44:12 crc kubenswrapper[4730]: I0320 16:44:12.879815    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:44:12 crc kubenswrapper[4730]: I0320 16:44:12.880275    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:44:42 crc kubenswrapper[4730]: I0320 16:44:42.879740    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:44:42 crc kubenswrapper[4730]: I0320 16:44:42.880275    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:44:42 crc kubenswrapper[4730]: I0320 16:44:42.880326    4730 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf"
Mar 20 16:44:42 crc kubenswrapper[4730]: I0320 16:44:42.881278    4730 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ee13c0427129be645bbda82e57ffd84564f844ed3cec3a2748df09a4a24fe610"} pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 16:44:42 crc kubenswrapper[4730]: I0320 16:44:42.881345    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" containerID="cri-o://ee13c0427129be645bbda82e57ffd84564f844ed3cec3a2748df09a4a24fe610" gracePeriod=600
Mar 20 16:44:43 crc kubenswrapper[4730]: E0320 16:44:43.002846    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:44:43 crc kubenswrapper[4730]: I0320 16:44:43.178232    4730 generic.go:334] "Generic (PLEG): container finished" podID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerID="ee13c0427129be645bbda82e57ffd84564f844ed3cec3a2748df09a4a24fe610" exitCode=0
Mar 20 16:44:43 crc kubenswrapper[4730]: I0320 16:44:43.178293    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerDied","Data":"ee13c0427129be645bbda82e57ffd84564f844ed3cec3a2748df09a4a24fe610"}
Mar 20 16:44:43 crc kubenswrapper[4730]: I0320 16:44:43.178772    4730 scope.go:117] "RemoveContainer" containerID="941dc6e58516657b43df3ac7120bf6da060d2f3b3a6c41da62694a0c2e80f6c6"
Mar 20 16:44:43 crc kubenswrapper[4730]: I0320 16:44:43.179993    4730 scope.go:117] "RemoveContainer" containerID="ee13c0427129be645bbda82e57ffd84564f844ed3cec3a2748df09a4a24fe610"
Mar 20 16:44:43 crc kubenswrapper[4730]: E0320 16:44:43.180643    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:44:51 crc kubenswrapper[4730]: I0320 16:44:51.115117    4730 scope.go:117] "RemoveContainer" containerID="3b2c3b7b49826d995d5d767d6f87b9a288d68f48a6bee9638eee715a068be2d7"
Mar 20 16:44:55 crc kubenswrapper[4730]: I0320 16:44:55.534095    4730 scope.go:117] "RemoveContainer" containerID="ee13c0427129be645bbda82e57ffd84564f844ed3cec3a2748df09a4a24fe610"
Mar 20 16:44:55 crc kubenswrapper[4730]: E0320 16:44:55.535592    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:45:00 crc kubenswrapper[4730]: I0320 16:45:00.166880    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567085-cglw4"]
Mar 20 16:45:00 crc kubenswrapper[4730]: E0320 16:45:00.168199    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51901671-4d27-46c5-9a9d-baf51b2b9c01" containerName="oc"
Mar 20 16:45:00 crc kubenswrapper[4730]: I0320 16:45:00.168222    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="51901671-4d27-46c5-9a9d-baf51b2b9c01" containerName="oc"
Mar 20 16:45:00 crc kubenswrapper[4730]: I0320 16:45:00.168598    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="51901671-4d27-46c5-9a9d-baf51b2b9c01" containerName="oc"
Mar 20 16:45:00 crc kubenswrapper[4730]: I0320 16:45:00.169718    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567085-cglw4"
Mar 20 16:45:00 crc kubenswrapper[4730]: I0320 16:45:00.171897    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 20 16:45:00 crc kubenswrapper[4730]: I0320 16:45:00.182687    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 20 16:45:00 crc kubenswrapper[4730]: I0320 16:45:00.202645    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567085-cglw4"]
Mar 20 16:45:00 crc kubenswrapper[4730]: I0320 16:45:00.287778    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c630452-a358-4849-b036-b8cdeb19775f-secret-volume\") pod \"collect-profiles-29567085-cglw4\" (UID: \"1c630452-a358-4849-b036-b8cdeb19775f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567085-cglw4"
Mar 20 16:45:00 crc kubenswrapper[4730]: I0320 16:45:00.287869    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rzhq\" (UniqueName: \"kubernetes.io/projected/1c630452-a358-4849-b036-b8cdeb19775f-kube-api-access-8rzhq\") pod \"collect-profiles-29567085-cglw4\" (UID: \"1c630452-a358-4849-b036-b8cdeb19775f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567085-cglw4"
Mar 20 16:45:00 crc kubenswrapper[4730]: I0320 16:45:00.288384    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c630452-a358-4849-b036-b8cdeb19775f-config-volume\") pod \"collect-profiles-29567085-cglw4\" (UID: \"1c630452-a358-4849-b036-b8cdeb19775f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567085-cglw4"
Mar 20 16:45:00 crc kubenswrapper[4730]: I0320 16:45:00.392156    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c630452-a358-4849-b036-b8cdeb19775f-secret-volume\") pod \"collect-profiles-29567085-cglw4\" (UID: \"1c630452-a358-4849-b036-b8cdeb19775f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567085-cglw4"
Mar 20 16:45:00 crc kubenswrapper[4730]: I0320 16:45:00.392278    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rzhq\" (UniqueName: \"kubernetes.io/projected/1c630452-a358-4849-b036-b8cdeb19775f-kube-api-access-8rzhq\") pod \"collect-profiles-29567085-cglw4\" (UID: \"1c630452-a358-4849-b036-b8cdeb19775f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567085-cglw4"
Mar 20 16:45:00 crc kubenswrapper[4730]: I0320 16:45:00.392419    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c630452-a358-4849-b036-b8cdeb19775f-config-volume\") pod \"collect-profiles-29567085-cglw4\" (UID: \"1c630452-a358-4849-b036-b8cdeb19775f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567085-cglw4"
Mar 20 16:45:00 crc kubenswrapper[4730]: I0320 16:45:00.393698    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c630452-a358-4849-b036-b8cdeb19775f-config-volume\") pod \"collect-profiles-29567085-cglw4\" (UID: \"1c630452-a358-4849-b036-b8cdeb19775f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567085-cglw4"
Mar 20 16:45:00 crc kubenswrapper[4730]: I0320 16:45:00.410695    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c630452-a358-4849-b036-b8cdeb19775f-secret-volume\") pod \"collect-profiles-29567085-cglw4\" (UID: \"1c630452-a358-4849-b036-b8cdeb19775f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567085-cglw4"
Mar 20 16:45:00 crc kubenswrapper[4730]: I0320 16:45:00.412500    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rzhq\" (UniqueName: \"kubernetes.io/projected/1c630452-a358-4849-b036-b8cdeb19775f-kube-api-access-8rzhq\") pod \"collect-profiles-29567085-cglw4\" (UID: \"1c630452-a358-4849-b036-b8cdeb19775f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567085-cglw4"
Mar 20 16:45:00 crc kubenswrapper[4730]: I0320 16:45:00.508181    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567085-cglw4"
Mar 20 16:45:01 crc kubenswrapper[4730]: I0320 16:45:01.015649    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567085-cglw4"]
Mar 20 16:45:01 crc kubenswrapper[4730]: I0320 16:45:01.364122    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567085-cglw4" event={"ID":"1c630452-a358-4849-b036-b8cdeb19775f","Type":"ContainerStarted","Data":"2f6293be6ccbad3a3176b9b41c9c1955e0305b0804646ead1e4d478ea3563234"}
Mar 20 16:45:01 crc kubenswrapper[4730]: I0320 16:45:01.364179    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567085-cglw4" event={"ID":"1c630452-a358-4849-b036-b8cdeb19775f","Type":"ContainerStarted","Data":"74ea47e100eaace17e602b1d49e86e12d719d06c6e1e869935b40a8d6e8d1499"}
Mar 20 16:45:02 crc kubenswrapper[4730]: I0320 16:45:02.375770    4730 generic.go:334] "Generic (PLEG): container finished" podID="1c630452-a358-4849-b036-b8cdeb19775f" containerID="2f6293be6ccbad3a3176b9b41c9c1955e0305b0804646ead1e4d478ea3563234" exitCode=0
Mar 20 16:45:02 crc kubenswrapper[4730]: I0320 16:45:02.375911    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567085-cglw4" event={"ID":"1c630452-a358-4849-b036-b8cdeb19775f","Type":"ContainerDied","Data":"2f6293be6ccbad3a3176b9b41c9c1955e0305b0804646ead1e4d478ea3563234"}
Mar 20 16:45:02 crc kubenswrapper[4730]: I0320 16:45:02.809521    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567085-cglw4"
Mar 20 16:45:02 crc kubenswrapper[4730]: I0320 16:45:02.949285    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c630452-a358-4849-b036-b8cdeb19775f-config-volume\") pod \"1c630452-a358-4849-b036-b8cdeb19775f\" (UID: \"1c630452-a358-4849-b036-b8cdeb19775f\") "
Mar 20 16:45:02 crc kubenswrapper[4730]: I0320 16:45:02.949350    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c630452-a358-4849-b036-b8cdeb19775f-secret-volume\") pod \"1c630452-a358-4849-b036-b8cdeb19775f\" (UID: \"1c630452-a358-4849-b036-b8cdeb19775f\") "
Mar 20 16:45:02 crc kubenswrapper[4730]: I0320 16:45:02.949429    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rzhq\" (UniqueName: \"kubernetes.io/projected/1c630452-a358-4849-b036-b8cdeb19775f-kube-api-access-8rzhq\") pod \"1c630452-a358-4849-b036-b8cdeb19775f\" (UID: \"1c630452-a358-4849-b036-b8cdeb19775f\") "
Mar 20 16:45:02 crc kubenswrapper[4730]: I0320 16:45:02.950096    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c630452-a358-4849-b036-b8cdeb19775f-config-volume" (OuterVolumeSpecName: "config-volume") pod "1c630452-a358-4849-b036-b8cdeb19775f" (UID: "1c630452-a358-4849-b036-b8cdeb19775f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 16:45:02 crc kubenswrapper[4730]: I0320 16:45:02.955332    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c630452-a358-4849-b036-b8cdeb19775f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1c630452-a358-4849-b036-b8cdeb19775f" (UID: "1c630452-a358-4849-b036-b8cdeb19775f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 16:45:02 crc kubenswrapper[4730]: I0320 16:45:02.955555    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c630452-a358-4849-b036-b8cdeb19775f-kube-api-access-8rzhq" (OuterVolumeSpecName: "kube-api-access-8rzhq") pod "1c630452-a358-4849-b036-b8cdeb19775f" (UID: "1c630452-a358-4849-b036-b8cdeb19775f"). InnerVolumeSpecName "kube-api-access-8rzhq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:45:03 crc kubenswrapper[4730]: I0320 16:45:03.052630    4730 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c630452-a358-4849-b036-b8cdeb19775f-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 20 16:45:03 crc kubenswrapper[4730]: I0320 16:45:03.052678    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rzhq\" (UniqueName: \"kubernetes.io/projected/1c630452-a358-4849-b036-b8cdeb19775f-kube-api-access-8rzhq\") on node \"crc\" DevicePath \"\""
Mar 20 16:45:03 crc kubenswrapper[4730]: I0320 16:45:03.052691    4730 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c630452-a358-4849-b036-b8cdeb19775f-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 16:45:03 crc kubenswrapper[4730]: I0320 16:45:03.389850    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567085-cglw4" event={"ID":"1c630452-a358-4849-b036-b8cdeb19775f","Type":"ContainerDied","Data":"74ea47e100eaace17e602b1d49e86e12d719d06c6e1e869935b40a8d6e8d1499"}
Mar 20 16:45:03 crc kubenswrapper[4730]: I0320 16:45:03.389893    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74ea47e100eaace17e602b1d49e86e12d719d06c6e1e869935b40a8d6e8d1499"
Mar 20 16:45:03 crc kubenswrapper[4730]: I0320 16:45:03.389898    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567085-cglw4"
Mar 20 16:45:03 crc kubenswrapper[4730]: I0320 16:45:03.912628    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567040-cz69b"]
Mar 20 16:45:03 crc kubenswrapper[4730]: I0320 16:45:03.925207    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567040-cz69b"]
Mar 20 16:45:05 crc kubenswrapper[4730]: I0320 16:45:05.547062    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="672cfda1-2ec8-41fe-b3dc-eabe4e60726d" path="/var/lib/kubelet/pods/672cfda1-2ec8-41fe-b3dc-eabe4e60726d/volumes"
Mar 20 16:45:07 crc kubenswrapper[4730]: I0320 16:45:07.533018    4730 scope.go:117] "RemoveContainer" containerID="ee13c0427129be645bbda82e57ffd84564f844ed3cec3a2748df09a4a24fe610"
Mar 20 16:45:07 crc kubenswrapper[4730]: E0320 16:45:07.533787    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:45:19 crc kubenswrapper[4730]: I0320 16:45:19.533627    4730 scope.go:117] "RemoveContainer" containerID="ee13c0427129be645bbda82e57ffd84564f844ed3cec3a2748df09a4a24fe610"
Mar 20 16:45:19 crc kubenswrapper[4730]: E0320 16:45:19.535122    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:45:32 crc kubenswrapper[4730]: I0320 16:45:32.532738    4730 scope.go:117] "RemoveContainer" containerID="ee13c0427129be645bbda82e57ffd84564f844ed3cec3a2748df09a4a24fe610"
Mar 20 16:45:32 crc kubenswrapper[4730]: E0320 16:45:32.533549    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:45:45 crc kubenswrapper[4730]: I0320 16:45:45.533522    4730 scope.go:117] "RemoveContainer" containerID="ee13c0427129be645bbda82e57ffd84564f844ed3cec3a2748df09a4a24fe610"
Mar 20 16:45:45 crc kubenswrapper[4730]: E0320 16:45:45.535331    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:45:51 crc kubenswrapper[4730]: I0320 16:45:51.208907    4730 scope.go:117] "RemoveContainer" containerID="aa12014b37ee0e01204777f8c797059805894b107ea52ba01e8a5d24299b55a5"
Mar 20 16:45:57 crc kubenswrapper[4730]: I0320 16:45:57.537391    4730 scope.go:117] "RemoveContainer" containerID="ee13c0427129be645bbda82e57ffd84564f844ed3cec3a2748df09a4a24fe610"
Mar 20 16:45:57 crc kubenswrapper[4730]: E0320 16:45:57.538239    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:45:58 crc kubenswrapper[4730]: I0320 16:45:58.907667    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-64spl"]
Mar 20 16:45:58 crc kubenswrapper[4730]: E0320 16:45:58.908377    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c630452-a358-4849-b036-b8cdeb19775f" containerName="collect-profiles"
Mar 20 16:45:58 crc kubenswrapper[4730]: I0320 16:45:58.908389    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c630452-a358-4849-b036-b8cdeb19775f" containerName="collect-profiles"
Mar 20 16:45:58 crc kubenswrapper[4730]: I0320 16:45:58.908611    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c630452-a358-4849-b036-b8cdeb19775f" containerName="collect-profiles"
Mar 20 16:45:58 crc kubenswrapper[4730]: I0320 16:45:58.910008    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64spl"
Mar 20 16:45:58 crc kubenswrapper[4730]: I0320 16:45:58.924717    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-64spl"]
Mar 20 16:45:58 crc kubenswrapper[4730]: I0320 16:45:58.997537    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c95ee43-d4bb-471d-977a-4cb14f99a03e-utilities\") pod \"redhat-marketplace-64spl\" (UID: \"3c95ee43-d4bb-471d-977a-4cb14f99a03e\") " pod="openshift-marketplace/redhat-marketplace-64spl"
Mar 20 16:45:58 crc kubenswrapper[4730]: I0320 16:45:58.997728    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s9p2\" (UniqueName: \"kubernetes.io/projected/3c95ee43-d4bb-471d-977a-4cb14f99a03e-kube-api-access-9s9p2\") pod \"redhat-marketplace-64spl\" (UID: \"3c95ee43-d4bb-471d-977a-4cb14f99a03e\") " pod="openshift-marketplace/redhat-marketplace-64spl"
Mar 20 16:45:58 crc kubenswrapper[4730]: I0320 16:45:58.997930    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c95ee43-d4bb-471d-977a-4cb14f99a03e-catalog-content\") pod \"redhat-marketplace-64spl\" (UID: \"3c95ee43-d4bb-471d-977a-4cb14f99a03e\") " pod="openshift-marketplace/redhat-marketplace-64spl"
Mar 20 16:45:59 crc kubenswrapper[4730]: I0320 16:45:59.100527    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c95ee43-d4bb-471d-977a-4cb14f99a03e-catalog-content\") pod \"redhat-marketplace-64spl\" (UID: \"3c95ee43-d4bb-471d-977a-4cb14f99a03e\") " pod="openshift-marketplace/redhat-marketplace-64spl"
Mar 20 16:45:59 crc kubenswrapper[4730]: I0320 16:45:59.100789    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c95ee43-d4bb-471d-977a-4cb14f99a03e-utilities\") pod \"redhat-marketplace-64spl\" (UID: \"3c95ee43-d4bb-471d-977a-4cb14f99a03e\") " pod="openshift-marketplace/redhat-marketplace-64spl"
Mar 20 16:45:59 crc kubenswrapper[4730]: I0320 16:45:59.100851    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s9p2\" (UniqueName: \"kubernetes.io/projected/3c95ee43-d4bb-471d-977a-4cb14f99a03e-kube-api-access-9s9p2\") pod \"redhat-marketplace-64spl\" (UID: \"3c95ee43-d4bb-471d-977a-4cb14f99a03e\") " pod="openshift-marketplace/redhat-marketplace-64spl"
Mar 20 16:45:59 crc kubenswrapper[4730]: I0320 16:45:59.101586    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c95ee43-d4bb-471d-977a-4cb14f99a03e-utilities\") pod \"redhat-marketplace-64spl\" (UID: \"3c95ee43-d4bb-471d-977a-4cb14f99a03e\") " pod="openshift-marketplace/redhat-marketplace-64spl"
Mar 20 16:45:59 crc kubenswrapper[4730]: I0320 16:45:59.102413    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c95ee43-d4bb-471d-977a-4cb14f99a03e-catalog-content\") pod \"redhat-marketplace-64spl\" (UID: \"3c95ee43-d4bb-471d-977a-4cb14f99a03e\") " pod="openshift-marketplace/redhat-marketplace-64spl"
Mar 20 16:45:59 crc kubenswrapper[4730]: I0320 16:45:59.125568    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s9p2\" (UniqueName: \"kubernetes.io/projected/3c95ee43-d4bb-471d-977a-4cb14f99a03e-kube-api-access-9s9p2\") pod \"redhat-marketplace-64spl\" (UID: \"3c95ee43-d4bb-471d-977a-4cb14f99a03e\") " pod="openshift-marketplace/redhat-marketplace-64spl"
Mar 20 16:45:59 crc kubenswrapper[4730]: I0320 16:45:59.237104    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64spl"
Mar 20 16:45:59 crc kubenswrapper[4730]: I0320 16:45:59.712208    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-64spl"]
Mar 20 16:45:59 crc kubenswrapper[4730]: W0320 16:45:59.715216    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c95ee43_d4bb_471d_977a_4cb14f99a03e.slice/crio-959b6f691ded28673442a436f57cb1d08ee3d5669b036068fbb5bb7dcb1750ab WatchSource:0}: Error finding container 959b6f691ded28673442a436f57cb1d08ee3d5669b036068fbb5bb7dcb1750ab: Status 404 returned error can't find the container with id 959b6f691ded28673442a436f57cb1d08ee3d5669b036068fbb5bb7dcb1750ab
Mar 20 16:46:00 crc kubenswrapper[4730]: I0320 16:46:00.148204    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567086-bgtzj"]
Mar 20 16:46:00 crc kubenswrapper[4730]: I0320 16:46:00.150443    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567086-bgtzj"
Mar 20 16:46:00 crc kubenswrapper[4730]: I0320 16:46:00.156072    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:46:00 crc kubenswrapper[4730]: I0320 16:46:00.156103    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 16:46:00 crc kubenswrapper[4730]: I0320 16:46:00.156145    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:46:00 crc kubenswrapper[4730]: I0320 16:46:00.159219    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567086-bgtzj"]
Mar 20 16:46:00 crc kubenswrapper[4730]: I0320 16:46:00.176905    4730 generic.go:334] "Generic (PLEG): container finished" podID="3c95ee43-d4bb-471d-977a-4cb14f99a03e" containerID="bb084137d5918afdeb931c3364180027f66610bb94a9b283e99cdd6522173e0b" exitCode=0
Mar 20 16:46:00 crc kubenswrapper[4730]: I0320 16:46:00.176974    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64spl" event={"ID":"3c95ee43-d4bb-471d-977a-4cb14f99a03e","Type":"ContainerDied","Data":"bb084137d5918afdeb931c3364180027f66610bb94a9b283e99cdd6522173e0b"}
Mar 20 16:46:00 crc kubenswrapper[4730]: I0320 16:46:00.177070    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64spl" event={"ID":"3c95ee43-d4bb-471d-977a-4cb14f99a03e","Type":"ContainerStarted","Data":"959b6f691ded28673442a436f57cb1d08ee3d5669b036068fbb5bb7dcb1750ab"}
Mar 20 16:46:00 crc kubenswrapper[4730]: I0320 16:46:00.178716    4730 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 16:46:00 crc kubenswrapper[4730]: I0320 16:46:00.223753    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pw49\" (UniqueName: \"kubernetes.io/projected/1598084b-3967-4d5d-8911-87b4bbf10965-kube-api-access-8pw49\") pod \"auto-csr-approver-29567086-bgtzj\" (UID: \"1598084b-3967-4d5d-8911-87b4bbf10965\") " pod="openshift-infra/auto-csr-approver-29567086-bgtzj"
Mar 20 16:46:00 crc kubenswrapper[4730]: I0320 16:46:00.325735    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pw49\" (UniqueName: \"kubernetes.io/projected/1598084b-3967-4d5d-8911-87b4bbf10965-kube-api-access-8pw49\") pod \"auto-csr-approver-29567086-bgtzj\" (UID: \"1598084b-3967-4d5d-8911-87b4bbf10965\") " pod="openshift-infra/auto-csr-approver-29567086-bgtzj"
Mar 20 16:46:00 crc kubenswrapper[4730]: I0320 16:46:00.346422    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pw49\" (UniqueName: \"kubernetes.io/projected/1598084b-3967-4d5d-8911-87b4bbf10965-kube-api-access-8pw49\") pod \"auto-csr-approver-29567086-bgtzj\" (UID: \"1598084b-3967-4d5d-8911-87b4bbf10965\") " pod="openshift-infra/auto-csr-approver-29567086-bgtzj"
Mar 20 16:46:00 crc kubenswrapper[4730]: I0320 16:46:00.472683    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567086-bgtzj"
Mar 20 16:46:00 crc kubenswrapper[4730]: I0320 16:46:00.932497    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567086-bgtzj"]
Mar 20 16:46:00 crc kubenswrapper[4730]: W0320 16:46:00.955781    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1598084b_3967_4d5d_8911_87b4bbf10965.slice/crio-6109ce38a6660db0e0a833021a4b531135042d1a4f46fc4d4a324f3c95359edd WatchSource:0}: Error finding container 6109ce38a6660db0e0a833021a4b531135042d1a4f46fc4d4a324f3c95359edd: Status 404 returned error can't find the container with id 6109ce38a6660db0e0a833021a4b531135042d1a4f46fc4d4a324f3c95359edd
Mar 20 16:46:01 crc kubenswrapper[4730]: I0320 16:46:01.189598    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64spl" event={"ID":"3c95ee43-d4bb-471d-977a-4cb14f99a03e","Type":"ContainerStarted","Data":"c380a914b21ad400e0c8ebd69b48a7f3bfef4e58619cb65f94f2903eccb23840"}
Mar 20 16:46:01 crc kubenswrapper[4730]: I0320 16:46:01.192916    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567086-bgtzj" event={"ID":"1598084b-3967-4d5d-8911-87b4bbf10965","Type":"ContainerStarted","Data":"6109ce38a6660db0e0a833021a4b531135042d1a4f46fc4d4a324f3c95359edd"}
Mar 20 16:46:03 crc kubenswrapper[4730]: I0320 16:46:03.211684    4730 generic.go:334] "Generic (PLEG): container finished" podID="3c95ee43-d4bb-471d-977a-4cb14f99a03e" containerID="c380a914b21ad400e0c8ebd69b48a7f3bfef4e58619cb65f94f2903eccb23840" exitCode=0
Mar 20 16:46:03 crc kubenswrapper[4730]: I0320 16:46:03.211727    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64spl" event={"ID":"3c95ee43-d4bb-471d-977a-4cb14f99a03e","Type":"ContainerDied","Data":"c380a914b21ad400e0c8ebd69b48a7f3bfef4e58619cb65f94f2903eccb23840"}
Mar 20 16:46:03 crc kubenswrapper[4730]: I0320 16:46:03.214109    4730 generic.go:334] "Generic (PLEG): container finished" podID="1598084b-3967-4d5d-8911-87b4bbf10965" containerID="090324d1acddd1e29456802f46699a7cfabedaef8f848dbdf774851d4687bf7f" exitCode=0
Mar 20 16:46:03 crc kubenswrapper[4730]: I0320 16:46:03.214132    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567086-bgtzj" event={"ID":"1598084b-3967-4d5d-8911-87b4bbf10965","Type":"ContainerDied","Data":"090324d1acddd1e29456802f46699a7cfabedaef8f848dbdf774851d4687bf7f"}
Mar 20 16:46:04 crc kubenswrapper[4730]: I0320 16:46:04.228619    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64spl" event={"ID":"3c95ee43-d4bb-471d-977a-4cb14f99a03e","Type":"ContainerStarted","Data":"08a3a7a763d29be2ad327ffc0a70e98c7747477d348e5faafbeb00384d3d3088"}
Mar 20 16:46:04 crc kubenswrapper[4730]: I0320 16:46:04.260734    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-64spl" podStartSLOduration=2.859321607 podStartE2EDuration="6.260716992s" podCreationTimestamp="2026-03-20 16:45:58 +0000 UTC" firstStartedPulling="2026-03-20 16:46:00.178512269 +0000 UTC m=+4019.391883638" lastFinishedPulling="2026-03-20 16:46:03.579907654 +0000 UTC m=+4022.793279023" observedRunningTime="2026-03-20 16:46:04.251390157 +0000 UTC m=+4023.464761516" watchObservedRunningTime="2026-03-20 16:46:04.260716992 +0000 UTC m=+4023.474088361"
Mar 20 16:46:04 crc kubenswrapper[4730]: I0320 16:46:04.578013    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567086-bgtzj"
Mar 20 16:46:04 crc kubenswrapper[4730]: I0320 16:46:04.616267    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pw49\" (UniqueName: \"kubernetes.io/projected/1598084b-3967-4d5d-8911-87b4bbf10965-kube-api-access-8pw49\") pod \"1598084b-3967-4d5d-8911-87b4bbf10965\" (UID: \"1598084b-3967-4d5d-8911-87b4bbf10965\") "
Mar 20 16:46:04 crc kubenswrapper[4730]: I0320 16:46:04.622796    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1598084b-3967-4d5d-8911-87b4bbf10965-kube-api-access-8pw49" (OuterVolumeSpecName: "kube-api-access-8pw49") pod "1598084b-3967-4d5d-8911-87b4bbf10965" (UID: "1598084b-3967-4d5d-8911-87b4bbf10965"). InnerVolumeSpecName "kube-api-access-8pw49". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:46:04 crc kubenswrapper[4730]: I0320 16:46:04.718963    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pw49\" (UniqueName: \"kubernetes.io/projected/1598084b-3967-4d5d-8911-87b4bbf10965-kube-api-access-8pw49\") on node \"crc\" DevicePath \"\""
Mar 20 16:46:05 crc kubenswrapper[4730]: I0320 16:46:05.243201    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567086-bgtzj" event={"ID":"1598084b-3967-4d5d-8911-87b4bbf10965","Type":"ContainerDied","Data":"6109ce38a6660db0e0a833021a4b531135042d1a4f46fc4d4a324f3c95359edd"}
Mar 20 16:46:05 crc kubenswrapper[4730]: I0320 16:46:05.243272    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6109ce38a6660db0e0a833021a4b531135042d1a4f46fc4d4a324f3c95359edd"
Mar 20 16:46:05 crc kubenswrapper[4730]: I0320 16:46:05.243338    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567086-bgtzj"
Mar 20 16:46:05 crc kubenswrapper[4730]: I0320 16:46:05.671829    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567080-445j8"]
Mar 20 16:46:05 crc kubenswrapper[4730]: I0320 16:46:05.692293    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567080-445j8"]
Mar 20 16:46:07 crc kubenswrapper[4730]: I0320 16:46:07.397409    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qgqdz"]
Mar 20 16:46:07 crc kubenswrapper[4730]: E0320 16:46:07.398466    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1598084b-3967-4d5d-8911-87b4bbf10965" containerName="oc"
Mar 20 16:46:07 crc kubenswrapper[4730]: I0320 16:46:07.398490    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="1598084b-3967-4d5d-8911-87b4bbf10965" containerName="oc"
Mar 20 16:46:07 crc kubenswrapper[4730]: I0320 16:46:07.398891    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="1598084b-3967-4d5d-8911-87b4bbf10965" containerName="oc"
Mar 20 16:46:07 crc kubenswrapper[4730]: I0320 16:46:07.401498    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qgqdz"
Mar 20 16:46:07 crc kubenswrapper[4730]: I0320 16:46:07.406187    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qgqdz"]
Mar 20 16:46:07 crc kubenswrapper[4730]: I0320 16:46:07.472736    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfng9\" (UniqueName: \"kubernetes.io/projected/64b3980f-0b4c-4751-9152-a70ce47eca6a-kube-api-access-dfng9\") pod \"community-operators-qgqdz\" (UID: \"64b3980f-0b4c-4751-9152-a70ce47eca6a\") " pod="openshift-marketplace/community-operators-qgqdz"
Mar 20 16:46:07 crc kubenswrapper[4730]: I0320 16:46:07.472844    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64b3980f-0b4c-4751-9152-a70ce47eca6a-catalog-content\") pod \"community-operators-qgqdz\" (UID: \"64b3980f-0b4c-4751-9152-a70ce47eca6a\") " pod="openshift-marketplace/community-operators-qgqdz"
Mar 20 16:46:07 crc kubenswrapper[4730]: I0320 16:46:07.472909    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64b3980f-0b4c-4751-9152-a70ce47eca6a-utilities\") pod \"community-operators-qgqdz\" (UID: \"64b3980f-0b4c-4751-9152-a70ce47eca6a\") " pod="openshift-marketplace/community-operators-qgqdz"
Mar 20 16:46:07 crc kubenswrapper[4730]: I0320 16:46:07.546571    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b8e3816-7082-4505-847c-880b40d33930" path="/var/lib/kubelet/pods/9b8e3816-7082-4505-847c-880b40d33930/volumes"
Mar 20 16:46:07 crc kubenswrapper[4730]: I0320 16:46:07.574786    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfng9\" (UniqueName: \"kubernetes.io/projected/64b3980f-0b4c-4751-9152-a70ce47eca6a-kube-api-access-dfng9\") pod \"community-operators-qgqdz\" (UID: \"64b3980f-0b4c-4751-9152-a70ce47eca6a\") " pod="openshift-marketplace/community-operators-qgqdz"
Mar 20 16:46:07 crc kubenswrapper[4730]: I0320 16:46:07.575209    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64b3980f-0b4c-4751-9152-a70ce47eca6a-catalog-content\") pod \"community-operators-qgqdz\" (UID: \"64b3980f-0b4c-4751-9152-a70ce47eca6a\") " pod="openshift-marketplace/community-operators-qgqdz"
Mar 20 16:46:07 crc kubenswrapper[4730]: I0320 16:46:07.575446    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64b3980f-0b4c-4751-9152-a70ce47eca6a-utilities\") pod \"community-operators-qgqdz\" (UID: \"64b3980f-0b4c-4751-9152-a70ce47eca6a\") " pod="openshift-marketplace/community-operators-qgqdz"
Mar 20 16:46:07 crc kubenswrapper[4730]: I0320 16:46:07.576566    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64b3980f-0b4c-4751-9152-a70ce47eca6a-catalog-content\") pod \"community-operators-qgqdz\" (UID: \"64b3980f-0b4c-4751-9152-a70ce47eca6a\") " pod="openshift-marketplace/community-operators-qgqdz"
Mar 20 16:46:07 crc kubenswrapper[4730]: I0320 16:46:07.576610    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64b3980f-0b4c-4751-9152-a70ce47eca6a-utilities\") pod \"community-operators-qgqdz\" (UID: \"64b3980f-0b4c-4751-9152-a70ce47eca6a\") " pod="openshift-marketplace/community-operators-qgqdz"
Mar 20 16:46:07 crc kubenswrapper[4730]: I0320 16:46:07.602215    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfng9\" (UniqueName: \"kubernetes.io/projected/64b3980f-0b4c-4751-9152-a70ce47eca6a-kube-api-access-dfng9\") pod \"community-operators-qgqdz\" (UID: \"64b3980f-0b4c-4751-9152-a70ce47eca6a\") " pod="openshift-marketplace/community-operators-qgqdz"
Mar 20 16:46:07 crc kubenswrapper[4730]: I0320 16:46:07.744448    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qgqdz"
Mar 20 16:46:08 crc kubenswrapper[4730]: W0320 16:46:08.242328    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64b3980f_0b4c_4751_9152_a70ce47eca6a.slice/crio-007c95ca65feb4b29413d4a25914d3478366cc965bcb7c6e91b5caf3907ff6e4 WatchSource:0}: Error finding container 007c95ca65feb4b29413d4a25914d3478366cc965bcb7c6e91b5caf3907ff6e4: Status 404 returned error can't find the container with id 007c95ca65feb4b29413d4a25914d3478366cc965bcb7c6e91b5caf3907ff6e4
Mar 20 16:46:08 crc kubenswrapper[4730]: I0320 16:46:08.263001    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qgqdz"]
Mar 20 16:46:08 crc kubenswrapper[4730]: I0320 16:46:08.274640    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qgqdz" event={"ID":"64b3980f-0b4c-4751-9152-a70ce47eca6a","Type":"ContainerStarted","Data":"007c95ca65feb4b29413d4a25914d3478366cc965bcb7c6e91b5caf3907ff6e4"}
Mar 20 16:46:08 crc kubenswrapper[4730]: I0320 16:46:08.533552    4730 scope.go:117] "RemoveContainer" containerID="ee13c0427129be645bbda82e57ffd84564f844ed3cec3a2748df09a4a24fe610"
Mar 20 16:46:08 crc kubenswrapper[4730]: E0320 16:46:08.534053    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:46:09 crc kubenswrapper[4730]: I0320 16:46:09.238733    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-64spl"
Mar 20 16:46:09 crc kubenswrapper[4730]: I0320 16:46:09.239071    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-64spl"
Mar 20 16:46:09 crc kubenswrapper[4730]: I0320 16:46:09.287293    4730 generic.go:334] "Generic (PLEG): container finished" podID="64b3980f-0b4c-4751-9152-a70ce47eca6a" containerID="086dd9b660f91623a636ce9b3aee0e5600fcc4ff44b302c02e3e1d6ab2c57e02" exitCode=0
Mar 20 16:46:09 crc kubenswrapper[4730]: I0320 16:46:09.287356    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qgqdz" event={"ID":"64b3980f-0b4c-4751-9152-a70ce47eca6a","Type":"ContainerDied","Data":"086dd9b660f91623a636ce9b3aee0e5600fcc4ff44b302c02e3e1d6ab2c57e02"}
Mar 20 16:46:09 crc kubenswrapper[4730]: I0320 16:46:09.300584    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-64spl"
Mar 20 16:46:09 crc kubenswrapper[4730]: I0320 16:46:09.353232    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-64spl"
Mar 20 16:46:11 crc kubenswrapper[4730]: I0320 16:46:11.313507    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qgqdz" event={"ID":"64b3980f-0b4c-4751-9152-a70ce47eca6a","Type":"ContainerStarted","Data":"ea1300f4a3d61efadbaf33ca42f5d74a7b4fece1736b7ee2f5ee762ab6d393a0"}
Mar 20 16:46:11 crc kubenswrapper[4730]: I0320 16:46:11.565136    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-64spl"]
Mar 20 16:46:11 crc kubenswrapper[4730]: I0320 16:46:11.565385    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-64spl" podUID="3c95ee43-d4bb-471d-977a-4cb14f99a03e" containerName="registry-server" containerID="cri-o://08a3a7a763d29be2ad327ffc0a70e98c7747477d348e5faafbeb00384d3d3088" gracePeriod=2
Mar 20 16:46:12 crc kubenswrapper[4730]: I0320 16:46:12.172878    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64spl"
Mar 20 16:46:12 crc kubenswrapper[4730]: I0320 16:46:12.325975    4730 generic.go:334] "Generic (PLEG): container finished" podID="3c95ee43-d4bb-471d-977a-4cb14f99a03e" containerID="08a3a7a763d29be2ad327ffc0a70e98c7747477d348e5faafbeb00384d3d3088" exitCode=0
Mar 20 16:46:12 crc kubenswrapper[4730]: I0320 16:46:12.326019    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64spl" event={"ID":"3c95ee43-d4bb-471d-977a-4cb14f99a03e","Type":"ContainerDied","Data":"08a3a7a763d29be2ad327ffc0a70e98c7747477d348e5faafbeb00384d3d3088"}
Mar 20 16:46:12 crc kubenswrapper[4730]: I0320 16:46:12.326065    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64spl" event={"ID":"3c95ee43-d4bb-471d-977a-4cb14f99a03e","Type":"ContainerDied","Data":"959b6f691ded28673442a436f57cb1d08ee3d5669b036068fbb5bb7dcb1750ab"}
Mar 20 16:46:12 crc kubenswrapper[4730]: I0320 16:46:12.326071    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64spl"
Mar 20 16:46:12 crc kubenswrapper[4730]: I0320 16:46:12.326090    4730 scope.go:117] "RemoveContainer" containerID="08a3a7a763d29be2ad327ffc0a70e98c7747477d348e5faafbeb00384d3d3088"
Mar 20 16:46:12 crc kubenswrapper[4730]: I0320 16:46:12.333160    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c95ee43-d4bb-471d-977a-4cb14f99a03e-utilities\") pod \"3c95ee43-d4bb-471d-977a-4cb14f99a03e\" (UID: \"3c95ee43-d4bb-471d-977a-4cb14f99a03e\") "
Mar 20 16:46:12 crc kubenswrapper[4730]: I0320 16:46:12.333264    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s9p2\" (UniqueName: \"kubernetes.io/projected/3c95ee43-d4bb-471d-977a-4cb14f99a03e-kube-api-access-9s9p2\") pod \"3c95ee43-d4bb-471d-977a-4cb14f99a03e\" (UID: \"3c95ee43-d4bb-471d-977a-4cb14f99a03e\") "
Mar 20 16:46:12 crc kubenswrapper[4730]: I0320 16:46:12.333400    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c95ee43-d4bb-471d-977a-4cb14f99a03e-catalog-content\") pod \"3c95ee43-d4bb-471d-977a-4cb14f99a03e\" (UID: \"3c95ee43-d4bb-471d-977a-4cb14f99a03e\") "
Mar 20 16:46:12 crc kubenswrapper[4730]: I0320 16:46:12.339759    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c95ee43-d4bb-471d-977a-4cb14f99a03e-utilities" (OuterVolumeSpecName: "utilities") pod "3c95ee43-d4bb-471d-977a-4cb14f99a03e" (UID: "3c95ee43-d4bb-471d-977a-4cb14f99a03e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:46:12 crc kubenswrapper[4730]: I0320 16:46:12.346726    4730 scope.go:117] "RemoveContainer" containerID="c380a914b21ad400e0c8ebd69b48a7f3bfef4e58619cb65f94f2903eccb23840"
Mar 20 16:46:12 crc kubenswrapper[4730]: I0320 16:46:12.347493    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c95ee43-d4bb-471d-977a-4cb14f99a03e-kube-api-access-9s9p2" (OuterVolumeSpecName: "kube-api-access-9s9p2") pod "3c95ee43-d4bb-471d-977a-4cb14f99a03e" (UID: "3c95ee43-d4bb-471d-977a-4cb14f99a03e"). InnerVolumeSpecName "kube-api-access-9s9p2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:46:12 crc kubenswrapper[4730]: I0320 16:46:12.389359    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c95ee43-d4bb-471d-977a-4cb14f99a03e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3c95ee43-d4bb-471d-977a-4cb14f99a03e" (UID: "3c95ee43-d4bb-471d-977a-4cb14f99a03e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:46:12 crc kubenswrapper[4730]: I0320 16:46:12.431096    4730 scope.go:117] "RemoveContainer" containerID="bb084137d5918afdeb931c3364180027f66610bb94a9b283e99cdd6522173e0b"
Mar 20 16:46:12 crc kubenswrapper[4730]: I0320 16:46:12.437010    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c95ee43-d4bb-471d-977a-4cb14f99a03e-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:46:12 crc kubenswrapper[4730]: I0320 16:46:12.437044    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s9p2\" (UniqueName: \"kubernetes.io/projected/3c95ee43-d4bb-471d-977a-4cb14f99a03e-kube-api-access-9s9p2\") on node \"crc\" DevicePath \"\""
Mar 20 16:46:12 crc kubenswrapper[4730]: I0320 16:46:12.437081    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c95ee43-d4bb-471d-977a-4cb14f99a03e-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 16:46:12 crc kubenswrapper[4730]: I0320 16:46:12.523847    4730 scope.go:117] "RemoveContainer" containerID="08a3a7a763d29be2ad327ffc0a70e98c7747477d348e5faafbeb00384d3d3088"
Mar 20 16:46:12 crc kubenswrapper[4730]: E0320 16:46:12.524715    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08a3a7a763d29be2ad327ffc0a70e98c7747477d348e5faafbeb00384d3d3088\": container with ID starting with 08a3a7a763d29be2ad327ffc0a70e98c7747477d348e5faafbeb00384d3d3088 not found: ID does not exist" containerID="08a3a7a763d29be2ad327ffc0a70e98c7747477d348e5faafbeb00384d3d3088"
Mar 20 16:46:12 crc kubenswrapper[4730]: I0320 16:46:12.524770    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08a3a7a763d29be2ad327ffc0a70e98c7747477d348e5faafbeb00384d3d3088"} err="failed to get container status \"08a3a7a763d29be2ad327ffc0a70e98c7747477d348e5faafbeb00384d3d3088\": rpc error: code = NotFound desc = could not find container \"08a3a7a763d29be2ad327ffc0a70e98c7747477d348e5faafbeb00384d3d3088\": container with ID starting with 08a3a7a763d29be2ad327ffc0a70e98c7747477d348e5faafbeb00384d3d3088 not found: ID does not exist"
Mar 20 16:46:12 crc kubenswrapper[4730]: I0320 16:46:12.524796    4730 scope.go:117] "RemoveContainer" containerID="c380a914b21ad400e0c8ebd69b48a7f3bfef4e58619cb65f94f2903eccb23840"
Mar 20 16:46:12 crc kubenswrapper[4730]: E0320 16:46:12.525306    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c380a914b21ad400e0c8ebd69b48a7f3bfef4e58619cb65f94f2903eccb23840\": container with ID starting with c380a914b21ad400e0c8ebd69b48a7f3bfef4e58619cb65f94f2903eccb23840 not found: ID does not exist" containerID="c380a914b21ad400e0c8ebd69b48a7f3bfef4e58619cb65f94f2903eccb23840"
Mar 20 16:46:12 crc kubenswrapper[4730]: I0320 16:46:12.525345    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c380a914b21ad400e0c8ebd69b48a7f3bfef4e58619cb65f94f2903eccb23840"} err="failed to get container status \"c380a914b21ad400e0c8ebd69b48a7f3bfef4e58619cb65f94f2903eccb23840\": rpc error: code = NotFound desc = could not find container \"c380a914b21ad400e0c8ebd69b48a7f3bfef4e58619cb65f94f2903eccb23840\": container with ID starting with c380a914b21ad400e0c8ebd69b48a7f3bfef4e58619cb65f94f2903eccb23840 not found: ID does not exist"
Mar 20 16:46:12 crc kubenswrapper[4730]: I0320 16:46:12.525361    4730 scope.go:117] "RemoveContainer" containerID="bb084137d5918afdeb931c3364180027f66610bb94a9b283e99cdd6522173e0b"
Mar 20 16:46:12 crc kubenswrapper[4730]: E0320 16:46:12.525874    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb084137d5918afdeb931c3364180027f66610bb94a9b283e99cdd6522173e0b\": container with ID starting with bb084137d5918afdeb931c3364180027f66610bb94a9b283e99cdd6522173e0b not found: ID does not exist" containerID="bb084137d5918afdeb931c3364180027f66610bb94a9b283e99cdd6522173e0b"
Mar 20 16:46:12 crc kubenswrapper[4730]: I0320 16:46:12.525921    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb084137d5918afdeb931c3364180027f66610bb94a9b283e99cdd6522173e0b"} err="failed to get container status \"bb084137d5918afdeb931c3364180027f66610bb94a9b283e99cdd6522173e0b\": rpc error: code = NotFound desc = could not find container \"bb084137d5918afdeb931c3364180027f66610bb94a9b283e99cdd6522173e0b\": container with ID starting with bb084137d5918afdeb931c3364180027f66610bb94a9b283e99cdd6522173e0b not found: ID does not exist"
Mar 20 16:46:12 crc kubenswrapper[4730]: I0320 16:46:12.665460    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-64spl"]
Mar 20 16:46:12 crc kubenswrapper[4730]: I0320 16:46:12.674761    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-64spl"]
Mar 20 16:46:13 crc kubenswrapper[4730]: I0320 16:46:13.338105    4730 generic.go:334] "Generic (PLEG): container finished" podID="64b3980f-0b4c-4751-9152-a70ce47eca6a" containerID="ea1300f4a3d61efadbaf33ca42f5d74a7b4fece1736b7ee2f5ee762ab6d393a0" exitCode=0
Mar 20 16:46:13 crc kubenswrapper[4730]: I0320 16:46:13.338176    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qgqdz" event={"ID":"64b3980f-0b4c-4751-9152-a70ce47eca6a","Type":"ContainerDied","Data":"ea1300f4a3d61efadbaf33ca42f5d74a7b4fece1736b7ee2f5ee762ab6d393a0"}
Mar 20 16:46:13 crc kubenswrapper[4730]: I0320 16:46:13.550560    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c95ee43-d4bb-471d-977a-4cb14f99a03e" path="/var/lib/kubelet/pods/3c95ee43-d4bb-471d-977a-4cb14f99a03e/volumes"
Mar 20 16:46:14 crc kubenswrapper[4730]: I0320 16:46:14.349135    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qgqdz" event={"ID":"64b3980f-0b4c-4751-9152-a70ce47eca6a","Type":"ContainerStarted","Data":"4e18994692c56d92d7f2b3507847c3f6598d1c195897adcc3749b1a091493eb8"}
Mar 20 16:46:14 crc kubenswrapper[4730]: I0320 16:46:14.374075    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qgqdz" podStartSLOduration=2.913672732 podStartE2EDuration="7.374056381s" podCreationTimestamp="2026-03-20 16:46:07 +0000 UTC" firstStartedPulling="2026-03-20 16:46:09.289686246 +0000 UTC m=+4028.503057615" lastFinishedPulling="2026-03-20 16:46:13.750069895 +0000 UTC m=+4032.963441264" observedRunningTime="2026-03-20 16:46:14.367356811 +0000 UTC m=+4033.580728180" watchObservedRunningTime="2026-03-20 16:46:14.374056381 +0000 UTC m=+4033.587427760"
Mar 20 16:46:17 crc kubenswrapper[4730]: I0320 16:46:17.745003    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qgqdz"
Mar 20 16:46:17 crc kubenswrapper[4730]: I0320 16:46:17.747046    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qgqdz"
Mar 20 16:46:17 crc kubenswrapper[4730]: I0320 16:46:17.794149    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qgqdz"
Mar 20 16:46:18 crc kubenswrapper[4730]: I0320 16:46:18.460359    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qgqdz"
Mar 20 16:46:18 crc kubenswrapper[4730]: I0320 16:46:18.963852    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qgqdz"]
Mar 20 16:46:20 crc kubenswrapper[4730]: I0320 16:46:20.405411    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qgqdz" podUID="64b3980f-0b4c-4751-9152-a70ce47eca6a" containerName="registry-server" containerID="cri-o://4e18994692c56d92d7f2b3507847c3f6598d1c195897adcc3749b1a091493eb8" gracePeriod=2
Mar 20 16:46:20 crc kubenswrapper[4730]: I0320 16:46:20.535344    4730 scope.go:117] "RemoveContainer" containerID="ee13c0427129be645bbda82e57ffd84564f844ed3cec3a2748df09a4a24fe610"
Mar 20 16:46:20 crc kubenswrapper[4730]: E0320 16:46:20.536011    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:46:21 crc kubenswrapper[4730]: I0320 16:46:21.007683    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qgqdz"
Mar 20 16:46:21 crc kubenswrapper[4730]: I0320 16:46:21.125634    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfng9\" (UniqueName: \"kubernetes.io/projected/64b3980f-0b4c-4751-9152-a70ce47eca6a-kube-api-access-dfng9\") pod \"64b3980f-0b4c-4751-9152-a70ce47eca6a\" (UID: \"64b3980f-0b4c-4751-9152-a70ce47eca6a\") "
Mar 20 16:46:21 crc kubenswrapper[4730]: I0320 16:46:21.126157    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64b3980f-0b4c-4751-9152-a70ce47eca6a-utilities\") pod \"64b3980f-0b4c-4751-9152-a70ce47eca6a\" (UID: \"64b3980f-0b4c-4751-9152-a70ce47eca6a\") "
Mar 20 16:46:21 crc kubenswrapper[4730]: I0320 16:46:21.126388    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64b3980f-0b4c-4751-9152-a70ce47eca6a-catalog-content\") pod \"64b3980f-0b4c-4751-9152-a70ce47eca6a\" (UID: \"64b3980f-0b4c-4751-9152-a70ce47eca6a\") "
Mar 20 16:46:21 crc kubenswrapper[4730]: I0320 16:46:21.126881    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64b3980f-0b4c-4751-9152-a70ce47eca6a-utilities" (OuterVolumeSpecName: "utilities") pod "64b3980f-0b4c-4751-9152-a70ce47eca6a" (UID: "64b3980f-0b4c-4751-9152-a70ce47eca6a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:46:21 crc kubenswrapper[4730]: I0320 16:46:21.127453    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64b3980f-0b4c-4751-9152-a70ce47eca6a-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:46:21 crc kubenswrapper[4730]: I0320 16:46:21.134595    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64b3980f-0b4c-4751-9152-a70ce47eca6a-kube-api-access-dfng9" (OuterVolumeSpecName: "kube-api-access-dfng9") pod "64b3980f-0b4c-4751-9152-a70ce47eca6a" (UID: "64b3980f-0b4c-4751-9152-a70ce47eca6a"). InnerVolumeSpecName "kube-api-access-dfng9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:46:21 crc kubenswrapper[4730]: I0320 16:46:21.206558    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64b3980f-0b4c-4751-9152-a70ce47eca6a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "64b3980f-0b4c-4751-9152-a70ce47eca6a" (UID: "64b3980f-0b4c-4751-9152-a70ce47eca6a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:46:21 crc kubenswrapper[4730]: I0320 16:46:21.229691    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfng9\" (UniqueName: \"kubernetes.io/projected/64b3980f-0b4c-4751-9152-a70ce47eca6a-kube-api-access-dfng9\") on node \"crc\" DevicePath \"\""
Mar 20 16:46:21 crc kubenswrapper[4730]: I0320 16:46:21.229740    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64b3980f-0b4c-4751-9152-a70ce47eca6a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 16:46:21 crc kubenswrapper[4730]: I0320 16:46:21.415374    4730 generic.go:334] "Generic (PLEG): container finished" podID="64b3980f-0b4c-4751-9152-a70ce47eca6a" containerID="4e18994692c56d92d7f2b3507847c3f6598d1c195897adcc3749b1a091493eb8" exitCode=0
Mar 20 16:46:21 crc kubenswrapper[4730]: I0320 16:46:21.415429    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qgqdz" event={"ID":"64b3980f-0b4c-4751-9152-a70ce47eca6a","Type":"ContainerDied","Data":"4e18994692c56d92d7f2b3507847c3f6598d1c195897adcc3749b1a091493eb8"}
Mar 20 16:46:21 crc kubenswrapper[4730]: I0320 16:46:21.415460    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qgqdz" event={"ID":"64b3980f-0b4c-4751-9152-a70ce47eca6a","Type":"ContainerDied","Data":"007c95ca65feb4b29413d4a25914d3478366cc965bcb7c6e91b5caf3907ff6e4"}
Mar 20 16:46:21 crc kubenswrapper[4730]: I0320 16:46:21.415473    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qgqdz"
Mar 20 16:46:21 crc kubenswrapper[4730]: I0320 16:46:21.415478    4730 scope.go:117] "RemoveContainer" containerID="4e18994692c56d92d7f2b3507847c3f6598d1c195897adcc3749b1a091493eb8"
Mar 20 16:46:21 crc kubenswrapper[4730]: I0320 16:46:21.440741    4730 scope.go:117] "RemoveContainer" containerID="ea1300f4a3d61efadbaf33ca42f5d74a7b4fece1736b7ee2f5ee762ab6d393a0"
Mar 20 16:46:21 crc kubenswrapper[4730]: I0320 16:46:21.468335    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qgqdz"]
Mar 20 16:46:21 crc kubenswrapper[4730]: I0320 16:46:21.476405    4730 scope.go:117] "RemoveContainer" containerID="086dd9b660f91623a636ce9b3aee0e5600fcc4ff44b302c02e3e1d6ab2c57e02"
Mar 20 16:46:21 crc kubenswrapper[4730]: I0320 16:46:21.478192    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qgqdz"]
Mar 20 16:46:21 crc kubenswrapper[4730]: I0320 16:46:21.568708    4730 scope.go:117] "RemoveContainer" containerID="4e18994692c56d92d7f2b3507847c3f6598d1c195897adcc3749b1a091493eb8"
Mar 20 16:46:21 crc kubenswrapper[4730]: E0320 16:46:21.569282    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e18994692c56d92d7f2b3507847c3f6598d1c195897adcc3749b1a091493eb8\": container with ID starting with 4e18994692c56d92d7f2b3507847c3f6598d1c195897adcc3749b1a091493eb8 not found: ID does not exist" containerID="4e18994692c56d92d7f2b3507847c3f6598d1c195897adcc3749b1a091493eb8"
Mar 20 16:46:21 crc kubenswrapper[4730]: I0320 16:46:21.569328    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e18994692c56d92d7f2b3507847c3f6598d1c195897adcc3749b1a091493eb8"} err="failed to get container status \"4e18994692c56d92d7f2b3507847c3f6598d1c195897adcc3749b1a091493eb8\": rpc error: code = NotFound desc = could not find container \"4e18994692c56d92d7f2b3507847c3f6598d1c195897adcc3749b1a091493eb8\": container with ID starting with 4e18994692c56d92d7f2b3507847c3f6598d1c195897adcc3749b1a091493eb8 not found: ID does not exist"
Mar 20 16:46:21 crc kubenswrapper[4730]: I0320 16:46:21.569358    4730 scope.go:117] "RemoveContainer" containerID="ea1300f4a3d61efadbaf33ca42f5d74a7b4fece1736b7ee2f5ee762ab6d393a0"
Mar 20 16:46:21 crc kubenswrapper[4730]: E0320 16:46:21.569648    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea1300f4a3d61efadbaf33ca42f5d74a7b4fece1736b7ee2f5ee762ab6d393a0\": container with ID starting with ea1300f4a3d61efadbaf33ca42f5d74a7b4fece1736b7ee2f5ee762ab6d393a0 not found: ID does not exist" containerID="ea1300f4a3d61efadbaf33ca42f5d74a7b4fece1736b7ee2f5ee762ab6d393a0"
Mar 20 16:46:21 crc kubenswrapper[4730]: I0320 16:46:21.569719    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea1300f4a3d61efadbaf33ca42f5d74a7b4fece1736b7ee2f5ee762ab6d393a0"} err="failed to get container status \"ea1300f4a3d61efadbaf33ca42f5d74a7b4fece1736b7ee2f5ee762ab6d393a0\": rpc error: code = NotFound desc = could not find container \"ea1300f4a3d61efadbaf33ca42f5d74a7b4fece1736b7ee2f5ee762ab6d393a0\": container with ID starting with ea1300f4a3d61efadbaf33ca42f5d74a7b4fece1736b7ee2f5ee762ab6d393a0 not found: ID does not exist"
Mar 20 16:46:21 crc kubenswrapper[4730]: I0320 16:46:21.569761    4730 scope.go:117] "RemoveContainer" containerID="086dd9b660f91623a636ce9b3aee0e5600fcc4ff44b302c02e3e1d6ab2c57e02"
Mar 20 16:46:21 crc kubenswrapper[4730]: E0320 16:46:21.570799    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"086dd9b660f91623a636ce9b3aee0e5600fcc4ff44b302c02e3e1d6ab2c57e02\": container with ID starting with 086dd9b660f91623a636ce9b3aee0e5600fcc4ff44b302c02e3e1d6ab2c57e02 not found: ID does not exist" containerID="086dd9b660f91623a636ce9b3aee0e5600fcc4ff44b302c02e3e1d6ab2c57e02"
Mar 20 16:46:21 crc kubenswrapper[4730]: I0320 16:46:21.570837    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"086dd9b660f91623a636ce9b3aee0e5600fcc4ff44b302c02e3e1d6ab2c57e02"} err="failed to get container status \"086dd9b660f91623a636ce9b3aee0e5600fcc4ff44b302c02e3e1d6ab2c57e02\": rpc error: code = NotFound desc = could not find container \"086dd9b660f91623a636ce9b3aee0e5600fcc4ff44b302c02e3e1d6ab2c57e02\": container with ID starting with 086dd9b660f91623a636ce9b3aee0e5600fcc4ff44b302c02e3e1d6ab2c57e02 not found: ID does not exist"
Mar 20 16:46:21 crc kubenswrapper[4730]: I0320 16:46:21.571613    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64b3980f-0b4c-4751-9152-a70ce47eca6a" path="/var/lib/kubelet/pods/64b3980f-0b4c-4751-9152-a70ce47eca6a/volumes"
Mar 20 16:46:31 crc kubenswrapper[4730]: I0320 16:46:31.546646    4730 scope.go:117] "RemoveContainer" containerID="ee13c0427129be645bbda82e57ffd84564f844ed3cec3a2748df09a4a24fe610"
Mar 20 16:46:31 crc kubenswrapper[4730]: E0320 16:46:31.547987    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:46:45 crc kubenswrapper[4730]: I0320 16:46:45.533522    4730 scope.go:117] "RemoveContainer" containerID="ee13c0427129be645bbda82e57ffd84564f844ed3cec3a2748df09a4a24fe610"
Mar 20 16:46:45 crc kubenswrapper[4730]: E0320 16:46:45.534354    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:46:51 crc kubenswrapper[4730]: I0320 16:46:51.289025    4730 scope.go:117] "RemoveContainer" containerID="86dd9cb2df6336d37948551b33d5a151e10e12f60435fec8a924c6900e110929"
Mar 20 16:46:59 crc kubenswrapper[4730]: I0320 16:46:59.533888    4730 scope.go:117] "RemoveContainer" containerID="ee13c0427129be645bbda82e57ffd84564f844ed3cec3a2748df09a4a24fe610"
Mar 20 16:46:59 crc kubenswrapper[4730]: E0320 16:46:59.535187    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:47:13 crc kubenswrapper[4730]: I0320 16:47:13.532881    4730 scope.go:117] "RemoveContainer" containerID="ee13c0427129be645bbda82e57ffd84564f844ed3cec3a2748df09a4a24fe610"
Mar 20 16:47:13 crc kubenswrapper[4730]: E0320 16:47:13.533957    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:47:26 crc kubenswrapper[4730]: I0320 16:47:26.534693    4730 scope.go:117] "RemoveContainer" containerID="ee13c0427129be645bbda82e57ffd84564f844ed3cec3a2748df09a4a24fe610"
Mar 20 16:47:26 crc kubenswrapper[4730]: E0320 16:47:26.535942    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:47:41 crc kubenswrapper[4730]: I0320 16:47:41.539490    4730 scope.go:117] "RemoveContainer" containerID="ee13c0427129be645bbda82e57ffd84564f844ed3cec3a2748df09a4a24fe610"
Mar 20 16:47:41 crc kubenswrapper[4730]: E0320 16:47:41.540448    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:47:55 crc kubenswrapper[4730]: I0320 16:47:55.534070    4730 scope.go:117] "RemoveContainer" containerID="ee13c0427129be645bbda82e57ffd84564f844ed3cec3a2748df09a4a24fe610"
Mar 20 16:47:55 crc kubenswrapper[4730]: E0320 16:47:55.535284    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:48:00 crc kubenswrapper[4730]: I0320 16:48:00.145497    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567088-jzf6l"]
Mar 20 16:48:00 crc kubenswrapper[4730]: E0320 16:48:00.146428    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64b3980f-0b4c-4751-9152-a70ce47eca6a" containerName="extract-utilities"
Mar 20 16:48:00 crc kubenswrapper[4730]: I0320 16:48:00.146444    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="64b3980f-0b4c-4751-9152-a70ce47eca6a" containerName="extract-utilities"
Mar 20 16:48:00 crc kubenswrapper[4730]: E0320 16:48:00.146464    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c95ee43-d4bb-471d-977a-4cb14f99a03e" containerName="registry-server"
Mar 20 16:48:00 crc kubenswrapper[4730]: I0320 16:48:00.146471    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c95ee43-d4bb-471d-977a-4cb14f99a03e" containerName="registry-server"
Mar 20 16:48:00 crc kubenswrapper[4730]: E0320 16:48:00.146481    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64b3980f-0b4c-4751-9152-a70ce47eca6a" containerName="extract-content"
Mar 20 16:48:00 crc kubenswrapper[4730]: I0320 16:48:00.146487    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="64b3980f-0b4c-4751-9152-a70ce47eca6a" containerName="extract-content"
Mar 20 16:48:00 crc kubenswrapper[4730]: E0320 16:48:00.146501    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c95ee43-d4bb-471d-977a-4cb14f99a03e" containerName="extract-utilities"
Mar 20 16:48:00 crc kubenswrapper[4730]: I0320 16:48:00.146506    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c95ee43-d4bb-471d-977a-4cb14f99a03e" containerName="extract-utilities"
Mar 20 16:48:00 crc kubenswrapper[4730]: E0320 16:48:00.146514    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c95ee43-d4bb-471d-977a-4cb14f99a03e" containerName="extract-content"
Mar 20 16:48:00 crc kubenswrapper[4730]: I0320 16:48:00.146519    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c95ee43-d4bb-471d-977a-4cb14f99a03e" containerName="extract-content"
Mar 20 16:48:00 crc kubenswrapper[4730]: E0320 16:48:00.146536    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64b3980f-0b4c-4751-9152-a70ce47eca6a" containerName="registry-server"
Mar 20 16:48:00 crc kubenswrapper[4730]: I0320 16:48:00.146542    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="64b3980f-0b4c-4751-9152-a70ce47eca6a" containerName="registry-server"
Mar 20 16:48:00 crc kubenswrapper[4730]: I0320 16:48:00.146718    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c95ee43-d4bb-471d-977a-4cb14f99a03e" containerName="registry-server"
Mar 20 16:48:00 crc kubenswrapper[4730]: I0320 16:48:00.146748    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="64b3980f-0b4c-4751-9152-a70ce47eca6a" containerName="registry-server"
Mar 20 16:48:00 crc kubenswrapper[4730]: I0320 16:48:00.147459    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567088-jzf6l"
Mar 20 16:48:00 crc kubenswrapper[4730]: I0320 16:48:00.149229    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:48:00 crc kubenswrapper[4730]: I0320 16:48:00.149365    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:48:00 crc kubenswrapper[4730]: I0320 16:48:00.150943    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 16:48:00 crc kubenswrapper[4730]: I0320 16:48:00.163558    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567088-jzf6l"]
Mar 20 16:48:00 crc kubenswrapper[4730]: I0320 16:48:00.264290    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcf6j\" (UniqueName: \"kubernetes.io/projected/423d9abb-9507-4e8e-aa00-42b3d34328ed-kube-api-access-jcf6j\") pod \"auto-csr-approver-29567088-jzf6l\" (UID: \"423d9abb-9507-4e8e-aa00-42b3d34328ed\") " pod="openshift-infra/auto-csr-approver-29567088-jzf6l"
Mar 20 16:48:00 crc kubenswrapper[4730]: I0320 16:48:00.366897    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcf6j\" (UniqueName: \"kubernetes.io/projected/423d9abb-9507-4e8e-aa00-42b3d34328ed-kube-api-access-jcf6j\") pod \"auto-csr-approver-29567088-jzf6l\" (UID: \"423d9abb-9507-4e8e-aa00-42b3d34328ed\") " pod="openshift-infra/auto-csr-approver-29567088-jzf6l"
Mar 20 16:48:00 crc kubenswrapper[4730]: I0320 16:48:00.399015    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcf6j\" (UniqueName: \"kubernetes.io/projected/423d9abb-9507-4e8e-aa00-42b3d34328ed-kube-api-access-jcf6j\") pod \"auto-csr-approver-29567088-jzf6l\" (UID: \"423d9abb-9507-4e8e-aa00-42b3d34328ed\") " pod="openshift-infra/auto-csr-approver-29567088-jzf6l"
Mar 20 16:48:00 crc kubenswrapper[4730]: I0320 16:48:00.470179    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567088-jzf6l"
Mar 20 16:48:00 crc kubenswrapper[4730]: I0320 16:48:00.974122    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567088-jzf6l"]
Mar 20 16:48:01 crc kubenswrapper[4730]: I0320 16:48:01.529744    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567088-jzf6l" event={"ID":"423d9abb-9507-4e8e-aa00-42b3d34328ed","Type":"ContainerStarted","Data":"1c6704cd2d75950e01dbade980fd9ad91866b5accc9f9b477ddc38c55b304a3e"}
Mar 20 16:48:03 crc kubenswrapper[4730]: I0320 16:48:03.547792    4730 generic.go:334] "Generic (PLEG): container finished" podID="423d9abb-9507-4e8e-aa00-42b3d34328ed" containerID="7b312e45c65bdab74120878c5bbe6f1323de4c86f0295b5909d9931e6d7a0af0" exitCode=0
Mar 20 16:48:03 crc kubenswrapper[4730]: I0320 16:48:03.547864    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567088-jzf6l" event={"ID":"423d9abb-9507-4e8e-aa00-42b3d34328ed","Type":"ContainerDied","Data":"7b312e45c65bdab74120878c5bbe6f1323de4c86f0295b5909d9931e6d7a0af0"}
Mar 20 16:48:05 crc kubenswrapper[4730]: I0320 16:48:05.001298    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567088-jzf6l"
Mar 20 16:48:05 crc kubenswrapper[4730]: I0320 16:48:05.184145    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcf6j\" (UniqueName: \"kubernetes.io/projected/423d9abb-9507-4e8e-aa00-42b3d34328ed-kube-api-access-jcf6j\") pod \"423d9abb-9507-4e8e-aa00-42b3d34328ed\" (UID: \"423d9abb-9507-4e8e-aa00-42b3d34328ed\") "
Mar 20 16:48:05 crc kubenswrapper[4730]: I0320 16:48:05.190477    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/423d9abb-9507-4e8e-aa00-42b3d34328ed-kube-api-access-jcf6j" (OuterVolumeSpecName: "kube-api-access-jcf6j") pod "423d9abb-9507-4e8e-aa00-42b3d34328ed" (UID: "423d9abb-9507-4e8e-aa00-42b3d34328ed"). InnerVolumeSpecName "kube-api-access-jcf6j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:48:05 crc kubenswrapper[4730]: I0320 16:48:05.286783    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcf6j\" (UniqueName: \"kubernetes.io/projected/423d9abb-9507-4e8e-aa00-42b3d34328ed-kube-api-access-jcf6j\") on node \"crc\" DevicePath \"\""
Mar 20 16:48:05 crc kubenswrapper[4730]: I0320 16:48:05.580618    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567088-jzf6l" event={"ID":"423d9abb-9507-4e8e-aa00-42b3d34328ed","Type":"ContainerDied","Data":"1c6704cd2d75950e01dbade980fd9ad91866b5accc9f9b477ddc38c55b304a3e"}
Mar 20 16:48:05 crc kubenswrapper[4730]: I0320 16:48:05.580671    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c6704cd2d75950e01dbade980fd9ad91866b5accc9f9b477ddc38c55b304a3e"
Mar 20 16:48:05 crc kubenswrapper[4730]: I0320 16:48:05.580761    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567088-jzf6l"
Mar 20 16:48:06 crc kubenswrapper[4730]: I0320 16:48:06.078356    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567082-597m4"]
Mar 20 16:48:06 crc kubenswrapper[4730]: I0320 16:48:06.090846    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567082-597m4"]
Mar 20 16:48:07 crc kubenswrapper[4730]: I0320 16:48:07.546936    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="713df8c0-cae2-4cfd-9ecf-66856a78c066" path="/var/lib/kubelet/pods/713df8c0-cae2-4cfd-9ecf-66856a78c066/volumes"
Mar 20 16:48:08 crc kubenswrapper[4730]: I0320 16:48:08.533723    4730 scope.go:117] "RemoveContainer" containerID="ee13c0427129be645bbda82e57ffd84564f844ed3cec3a2748df09a4a24fe610"
Mar 20 16:48:08 crc kubenswrapper[4730]: E0320 16:48:08.534826    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:48:17 crc kubenswrapper[4730]: I0320 16:48:17.272127    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-7c5c8ffdd9-xpfhf" podUID="b9780622-27f3-4339-8107-321feed5e25b" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502"
Mar 20 16:48:23 crc kubenswrapper[4730]: I0320 16:48:23.533386    4730 scope.go:117] "RemoveContainer" containerID="ee13c0427129be645bbda82e57ffd84564f844ed3cec3a2748df09a4a24fe610"
Mar 20 16:48:23 crc kubenswrapper[4730]: E0320 16:48:23.534353    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:48:35 crc kubenswrapper[4730]: I0320 16:48:35.532709    4730 scope.go:117] "RemoveContainer" containerID="ee13c0427129be645bbda82e57ffd84564f844ed3cec3a2748df09a4a24fe610"
Mar 20 16:48:35 crc kubenswrapper[4730]: E0320 16:48:35.533512    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:48:47 crc kubenswrapper[4730]: I0320 16:48:47.200391    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l6l4h"]
Mar 20 16:48:47 crc kubenswrapper[4730]: E0320 16:48:47.201591    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="423d9abb-9507-4e8e-aa00-42b3d34328ed" containerName="oc"
Mar 20 16:48:47 crc kubenswrapper[4730]: I0320 16:48:47.201612    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="423d9abb-9507-4e8e-aa00-42b3d34328ed" containerName="oc"
Mar 20 16:48:47 crc kubenswrapper[4730]: I0320 16:48:47.201865    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="423d9abb-9507-4e8e-aa00-42b3d34328ed" containerName="oc"
Mar 20 16:48:47 crc kubenswrapper[4730]: I0320 16:48:47.203624    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l6l4h"
Mar 20 16:48:47 crc kubenswrapper[4730]: I0320 16:48:47.258148    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l6l4h"]
Mar 20 16:48:47 crc kubenswrapper[4730]: I0320 16:48:47.369511    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68gqd\" (UniqueName: \"kubernetes.io/projected/e6a0115e-9fa1-4809-adb5-76be4e66cd52-kube-api-access-68gqd\") pod \"redhat-operators-l6l4h\" (UID: \"e6a0115e-9fa1-4809-adb5-76be4e66cd52\") " pod="openshift-marketplace/redhat-operators-l6l4h"
Mar 20 16:48:47 crc kubenswrapper[4730]: I0320 16:48:47.369961    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6a0115e-9fa1-4809-adb5-76be4e66cd52-utilities\") pod \"redhat-operators-l6l4h\" (UID: \"e6a0115e-9fa1-4809-adb5-76be4e66cd52\") " pod="openshift-marketplace/redhat-operators-l6l4h"
Mar 20 16:48:47 crc kubenswrapper[4730]: I0320 16:48:47.370082    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6a0115e-9fa1-4809-adb5-76be4e66cd52-catalog-content\") pod \"redhat-operators-l6l4h\" (UID: \"e6a0115e-9fa1-4809-adb5-76be4e66cd52\") " pod="openshift-marketplace/redhat-operators-l6l4h"
Mar 20 16:48:47 crc kubenswrapper[4730]: I0320 16:48:47.472049    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6a0115e-9fa1-4809-adb5-76be4e66cd52-catalog-content\") pod \"redhat-operators-l6l4h\" (UID: \"e6a0115e-9fa1-4809-adb5-76be4e66cd52\") " pod="openshift-marketplace/redhat-operators-l6l4h"
Mar 20 16:48:47 crc kubenswrapper[4730]: I0320 16:48:47.472170    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68gqd\" (UniqueName: \"kubernetes.io/projected/e6a0115e-9fa1-4809-adb5-76be4e66cd52-kube-api-access-68gqd\") pod \"redhat-operators-l6l4h\" (UID: \"e6a0115e-9fa1-4809-adb5-76be4e66cd52\") " pod="openshift-marketplace/redhat-operators-l6l4h"
Mar 20 16:48:47 crc kubenswrapper[4730]: I0320 16:48:47.472261    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6a0115e-9fa1-4809-adb5-76be4e66cd52-utilities\") pod \"redhat-operators-l6l4h\" (UID: \"e6a0115e-9fa1-4809-adb5-76be4e66cd52\") " pod="openshift-marketplace/redhat-operators-l6l4h"
Mar 20 16:48:47 crc kubenswrapper[4730]: I0320 16:48:47.472718    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6a0115e-9fa1-4809-adb5-76be4e66cd52-utilities\") pod \"redhat-operators-l6l4h\" (UID: \"e6a0115e-9fa1-4809-adb5-76be4e66cd52\") " pod="openshift-marketplace/redhat-operators-l6l4h"
Mar 20 16:48:47 crc kubenswrapper[4730]: I0320 16:48:47.472762    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6a0115e-9fa1-4809-adb5-76be4e66cd52-catalog-content\") pod \"redhat-operators-l6l4h\" (UID: \"e6a0115e-9fa1-4809-adb5-76be4e66cd52\") " pod="openshift-marketplace/redhat-operators-l6l4h"
Mar 20 16:48:47 crc kubenswrapper[4730]: I0320 16:48:47.490403    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68gqd\" (UniqueName: \"kubernetes.io/projected/e6a0115e-9fa1-4809-adb5-76be4e66cd52-kube-api-access-68gqd\") pod \"redhat-operators-l6l4h\" (UID: \"e6a0115e-9fa1-4809-adb5-76be4e66cd52\") " pod="openshift-marketplace/redhat-operators-l6l4h"
Mar 20 16:48:47 crc kubenswrapper[4730]: I0320 16:48:47.591745    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l6l4h"
Mar 20 16:48:47 crc kubenswrapper[4730]: I0320 16:48:47.797337    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-z7gbm"]
Mar 20 16:48:47 crc kubenswrapper[4730]: I0320 16:48:47.799668    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z7gbm"
Mar 20 16:48:47 crc kubenswrapper[4730]: I0320 16:48:47.817318    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z7gbm"]
Mar 20 16:48:47 crc kubenswrapper[4730]: I0320 16:48:47.880115    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9529743d-7cc8-4356-a60a-efb0c0657fc6-catalog-content\") pod \"certified-operators-z7gbm\" (UID: \"9529743d-7cc8-4356-a60a-efb0c0657fc6\") " pod="openshift-marketplace/certified-operators-z7gbm"
Mar 20 16:48:47 crc kubenswrapper[4730]: I0320 16:48:47.880179    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9529743d-7cc8-4356-a60a-efb0c0657fc6-utilities\") pod \"certified-operators-z7gbm\" (UID: \"9529743d-7cc8-4356-a60a-efb0c0657fc6\") " pod="openshift-marketplace/certified-operators-z7gbm"
Mar 20 16:48:47 crc kubenswrapper[4730]: I0320 16:48:47.880207    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfnsd\" (UniqueName: \"kubernetes.io/projected/9529743d-7cc8-4356-a60a-efb0c0657fc6-kube-api-access-kfnsd\") pod \"certified-operators-z7gbm\" (UID: \"9529743d-7cc8-4356-a60a-efb0c0657fc6\") " pod="openshift-marketplace/certified-operators-z7gbm"
Mar 20 16:48:47 crc kubenswrapper[4730]: I0320 16:48:47.982366    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9529743d-7cc8-4356-a60a-efb0c0657fc6-catalog-content\") pod \"certified-operators-z7gbm\" (UID: \"9529743d-7cc8-4356-a60a-efb0c0657fc6\") " pod="openshift-marketplace/certified-operators-z7gbm"
Mar 20 16:48:47 crc kubenswrapper[4730]: I0320 16:48:47.982434    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9529743d-7cc8-4356-a60a-efb0c0657fc6-utilities\") pod \"certified-operators-z7gbm\" (UID: \"9529743d-7cc8-4356-a60a-efb0c0657fc6\") " pod="openshift-marketplace/certified-operators-z7gbm"
Mar 20 16:48:47 crc kubenswrapper[4730]: I0320 16:48:47.982464    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfnsd\" (UniqueName: \"kubernetes.io/projected/9529743d-7cc8-4356-a60a-efb0c0657fc6-kube-api-access-kfnsd\") pod \"certified-operators-z7gbm\" (UID: \"9529743d-7cc8-4356-a60a-efb0c0657fc6\") " pod="openshift-marketplace/certified-operators-z7gbm"
Mar 20 16:48:47 crc kubenswrapper[4730]: I0320 16:48:47.982992    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9529743d-7cc8-4356-a60a-efb0c0657fc6-catalog-content\") pod \"certified-operators-z7gbm\" (UID: \"9529743d-7cc8-4356-a60a-efb0c0657fc6\") " pod="openshift-marketplace/certified-operators-z7gbm"
Mar 20 16:48:47 crc kubenswrapper[4730]: I0320 16:48:47.982992    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9529743d-7cc8-4356-a60a-efb0c0657fc6-utilities\") pod \"certified-operators-z7gbm\" (UID: \"9529743d-7cc8-4356-a60a-efb0c0657fc6\") " pod="openshift-marketplace/certified-operators-z7gbm"
Mar 20 16:48:48 crc kubenswrapper[4730]: I0320 16:48:48.003166    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfnsd\" (UniqueName: \"kubernetes.io/projected/9529743d-7cc8-4356-a60a-efb0c0657fc6-kube-api-access-kfnsd\") pod \"certified-operators-z7gbm\" (UID: \"9529743d-7cc8-4356-a60a-efb0c0657fc6\") " pod="openshift-marketplace/certified-operators-z7gbm"
Mar 20 16:48:48 crc kubenswrapper[4730]: I0320 16:48:48.065595    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l6l4h"]
Mar 20 16:48:48 crc kubenswrapper[4730]: I0320 16:48:48.126394    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z7gbm"
Mar 20 16:48:48 crc kubenswrapper[4730]: I0320 16:48:48.635729    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z7gbm"]
Mar 20 16:48:49 crc kubenswrapper[4730]: I0320 16:48:49.044854    4730 generic.go:334] "Generic (PLEG): container finished" podID="e6a0115e-9fa1-4809-adb5-76be4e66cd52" containerID="9cafb991c7ca35e5ba77af18a045990efcbfc9d1be49bb90913f2673a3398384" exitCode=0
Mar 20 16:48:49 crc kubenswrapper[4730]: I0320 16:48:49.044953    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l6l4h" event={"ID":"e6a0115e-9fa1-4809-adb5-76be4e66cd52","Type":"ContainerDied","Data":"9cafb991c7ca35e5ba77af18a045990efcbfc9d1be49bb90913f2673a3398384"}
Mar 20 16:48:49 crc kubenswrapper[4730]: I0320 16:48:49.045346    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l6l4h" event={"ID":"e6a0115e-9fa1-4809-adb5-76be4e66cd52","Type":"ContainerStarted","Data":"d2ca20484b8b40d7c6540ecf524ef385e1e84275942650c4814c8b0e6d8e9e71"}
Mar 20 16:48:49 crc kubenswrapper[4730]: I0320 16:48:49.047979    4730 generic.go:334] "Generic (PLEG): container finished" podID="9529743d-7cc8-4356-a60a-efb0c0657fc6" containerID="d65dd161a59341b12a4cdf8839f51c7e61e875c1a19f37eba22280b7b377868a" exitCode=0
Mar 20 16:48:49 crc kubenswrapper[4730]: I0320 16:48:49.048014    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7gbm" event={"ID":"9529743d-7cc8-4356-a60a-efb0c0657fc6","Type":"ContainerDied","Data":"d65dd161a59341b12a4cdf8839f51c7e61e875c1a19f37eba22280b7b377868a"}
Mar 20 16:48:49 crc kubenswrapper[4730]: I0320 16:48:49.048042    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7gbm" event={"ID":"9529743d-7cc8-4356-a60a-efb0c0657fc6","Type":"ContainerStarted","Data":"3d1bafb556879d48fd1c609a49ffd281b803156d4929627c53c4cf726a9391f4"}
Mar 20 16:48:50 crc kubenswrapper[4730]: I0320 16:48:50.057043    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l6l4h" event={"ID":"e6a0115e-9fa1-4809-adb5-76be4e66cd52","Type":"ContainerStarted","Data":"74f6aec5b0a2a2981ace72e9b9444851318412a96c97f48bbc92f4581cf79a47"}
Mar 20 16:48:50 crc kubenswrapper[4730]: I0320 16:48:50.059978    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7gbm" event={"ID":"9529743d-7cc8-4356-a60a-efb0c0657fc6","Type":"ContainerStarted","Data":"dc80adcd98dab5030f9344d7fc4b434975389290e0aacc050a76cea518392d8e"}
Mar 20 16:48:50 crc kubenswrapper[4730]: I0320 16:48:50.533690    4730 scope.go:117] "RemoveContainer" containerID="ee13c0427129be645bbda82e57ffd84564f844ed3cec3a2748df09a4a24fe610"
Mar 20 16:48:50 crc kubenswrapper[4730]: E0320 16:48:50.534009    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:48:51 crc kubenswrapper[4730]: I0320 16:48:51.415472    4730 scope.go:117] "RemoveContainer" containerID="44f54f3fc7434586ebe0f8d3b305da77181f85e1a148a72f852acf4f69b33aae"
Mar 20 16:48:52 crc kubenswrapper[4730]: I0320 16:48:52.084417    4730 generic.go:334] "Generic (PLEG): container finished" podID="9529743d-7cc8-4356-a60a-efb0c0657fc6" containerID="dc80adcd98dab5030f9344d7fc4b434975389290e0aacc050a76cea518392d8e" exitCode=0
Mar 20 16:48:52 crc kubenswrapper[4730]: I0320 16:48:52.084465    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7gbm" event={"ID":"9529743d-7cc8-4356-a60a-efb0c0657fc6","Type":"ContainerDied","Data":"dc80adcd98dab5030f9344d7fc4b434975389290e0aacc050a76cea518392d8e"}
Mar 20 16:48:53 crc kubenswrapper[4730]: I0320 16:48:53.095582    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7gbm" event={"ID":"9529743d-7cc8-4356-a60a-efb0c0657fc6","Type":"ContainerStarted","Data":"338ae4a95e896fe7a2224e63f3555e3b661a6774b8e06328bbe2fece926f00df"}
Mar 20 16:48:53 crc kubenswrapper[4730]: I0320 16:48:53.118383    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-z7gbm" podStartSLOduration=2.358345201 podStartE2EDuration="6.118357394s" podCreationTimestamp="2026-03-20 16:48:47 +0000 UTC" firstStartedPulling="2026-03-20 16:48:49.049579673 +0000 UTC m=+4188.262951042" lastFinishedPulling="2026-03-20 16:48:52.809591866 +0000 UTC m=+4192.022963235" observedRunningTime="2026-03-20 16:48:53.112282281 +0000 UTC m=+4192.325653670" watchObservedRunningTime="2026-03-20 16:48:53.118357394 +0000 UTC m=+4192.331728763"
Mar 20 16:48:55 crc kubenswrapper[4730]: I0320 16:48:55.114962    4730 generic.go:334] "Generic (PLEG): container finished" podID="e6a0115e-9fa1-4809-adb5-76be4e66cd52" containerID="74f6aec5b0a2a2981ace72e9b9444851318412a96c97f48bbc92f4581cf79a47" exitCode=0
Mar 20 16:48:55 crc kubenswrapper[4730]: I0320 16:48:55.115322    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l6l4h" event={"ID":"e6a0115e-9fa1-4809-adb5-76be4e66cd52","Type":"ContainerDied","Data":"74f6aec5b0a2a2981ace72e9b9444851318412a96c97f48bbc92f4581cf79a47"}
Mar 20 16:48:56 crc kubenswrapper[4730]: I0320 16:48:56.129214    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l6l4h" event={"ID":"e6a0115e-9fa1-4809-adb5-76be4e66cd52","Type":"ContainerStarted","Data":"171abedfb7066a417cee9f0d2c0b610b90402d3dd8de8f88927c40fb5e73a8d3"}
Mar 20 16:48:56 crc kubenswrapper[4730]: I0320 16:48:56.160999    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l6l4h" podStartSLOduration=2.47031257 podStartE2EDuration="9.160980956s" podCreationTimestamp="2026-03-20 16:48:47 +0000 UTC" firstStartedPulling="2026-03-20 16:48:49.047492553 +0000 UTC m=+4188.260863922" lastFinishedPulling="2026-03-20 16:48:55.738160939 +0000 UTC m=+4194.951532308" observedRunningTime="2026-03-20 16:48:56.150768906 +0000 UTC m=+4195.364140295" watchObservedRunningTime="2026-03-20 16:48:56.160980956 +0000 UTC m=+4195.374352315"
Mar 20 16:48:57 crc kubenswrapper[4730]: I0320 16:48:57.592280    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l6l4h"
Mar 20 16:48:57 crc kubenswrapper[4730]: I0320 16:48:57.593795    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l6l4h"
Mar 20 16:48:58 crc kubenswrapper[4730]: I0320 16:48:58.127022    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-z7gbm"
Mar 20 16:48:58 crc kubenswrapper[4730]: I0320 16:48:58.127081    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-z7gbm"
Mar 20 16:48:58 crc kubenswrapper[4730]: I0320 16:48:58.637929    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l6l4h" podUID="e6a0115e-9fa1-4809-adb5-76be4e66cd52" containerName="registry-server" probeResult="failure" output=<
Mar 20 16:48:58 crc kubenswrapper[4730]:         timeout: failed to connect service ":50051" within 1s
Mar 20 16:48:58 crc kubenswrapper[4730]:  >
Mar 20 16:48:59 crc kubenswrapper[4730]: I0320 16:48:59.172419    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-z7gbm" podUID="9529743d-7cc8-4356-a60a-efb0c0657fc6" containerName="registry-server" probeResult="failure" output=<
Mar 20 16:48:59 crc kubenswrapper[4730]:         timeout: failed to connect service ":50051" within 1s
Mar 20 16:48:59 crc kubenswrapper[4730]:  >
Mar 20 16:49:04 crc kubenswrapper[4730]: I0320 16:49:04.534731    4730 scope.go:117] "RemoveContainer" containerID="ee13c0427129be645bbda82e57ffd84564f844ed3cec3a2748df09a4a24fe610"
Mar 20 16:49:04 crc kubenswrapper[4730]: E0320 16:49:04.535316    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:49:08 crc kubenswrapper[4730]: I0320 16:49:08.207443    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-z7gbm"
Mar 20 16:49:08 crc kubenswrapper[4730]: I0320 16:49:08.271831    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-z7gbm"
Mar 20 16:49:08 crc kubenswrapper[4730]: I0320 16:49:08.451172    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z7gbm"]
Mar 20 16:49:09 crc kubenswrapper[4730]: I0320 16:49:09.045637    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l6l4h" podUID="e6a0115e-9fa1-4809-adb5-76be4e66cd52" containerName="registry-server" probeResult="failure" output=<
Mar 20 16:49:09 crc kubenswrapper[4730]:         timeout: failed to connect service ":50051" within 1s
Mar 20 16:49:09 crc kubenswrapper[4730]:  >
Mar 20 16:49:09 crc kubenswrapper[4730]: I0320 16:49:09.268694    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-z7gbm" podUID="9529743d-7cc8-4356-a60a-efb0c0657fc6" containerName="registry-server" containerID="cri-o://338ae4a95e896fe7a2224e63f3555e3b661a6774b8e06328bbe2fece926f00df" gracePeriod=2
Mar 20 16:49:10 crc kubenswrapper[4730]: I0320 16:49:10.286400    4730 generic.go:334] "Generic (PLEG): container finished" podID="9529743d-7cc8-4356-a60a-efb0c0657fc6" containerID="338ae4a95e896fe7a2224e63f3555e3b661a6774b8e06328bbe2fece926f00df" exitCode=0
Mar 20 16:49:10 crc kubenswrapper[4730]: I0320 16:49:10.286741    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7gbm" event={"ID":"9529743d-7cc8-4356-a60a-efb0c0657fc6","Type":"ContainerDied","Data":"338ae4a95e896fe7a2224e63f3555e3b661a6774b8e06328bbe2fece926f00df"}
Mar 20 16:49:10 crc kubenswrapper[4730]: I0320 16:49:10.468361    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z7gbm"
Mar 20 16:49:10 crc kubenswrapper[4730]: I0320 16:49:10.662169    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9529743d-7cc8-4356-a60a-efb0c0657fc6-utilities\") pod \"9529743d-7cc8-4356-a60a-efb0c0657fc6\" (UID: \"9529743d-7cc8-4356-a60a-efb0c0657fc6\") "
Mar 20 16:49:10 crc kubenswrapper[4730]: I0320 16:49:10.662479    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfnsd\" (UniqueName: \"kubernetes.io/projected/9529743d-7cc8-4356-a60a-efb0c0657fc6-kube-api-access-kfnsd\") pod \"9529743d-7cc8-4356-a60a-efb0c0657fc6\" (UID: \"9529743d-7cc8-4356-a60a-efb0c0657fc6\") "
Mar 20 16:49:10 crc kubenswrapper[4730]: I0320 16:49:10.662545    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9529743d-7cc8-4356-a60a-efb0c0657fc6-catalog-content\") pod \"9529743d-7cc8-4356-a60a-efb0c0657fc6\" (UID: \"9529743d-7cc8-4356-a60a-efb0c0657fc6\") "
Mar 20 16:49:10 crc kubenswrapper[4730]: I0320 16:49:10.663194    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9529743d-7cc8-4356-a60a-efb0c0657fc6-utilities" (OuterVolumeSpecName: "utilities") pod "9529743d-7cc8-4356-a60a-efb0c0657fc6" (UID: "9529743d-7cc8-4356-a60a-efb0c0657fc6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:49:10 crc kubenswrapper[4730]: I0320 16:49:10.668502    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9529743d-7cc8-4356-a60a-efb0c0657fc6-kube-api-access-kfnsd" (OuterVolumeSpecName: "kube-api-access-kfnsd") pod "9529743d-7cc8-4356-a60a-efb0c0657fc6" (UID: "9529743d-7cc8-4356-a60a-efb0c0657fc6"). InnerVolumeSpecName "kube-api-access-kfnsd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:49:10 crc kubenswrapper[4730]: I0320 16:49:10.726212    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9529743d-7cc8-4356-a60a-efb0c0657fc6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9529743d-7cc8-4356-a60a-efb0c0657fc6" (UID: "9529743d-7cc8-4356-a60a-efb0c0657fc6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:49:10 crc kubenswrapper[4730]: I0320 16:49:10.765642    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfnsd\" (UniqueName: \"kubernetes.io/projected/9529743d-7cc8-4356-a60a-efb0c0657fc6-kube-api-access-kfnsd\") on node \"crc\" DevicePath \"\""
Mar 20 16:49:10 crc kubenswrapper[4730]: I0320 16:49:10.765675    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9529743d-7cc8-4356-a60a-efb0c0657fc6-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 16:49:10 crc kubenswrapper[4730]: I0320 16:49:10.765684    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9529743d-7cc8-4356-a60a-efb0c0657fc6-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:49:11 crc kubenswrapper[4730]: I0320 16:49:11.298195    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7gbm" event={"ID":"9529743d-7cc8-4356-a60a-efb0c0657fc6","Type":"ContainerDied","Data":"3d1bafb556879d48fd1c609a49ffd281b803156d4929627c53c4cf726a9391f4"}
Mar 20 16:49:11 crc kubenswrapper[4730]: I0320 16:49:11.298546    4730 scope.go:117] "RemoveContainer" containerID="338ae4a95e896fe7a2224e63f3555e3b661a6774b8e06328bbe2fece926f00df"
Mar 20 16:49:11 crc kubenswrapper[4730]: I0320 16:49:11.298399    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z7gbm"
Mar 20 16:49:11 crc kubenswrapper[4730]: I0320 16:49:11.320837    4730 scope.go:117] "RemoveContainer" containerID="dc80adcd98dab5030f9344d7fc4b434975389290e0aacc050a76cea518392d8e"
Mar 20 16:49:11 crc kubenswrapper[4730]: I0320 16:49:11.344972    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z7gbm"]
Mar 20 16:49:11 crc kubenswrapper[4730]: I0320 16:49:11.353781    4730 scope.go:117] "RemoveContainer" containerID="d65dd161a59341b12a4cdf8839f51c7e61e875c1a19f37eba22280b7b377868a"
Mar 20 16:49:11 crc kubenswrapper[4730]: I0320 16:49:11.358388    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-z7gbm"]
Mar 20 16:49:11 crc kubenswrapper[4730]: I0320 16:49:11.560776    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9529743d-7cc8-4356-a60a-efb0c0657fc6" path="/var/lib/kubelet/pods/9529743d-7cc8-4356-a60a-efb0c0657fc6/volumes"
Mar 20 16:49:18 crc kubenswrapper[4730]: I0320 16:49:18.635198    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l6l4h" podUID="e6a0115e-9fa1-4809-adb5-76be4e66cd52" containerName="registry-server" probeResult="failure" output=<
Mar 20 16:49:18 crc kubenswrapper[4730]:         timeout: failed to connect service ":50051" within 1s
Mar 20 16:49:18 crc kubenswrapper[4730]:  >
Mar 20 16:49:19 crc kubenswrapper[4730]: I0320 16:49:19.538289    4730 scope.go:117] "RemoveContainer" containerID="ee13c0427129be645bbda82e57ffd84564f844ed3cec3a2748df09a4a24fe610"
Mar 20 16:49:19 crc kubenswrapper[4730]: E0320 16:49:19.538588    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:49:27 crc kubenswrapper[4730]: I0320 16:49:27.658421    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l6l4h"
Mar 20 16:49:27 crc kubenswrapper[4730]: I0320 16:49:27.736763    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l6l4h"
Mar 20 16:49:27 crc kubenswrapper[4730]: I0320 16:49:27.909186    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l6l4h"]
Mar 20 16:49:29 crc kubenswrapper[4730]: I0320 16:49:29.492690    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l6l4h" podUID="e6a0115e-9fa1-4809-adb5-76be4e66cd52" containerName="registry-server" containerID="cri-o://171abedfb7066a417cee9f0d2c0b610b90402d3dd8de8f88927c40fb5e73a8d3" gracePeriod=2
Mar 20 16:49:30 crc kubenswrapper[4730]: I0320 16:49:30.448565    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l6l4h"
Mar 20 16:49:30 crc kubenswrapper[4730]: I0320 16:49:30.506190    4730 generic.go:334] "Generic (PLEG): container finished" podID="e6a0115e-9fa1-4809-adb5-76be4e66cd52" containerID="171abedfb7066a417cee9f0d2c0b610b90402d3dd8de8f88927c40fb5e73a8d3" exitCode=0
Mar 20 16:49:30 crc kubenswrapper[4730]: I0320 16:49:30.506236    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l6l4h" event={"ID":"e6a0115e-9fa1-4809-adb5-76be4e66cd52","Type":"ContainerDied","Data":"171abedfb7066a417cee9f0d2c0b610b90402d3dd8de8f88927c40fb5e73a8d3"}
Mar 20 16:49:30 crc kubenswrapper[4730]: I0320 16:49:30.506277    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l6l4h" event={"ID":"e6a0115e-9fa1-4809-adb5-76be4e66cd52","Type":"ContainerDied","Data":"d2ca20484b8b40d7c6540ecf524ef385e1e84275942650c4814c8b0e6d8e9e71"}
Mar 20 16:49:30 crc kubenswrapper[4730]: I0320 16:49:30.506299    4730 scope.go:117] "RemoveContainer" containerID="171abedfb7066a417cee9f0d2c0b610b90402d3dd8de8f88927c40fb5e73a8d3"
Mar 20 16:49:30 crc kubenswrapper[4730]: I0320 16:49:30.506383    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l6l4h"
Mar 20 16:49:30 crc kubenswrapper[4730]: I0320 16:49:30.531447    4730 scope.go:117] "RemoveContainer" containerID="74f6aec5b0a2a2981ace72e9b9444851318412a96c97f48bbc92f4581cf79a47"
Mar 20 16:49:30 crc kubenswrapper[4730]: I0320 16:49:30.564671    4730 scope.go:117] "RemoveContainer" containerID="9cafb991c7ca35e5ba77af18a045990efcbfc9d1be49bb90913f2673a3398384"
Mar 20 16:49:30 crc kubenswrapper[4730]: I0320 16:49:30.623278    4730 scope.go:117] "RemoveContainer" containerID="171abedfb7066a417cee9f0d2c0b610b90402d3dd8de8f88927c40fb5e73a8d3"
Mar 20 16:49:30 crc kubenswrapper[4730]: E0320 16:49:30.623803    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"171abedfb7066a417cee9f0d2c0b610b90402d3dd8de8f88927c40fb5e73a8d3\": container with ID starting with 171abedfb7066a417cee9f0d2c0b610b90402d3dd8de8f88927c40fb5e73a8d3 not found: ID does not exist" containerID="171abedfb7066a417cee9f0d2c0b610b90402d3dd8de8f88927c40fb5e73a8d3"
Mar 20 16:49:30 crc kubenswrapper[4730]: I0320 16:49:30.623845    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"171abedfb7066a417cee9f0d2c0b610b90402d3dd8de8f88927c40fb5e73a8d3"} err="failed to get container status \"171abedfb7066a417cee9f0d2c0b610b90402d3dd8de8f88927c40fb5e73a8d3\": rpc error: code = NotFound desc = could not find container \"171abedfb7066a417cee9f0d2c0b610b90402d3dd8de8f88927c40fb5e73a8d3\": container with ID starting with 171abedfb7066a417cee9f0d2c0b610b90402d3dd8de8f88927c40fb5e73a8d3 not found: ID does not exist"
Mar 20 16:49:30 crc kubenswrapper[4730]: I0320 16:49:30.623873    4730 scope.go:117] "RemoveContainer" containerID="74f6aec5b0a2a2981ace72e9b9444851318412a96c97f48bbc92f4581cf79a47"
Mar 20 16:49:30 crc kubenswrapper[4730]: E0320 16:49:30.624341    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74f6aec5b0a2a2981ace72e9b9444851318412a96c97f48bbc92f4581cf79a47\": container with ID starting with 74f6aec5b0a2a2981ace72e9b9444851318412a96c97f48bbc92f4581cf79a47 not found: ID does not exist" containerID="74f6aec5b0a2a2981ace72e9b9444851318412a96c97f48bbc92f4581cf79a47"
Mar 20 16:49:30 crc kubenswrapper[4730]: I0320 16:49:30.624404    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74f6aec5b0a2a2981ace72e9b9444851318412a96c97f48bbc92f4581cf79a47"} err="failed to get container status \"74f6aec5b0a2a2981ace72e9b9444851318412a96c97f48bbc92f4581cf79a47\": rpc error: code = NotFound desc = could not find container \"74f6aec5b0a2a2981ace72e9b9444851318412a96c97f48bbc92f4581cf79a47\": container with ID starting with 74f6aec5b0a2a2981ace72e9b9444851318412a96c97f48bbc92f4581cf79a47 not found: ID does not exist"
Mar 20 16:49:30 crc kubenswrapper[4730]: I0320 16:49:30.624442    4730 scope.go:117] "RemoveContainer" containerID="9cafb991c7ca35e5ba77af18a045990efcbfc9d1be49bb90913f2673a3398384"
Mar 20 16:49:30 crc kubenswrapper[4730]: E0320 16:49:30.624723    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cafb991c7ca35e5ba77af18a045990efcbfc9d1be49bb90913f2673a3398384\": container with ID starting with 9cafb991c7ca35e5ba77af18a045990efcbfc9d1be49bb90913f2673a3398384 not found: ID does not exist" containerID="9cafb991c7ca35e5ba77af18a045990efcbfc9d1be49bb90913f2673a3398384"
Mar 20 16:49:30 crc kubenswrapper[4730]: I0320 16:49:30.624765    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cafb991c7ca35e5ba77af18a045990efcbfc9d1be49bb90913f2673a3398384"} err="failed to get container status \"9cafb991c7ca35e5ba77af18a045990efcbfc9d1be49bb90913f2673a3398384\": rpc error: code = NotFound desc = could not find container \"9cafb991c7ca35e5ba77af18a045990efcbfc9d1be49bb90913f2673a3398384\": container with ID starting with 9cafb991c7ca35e5ba77af18a045990efcbfc9d1be49bb90913f2673a3398384 not found: ID does not exist"
Mar 20 16:49:30 crc kubenswrapper[4730]: I0320 16:49:30.629320    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6a0115e-9fa1-4809-adb5-76be4e66cd52-catalog-content\") pod \"e6a0115e-9fa1-4809-adb5-76be4e66cd52\" (UID: \"e6a0115e-9fa1-4809-adb5-76be4e66cd52\") "
Mar 20 16:49:30 crc kubenswrapper[4730]: I0320 16:49:30.629526    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68gqd\" (UniqueName: \"kubernetes.io/projected/e6a0115e-9fa1-4809-adb5-76be4e66cd52-kube-api-access-68gqd\") pod \"e6a0115e-9fa1-4809-adb5-76be4e66cd52\" (UID: \"e6a0115e-9fa1-4809-adb5-76be4e66cd52\") "
Mar 20 16:49:30 crc kubenswrapper[4730]: I0320 16:49:30.629661    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6a0115e-9fa1-4809-adb5-76be4e66cd52-utilities\") pod \"e6a0115e-9fa1-4809-adb5-76be4e66cd52\" (UID: \"e6a0115e-9fa1-4809-adb5-76be4e66cd52\") "
Mar 20 16:49:30 crc kubenswrapper[4730]: I0320 16:49:30.630539    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6a0115e-9fa1-4809-adb5-76be4e66cd52-utilities" (OuterVolumeSpecName: "utilities") pod "e6a0115e-9fa1-4809-adb5-76be4e66cd52" (UID: "e6a0115e-9fa1-4809-adb5-76be4e66cd52"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:49:30 crc kubenswrapper[4730]: I0320 16:49:30.638299    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6a0115e-9fa1-4809-adb5-76be4e66cd52-kube-api-access-68gqd" (OuterVolumeSpecName: "kube-api-access-68gqd") pod "e6a0115e-9fa1-4809-adb5-76be4e66cd52" (UID: "e6a0115e-9fa1-4809-adb5-76be4e66cd52"). InnerVolumeSpecName "kube-api-access-68gqd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:49:30 crc kubenswrapper[4730]: I0320 16:49:30.731946    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68gqd\" (UniqueName: \"kubernetes.io/projected/e6a0115e-9fa1-4809-adb5-76be4e66cd52-kube-api-access-68gqd\") on node \"crc\" DevicePath \"\""
Mar 20 16:49:30 crc kubenswrapper[4730]: I0320 16:49:30.732344    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6a0115e-9fa1-4809-adb5-76be4e66cd52-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:49:30 crc kubenswrapper[4730]: I0320 16:49:30.772649    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6a0115e-9fa1-4809-adb5-76be4e66cd52-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6a0115e-9fa1-4809-adb5-76be4e66cd52" (UID: "e6a0115e-9fa1-4809-adb5-76be4e66cd52"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:49:30 crc kubenswrapper[4730]: I0320 16:49:30.835081    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6a0115e-9fa1-4809-adb5-76be4e66cd52-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 16:49:30 crc kubenswrapper[4730]: I0320 16:49:30.856298    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l6l4h"]
Mar 20 16:49:30 crc kubenswrapper[4730]: I0320 16:49:30.865652    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l6l4h"]
Mar 20 16:49:31 crc kubenswrapper[4730]: I0320 16:49:31.556110    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6a0115e-9fa1-4809-adb5-76be4e66cd52" path="/var/lib/kubelet/pods/e6a0115e-9fa1-4809-adb5-76be4e66cd52/volumes"
Mar 20 16:49:32 crc kubenswrapper[4730]: I0320 16:49:32.534004    4730 scope.go:117] "RemoveContainer" containerID="ee13c0427129be645bbda82e57ffd84564f844ed3cec3a2748df09a4a24fe610"
Mar 20 16:49:32 crc kubenswrapper[4730]: E0320 16:49:32.535075    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:49:45 crc kubenswrapper[4730]: I0320 16:49:45.533182    4730 scope.go:117] "RemoveContainer" containerID="ee13c0427129be645bbda82e57ffd84564f844ed3cec3a2748df09a4a24fe610"
Mar 20 16:49:46 crc kubenswrapper[4730]: I0320 16:49:46.674653    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerStarted","Data":"1911f50d0c8e9a77c325e3c9daab7021dcd8581ac14223ba2443e24f00e39603"}
Mar 20 16:50:00 crc kubenswrapper[4730]: I0320 16:50:00.168223    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567090-775ds"]
Mar 20 16:50:00 crc kubenswrapper[4730]: E0320 16:50:00.169321    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6a0115e-9fa1-4809-adb5-76be4e66cd52" containerName="extract-utilities"
Mar 20 16:50:00 crc kubenswrapper[4730]: I0320 16:50:00.169338    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a0115e-9fa1-4809-adb5-76be4e66cd52" containerName="extract-utilities"
Mar 20 16:50:00 crc kubenswrapper[4730]: E0320 16:50:00.169365    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9529743d-7cc8-4356-a60a-efb0c0657fc6" containerName="registry-server"
Mar 20 16:50:00 crc kubenswrapper[4730]: I0320 16:50:00.169373    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="9529743d-7cc8-4356-a60a-efb0c0657fc6" containerName="registry-server"
Mar 20 16:50:00 crc kubenswrapper[4730]: E0320 16:50:00.169389    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9529743d-7cc8-4356-a60a-efb0c0657fc6" containerName="extract-utilities"
Mar 20 16:50:00 crc kubenswrapper[4730]: I0320 16:50:00.169400    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="9529743d-7cc8-4356-a60a-efb0c0657fc6" containerName="extract-utilities"
Mar 20 16:50:00 crc kubenswrapper[4730]: E0320 16:50:00.169429    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6a0115e-9fa1-4809-adb5-76be4e66cd52" containerName="extract-content"
Mar 20 16:50:00 crc kubenswrapper[4730]: I0320 16:50:00.169439    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a0115e-9fa1-4809-adb5-76be4e66cd52" containerName="extract-content"
Mar 20 16:50:00 crc kubenswrapper[4730]: E0320 16:50:00.169456    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9529743d-7cc8-4356-a60a-efb0c0657fc6" containerName="extract-content"
Mar 20 16:50:00 crc kubenswrapper[4730]: I0320 16:50:00.169599    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="9529743d-7cc8-4356-a60a-efb0c0657fc6" containerName="extract-content"
Mar 20 16:50:00 crc kubenswrapper[4730]: E0320 16:50:00.169624    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6a0115e-9fa1-4809-adb5-76be4e66cd52" containerName="registry-server"
Mar 20 16:50:00 crc kubenswrapper[4730]: I0320 16:50:00.169632    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a0115e-9fa1-4809-adb5-76be4e66cd52" containerName="registry-server"
Mar 20 16:50:00 crc kubenswrapper[4730]: I0320 16:50:00.169894    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="9529743d-7cc8-4356-a60a-efb0c0657fc6" containerName="registry-server"
Mar 20 16:50:00 crc kubenswrapper[4730]: I0320 16:50:00.169923    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6a0115e-9fa1-4809-adb5-76be4e66cd52" containerName="registry-server"
Mar 20 16:50:00 crc kubenswrapper[4730]: I0320 16:50:00.170791    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567090-775ds"
Mar 20 16:50:00 crc kubenswrapper[4730]: I0320 16:50:00.172798    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 16:50:00 crc kubenswrapper[4730]: I0320 16:50:00.173911    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:50:00 crc kubenswrapper[4730]: I0320 16:50:00.178698    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:50:00 crc kubenswrapper[4730]: I0320 16:50:00.189754    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567090-775ds"]
Mar 20 16:50:00 crc kubenswrapper[4730]: I0320 16:50:00.292965    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52p6j\" (UniqueName: \"kubernetes.io/projected/347b4579-1fb8-49d0-97df-83dafdafae60-kube-api-access-52p6j\") pod \"auto-csr-approver-29567090-775ds\" (UID: \"347b4579-1fb8-49d0-97df-83dafdafae60\") " pod="openshift-infra/auto-csr-approver-29567090-775ds"
Mar 20 16:50:00 crc kubenswrapper[4730]: I0320 16:50:00.395851    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52p6j\" (UniqueName: \"kubernetes.io/projected/347b4579-1fb8-49d0-97df-83dafdafae60-kube-api-access-52p6j\") pod \"auto-csr-approver-29567090-775ds\" (UID: \"347b4579-1fb8-49d0-97df-83dafdafae60\") " pod="openshift-infra/auto-csr-approver-29567090-775ds"
Mar 20 16:50:00 crc kubenswrapper[4730]: I0320 16:50:00.418492    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52p6j\" (UniqueName: \"kubernetes.io/projected/347b4579-1fb8-49d0-97df-83dafdafae60-kube-api-access-52p6j\") pod \"auto-csr-approver-29567090-775ds\" (UID: \"347b4579-1fb8-49d0-97df-83dafdafae60\") " pod="openshift-infra/auto-csr-approver-29567090-775ds"
Mar 20 16:50:00 crc kubenswrapper[4730]: I0320 16:50:00.506929    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567090-775ds"
Mar 20 16:50:01 crc kubenswrapper[4730]: I0320 16:50:01.000643    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567090-775ds"]
Mar 20 16:50:01 crc kubenswrapper[4730]: I0320 16:50:01.841837    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567090-775ds" event={"ID":"347b4579-1fb8-49d0-97df-83dafdafae60","Type":"ContainerStarted","Data":"38c9e55b6b94f60c333409db3275047b6f0ecc0cb0071bf61778bab5d364cfe2"}
Mar 20 16:50:03 crc kubenswrapper[4730]: I0320 16:50:03.875946    4730 generic.go:334] "Generic (PLEG): container finished" podID="347b4579-1fb8-49d0-97df-83dafdafae60" containerID="1b59cf8fa45b394473fc97dca5b148ed197f64170983d43d46fd520ecdca6208" exitCode=0
Mar 20 16:50:03 crc kubenswrapper[4730]: I0320 16:50:03.876090    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567090-775ds" event={"ID":"347b4579-1fb8-49d0-97df-83dafdafae60","Type":"ContainerDied","Data":"1b59cf8fa45b394473fc97dca5b148ed197f64170983d43d46fd520ecdca6208"}
Mar 20 16:50:05 crc kubenswrapper[4730]: I0320 16:50:05.303800    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567090-775ds"
Mar 20 16:50:05 crc kubenswrapper[4730]: I0320 16:50:05.410926    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52p6j\" (UniqueName: \"kubernetes.io/projected/347b4579-1fb8-49d0-97df-83dafdafae60-kube-api-access-52p6j\") pod \"347b4579-1fb8-49d0-97df-83dafdafae60\" (UID: \"347b4579-1fb8-49d0-97df-83dafdafae60\") "
Mar 20 16:50:05 crc kubenswrapper[4730]: I0320 16:50:05.425234    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/347b4579-1fb8-49d0-97df-83dafdafae60-kube-api-access-52p6j" (OuterVolumeSpecName: "kube-api-access-52p6j") pod "347b4579-1fb8-49d0-97df-83dafdafae60" (UID: "347b4579-1fb8-49d0-97df-83dafdafae60"). InnerVolumeSpecName "kube-api-access-52p6j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:50:05 crc kubenswrapper[4730]: I0320 16:50:05.513575    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52p6j\" (UniqueName: \"kubernetes.io/projected/347b4579-1fb8-49d0-97df-83dafdafae60-kube-api-access-52p6j\") on node \"crc\" DevicePath \"\""
Mar 20 16:50:05 crc kubenswrapper[4730]: I0320 16:50:05.910611    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567090-775ds" event={"ID":"347b4579-1fb8-49d0-97df-83dafdafae60","Type":"ContainerDied","Data":"38c9e55b6b94f60c333409db3275047b6f0ecc0cb0071bf61778bab5d364cfe2"}
Mar 20 16:50:05 crc kubenswrapper[4730]: I0320 16:50:05.910647    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38c9e55b6b94f60c333409db3275047b6f0ecc0cb0071bf61778bab5d364cfe2"
Mar 20 16:50:05 crc kubenswrapper[4730]: I0320 16:50:05.910685    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567090-775ds"
Mar 20 16:50:06 crc kubenswrapper[4730]: I0320 16:50:06.392053    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567084-c4mpd"]
Mar 20 16:50:06 crc kubenswrapper[4730]: I0320 16:50:06.402019    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567084-c4mpd"]
Mar 20 16:50:07 crc kubenswrapper[4730]: I0320 16:50:07.552491    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51901671-4d27-46c5-9a9d-baf51b2b9c01" path="/var/lib/kubelet/pods/51901671-4d27-46c5-9a9d-baf51b2b9c01/volumes"
Mar 20 16:50:51 crc kubenswrapper[4730]: I0320 16:50:51.588194    4730 scope.go:117] "RemoveContainer" containerID="cbdd90e3d11772056ef45ec365a19533f01de2f8c0583c5498c86a843612b56d"
Mar 20 16:52:00 crc kubenswrapper[4730]: I0320 16:52:00.150084    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567092-v8dfz"]
Mar 20 16:52:00 crc kubenswrapper[4730]: E0320 16:52:00.151417    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="347b4579-1fb8-49d0-97df-83dafdafae60" containerName="oc"
Mar 20 16:52:00 crc kubenswrapper[4730]: I0320 16:52:00.151441    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="347b4579-1fb8-49d0-97df-83dafdafae60" containerName="oc"
Mar 20 16:52:00 crc kubenswrapper[4730]: I0320 16:52:00.151737    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="347b4579-1fb8-49d0-97df-83dafdafae60" containerName="oc"
Mar 20 16:52:00 crc kubenswrapper[4730]: I0320 16:52:00.152606    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567092-v8dfz"
Mar 20 16:52:00 crc kubenswrapper[4730]: I0320 16:52:00.154341    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 16:52:00 crc kubenswrapper[4730]: I0320 16:52:00.155506    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:52:00 crc kubenswrapper[4730]: I0320 16:52:00.155801    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:52:00 crc kubenswrapper[4730]: I0320 16:52:00.159946    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567092-v8dfz"]
Mar 20 16:52:00 crc kubenswrapper[4730]: I0320 16:52:00.239536    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmwgk\" (UniqueName: \"kubernetes.io/projected/31e0f6b1-5405-420b-941e-4e711281673f-kube-api-access-nmwgk\") pod \"auto-csr-approver-29567092-v8dfz\" (UID: \"31e0f6b1-5405-420b-941e-4e711281673f\") " pod="openshift-infra/auto-csr-approver-29567092-v8dfz"
Mar 20 16:52:00 crc kubenswrapper[4730]: I0320 16:52:00.341914    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmwgk\" (UniqueName: \"kubernetes.io/projected/31e0f6b1-5405-420b-941e-4e711281673f-kube-api-access-nmwgk\") pod \"auto-csr-approver-29567092-v8dfz\" (UID: \"31e0f6b1-5405-420b-941e-4e711281673f\") " pod="openshift-infra/auto-csr-approver-29567092-v8dfz"
Mar 20 16:52:00 crc kubenswrapper[4730]: I0320 16:52:00.373615    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmwgk\" (UniqueName: \"kubernetes.io/projected/31e0f6b1-5405-420b-941e-4e711281673f-kube-api-access-nmwgk\") pod \"auto-csr-approver-29567092-v8dfz\" (UID: \"31e0f6b1-5405-420b-941e-4e711281673f\") " pod="openshift-infra/auto-csr-approver-29567092-v8dfz"
Mar 20 16:52:00 crc kubenswrapper[4730]: I0320 16:52:00.471435    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567092-v8dfz"
Mar 20 16:52:00 crc kubenswrapper[4730]: I0320 16:52:00.933591    4730 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 16:52:00 crc kubenswrapper[4730]: I0320 16:52:00.940556    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567092-v8dfz"]
Mar 20 16:52:01 crc kubenswrapper[4730]: I0320 16:52:01.132430    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567092-v8dfz" event={"ID":"31e0f6b1-5405-420b-941e-4e711281673f","Type":"ContainerStarted","Data":"a2bdd09093211e701e97bcc2dc049bdcfd33afcd27d21b5c9962c562e342d089"}
Mar 20 16:52:03 crc kubenswrapper[4730]: I0320 16:52:03.152068    4730 generic.go:334] "Generic (PLEG): container finished" podID="31e0f6b1-5405-420b-941e-4e711281673f" containerID="ab6be6d2c86b64d8fa302dbb64f113f58f10be72cfc3ad77f609318442cc34dd" exitCode=0
Mar 20 16:52:03 crc kubenswrapper[4730]: I0320 16:52:03.152177    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567092-v8dfz" event={"ID":"31e0f6b1-5405-420b-941e-4e711281673f","Type":"ContainerDied","Data":"ab6be6d2c86b64d8fa302dbb64f113f58f10be72cfc3ad77f609318442cc34dd"}
Mar 20 16:52:04 crc kubenswrapper[4730]: I0320 16:52:04.511325    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567092-v8dfz"
Mar 20 16:52:04 crc kubenswrapper[4730]: I0320 16:52:04.631591    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmwgk\" (UniqueName: \"kubernetes.io/projected/31e0f6b1-5405-420b-941e-4e711281673f-kube-api-access-nmwgk\") pod \"31e0f6b1-5405-420b-941e-4e711281673f\" (UID: \"31e0f6b1-5405-420b-941e-4e711281673f\") "
Mar 20 16:52:04 crc kubenswrapper[4730]: I0320 16:52:04.637634    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31e0f6b1-5405-420b-941e-4e711281673f-kube-api-access-nmwgk" (OuterVolumeSpecName: "kube-api-access-nmwgk") pod "31e0f6b1-5405-420b-941e-4e711281673f" (UID: "31e0f6b1-5405-420b-941e-4e711281673f"). InnerVolumeSpecName "kube-api-access-nmwgk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:52:04 crc kubenswrapper[4730]: I0320 16:52:04.734484    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmwgk\" (UniqueName: \"kubernetes.io/projected/31e0f6b1-5405-420b-941e-4e711281673f-kube-api-access-nmwgk\") on node \"crc\" DevicePath \"\""
Mar 20 16:52:05 crc kubenswrapper[4730]: I0320 16:52:05.173766    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567092-v8dfz" event={"ID":"31e0f6b1-5405-420b-941e-4e711281673f","Type":"ContainerDied","Data":"a2bdd09093211e701e97bcc2dc049bdcfd33afcd27d21b5c9962c562e342d089"}
Mar 20 16:52:05 crc kubenswrapper[4730]: I0320 16:52:05.173827    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2bdd09093211e701e97bcc2dc049bdcfd33afcd27d21b5c9962c562e342d089"
Mar 20 16:52:05 crc kubenswrapper[4730]: I0320 16:52:05.173829    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567092-v8dfz"
Mar 20 16:52:05 crc kubenswrapper[4730]: I0320 16:52:05.580271    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567086-bgtzj"]
Mar 20 16:52:05 crc kubenswrapper[4730]: I0320 16:52:05.605258    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567086-bgtzj"]
Mar 20 16:52:07 crc kubenswrapper[4730]: I0320 16:52:07.544651    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1598084b-3967-4d5d-8911-87b4bbf10965" path="/var/lib/kubelet/pods/1598084b-3967-4d5d-8911-87b4bbf10965/volumes"
Mar 20 16:52:12 crc kubenswrapper[4730]: I0320 16:52:12.879821    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:52:12 crc kubenswrapper[4730]: I0320 16:52:12.880119    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:52:42 crc kubenswrapper[4730]: I0320 16:52:42.882944    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:52:42 crc kubenswrapper[4730]: I0320 16:52:42.883498    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:52:51 crc kubenswrapper[4730]: I0320 16:52:51.699310    4730 scope.go:117] "RemoveContainer" containerID="090324d1acddd1e29456802f46699a7cfabedaef8f848dbdf774851d4687bf7f"
Mar 20 16:53:12 crc kubenswrapper[4730]: I0320 16:53:12.880552    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:53:12 crc kubenswrapper[4730]: I0320 16:53:12.881466    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:53:12 crc kubenswrapper[4730]: I0320 16:53:12.881529    4730 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf"
Mar 20 16:53:12 crc kubenswrapper[4730]: I0320 16:53:12.882428    4730 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1911f50d0c8e9a77c325e3c9daab7021dcd8581ac14223ba2443e24f00e39603"} pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 16:53:12 crc kubenswrapper[4730]: I0320 16:53:12.882495    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" containerID="cri-o://1911f50d0c8e9a77c325e3c9daab7021dcd8581ac14223ba2443e24f00e39603" gracePeriod=600
Mar 20 16:53:13 crc kubenswrapper[4730]: I0320 16:53:13.829784    4730 generic.go:334] "Generic (PLEG): container finished" podID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerID="1911f50d0c8e9a77c325e3c9daab7021dcd8581ac14223ba2443e24f00e39603" exitCode=0
Mar 20 16:53:13 crc kubenswrapper[4730]: I0320 16:53:13.830117    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerDied","Data":"1911f50d0c8e9a77c325e3c9daab7021dcd8581ac14223ba2443e24f00e39603"}
Mar 20 16:53:13 crc kubenswrapper[4730]: I0320 16:53:13.830145    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerStarted","Data":"96fa1a39363ce580a7d7ba62fa4f551be70b7f98a5cbd49713d8269774ccd8b9"}
Mar 20 16:53:13 crc kubenswrapper[4730]: I0320 16:53:13.830161    4730 scope.go:117] "RemoveContainer" containerID="ee13c0427129be645bbda82e57ffd84564f844ed3cec3a2748df09a4a24fe610"
Mar 20 16:54:00 crc kubenswrapper[4730]: I0320 16:54:00.168307    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567094-5zljx"]
Mar 20 16:54:00 crc kubenswrapper[4730]: E0320 16:54:00.169525    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31e0f6b1-5405-420b-941e-4e711281673f" containerName="oc"
Mar 20 16:54:00 crc kubenswrapper[4730]: I0320 16:54:00.169546    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="31e0f6b1-5405-420b-941e-4e711281673f" containerName="oc"
Mar 20 16:54:00 crc kubenswrapper[4730]: I0320 16:54:00.169887    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="31e0f6b1-5405-420b-941e-4e711281673f" containerName="oc"
Mar 20 16:54:00 crc kubenswrapper[4730]: I0320 16:54:00.171008    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567094-5zljx"
Mar 20 16:54:00 crc kubenswrapper[4730]: I0320 16:54:00.175200    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 16:54:00 crc kubenswrapper[4730]: I0320 16:54:00.175582    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:54:00 crc kubenswrapper[4730]: I0320 16:54:00.175624    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:54:00 crc kubenswrapper[4730]: I0320 16:54:00.189856    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567094-5zljx"]
Mar 20 16:54:00 crc kubenswrapper[4730]: I0320 16:54:00.329933    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lm8m\" (UniqueName: \"kubernetes.io/projected/a009be79-bc2f-45ca-94b9-f0da37a6abdc-kube-api-access-2lm8m\") pod \"auto-csr-approver-29567094-5zljx\" (UID: \"a009be79-bc2f-45ca-94b9-f0da37a6abdc\") " pod="openshift-infra/auto-csr-approver-29567094-5zljx"
Mar 20 16:54:00 crc kubenswrapper[4730]: I0320 16:54:00.431753    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lm8m\" (UniqueName: \"kubernetes.io/projected/a009be79-bc2f-45ca-94b9-f0da37a6abdc-kube-api-access-2lm8m\") pod \"auto-csr-approver-29567094-5zljx\" (UID: \"a009be79-bc2f-45ca-94b9-f0da37a6abdc\") " pod="openshift-infra/auto-csr-approver-29567094-5zljx"
Mar 20 16:54:00 crc kubenswrapper[4730]: I0320 16:54:00.458669    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lm8m\" (UniqueName: \"kubernetes.io/projected/a009be79-bc2f-45ca-94b9-f0da37a6abdc-kube-api-access-2lm8m\") pod \"auto-csr-approver-29567094-5zljx\" (UID: \"a009be79-bc2f-45ca-94b9-f0da37a6abdc\") " pod="openshift-infra/auto-csr-approver-29567094-5zljx"
Mar 20 16:54:00 crc kubenswrapper[4730]: I0320 16:54:00.498989    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567094-5zljx"
Mar 20 16:54:00 crc kubenswrapper[4730]: I0320 16:54:00.954049    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567094-5zljx"]
Mar 20 16:54:01 crc kubenswrapper[4730]: I0320 16:54:01.326473    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567094-5zljx" event={"ID":"a009be79-bc2f-45ca-94b9-f0da37a6abdc","Type":"ContainerStarted","Data":"2e3f3ff8766b4b70c26134bd30d13d680af318f12ab19620310fa8fbf92370ea"}
Mar 20 16:54:02 crc kubenswrapper[4730]: I0320 16:54:02.340480    4730 generic.go:334] "Generic (PLEG): container finished" podID="a009be79-bc2f-45ca-94b9-f0da37a6abdc" containerID="bfe54023e94fe2434c4e1c76acbd4f6a1cd0b3f0a5a82a87fd8b931a2f1901c9" exitCode=0
Mar 20 16:54:02 crc kubenswrapper[4730]: I0320 16:54:02.340957    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567094-5zljx" event={"ID":"a009be79-bc2f-45ca-94b9-f0da37a6abdc","Type":"ContainerDied","Data":"bfe54023e94fe2434c4e1c76acbd4f6a1cd0b3f0a5a82a87fd8b931a2f1901c9"}
Mar 20 16:54:03 crc kubenswrapper[4730]: I0320 16:54:03.718140    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567094-5zljx"
Mar 20 16:54:03 crc kubenswrapper[4730]: I0320 16:54:03.803961    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lm8m\" (UniqueName: \"kubernetes.io/projected/a009be79-bc2f-45ca-94b9-f0da37a6abdc-kube-api-access-2lm8m\") pod \"a009be79-bc2f-45ca-94b9-f0da37a6abdc\" (UID: \"a009be79-bc2f-45ca-94b9-f0da37a6abdc\") "
Mar 20 16:54:03 crc kubenswrapper[4730]: I0320 16:54:03.811712    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a009be79-bc2f-45ca-94b9-f0da37a6abdc-kube-api-access-2lm8m" (OuterVolumeSpecName: "kube-api-access-2lm8m") pod "a009be79-bc2f-45ca-94b9-f0da37a6abdc" (UID: "a009be79-bc2f-45ca-94b9-f0da37a6abdc"). InnerVolumeSpecName "kube-api-access-2lm8m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:54:03 crc kubenswrapper[4730]: I0320 16:54:03.907572    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lm8m\" (UniqueName: \"kubernetes.io/projected/a009be79-bc2f-45ca-94b9-f0da37a6abdc-kube-api-access-2lm8m\") on node \"crc\" DevicePath \"\""
Mar 20 16:54:04 crc kubenswrapper[4730]: I0320 16:54:04.364927    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567094-5zljx" event={"ID":"a009be79-bc2f-45ca-94b9-f0da37a6abdc","Type":"ContainerDied","Data":"2e3f3ff8766b4b70c26134bd30d13d680af318f12ab19620310fa8fbf92370ea"}
Mar 20 16:54:04 crc kubenswrapper[4730]: I0320 16:54:04.364963    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e3f3ff8766b4b70c26134bd30d13d680af318f12ab19620310fa8fbf92370ea"
Mar 20 16:54:04 crc kubenswrapper[4730]: I0320 16:54:04.365023    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567094-5zljx"
Mar 20 16:54:04 crc kubenswrapper[4730]: I0320 16:54:04.787667    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567088-jzf6l"]
Mar 20 16:54:04 crc kubenswrapper[4730]: I0320 16:54:04.796804    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567088-jzf6l"]
Mar 20 16:54:05 crc kubenswrapper[4730]: I0320 16:54:05.549377    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="423d9abb-9507-4e8e-aa00-42b3d34328ed" path="/var/lib/kubelet/pods/423d9abb-9507-4e8e-aa00-42b3d34328ed/volumes"
Mar 20 16:54:51 crc kubenswrapper[4730]: I0320 16:54:51.807191    4730 scope.go:117] "RemoveContainer" containerID="7b312e45c65bdab74120878c5bbe6f1323de4c86f0295b5909d9931e6d7a0af0"
Mar 20 16:55:42 crc kubenswrapper[4730]: I0320 16:55:42.882333    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:55:42 crc kubenswrapper[4730]: I0320 16:55:42.882951    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:56:00 crc kubenswrapper[4730]: I0320 16:56:00.140293    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567096-xvkcr"]
Mar 20 16:56:00 crc kubenswrapper[4730]: E0320 16:56:00.141108    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a009be79-bc2f-45ca-94b9-f0da37a6abdc" containerName="oc"
Mar 20 16:56:00 crc kubenswrapper[4730]: I0320 16:56:00.141120    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="a009be79-bc2f-45ca-94b9-f0da37a6abdc" containerName="oc"
Mar 20 16:56:00 crc kubenswrapper[4730]: I0320 16:56:00.141330    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="a009be79-bc2f-45ca-94b9-f0da37a6abdc" containerName="oc"
Mar 20 16:56:00 crc kubenswrapper[4730]: I0320 16:56:00.141964    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567096-xvkcr"
Mar 20 16:56:00 crc kubenswrapper[4730]: I0320 16:56:00.144234    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:56:00 crc kubenswrapper[4730]: I0320 16:56:00.145508    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 16:56:00 crc kubenswrapper[4730]: I0320 16:56:00.145862    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:56:00 crc kubenswrapper[4730]: I0320 16:56:00.149301    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567096-xvkcr"]
Mar 20 16:56:00 crc kubenswrapper[4730]: I0320 16:56:00.282180    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxdp8\" (UniqueName: \"kubernetes.io/projected/f85cd1ac-f48f-46a3-81e3-f82b73719cb1-kube-api-access-zxdp8\") pod \"auto-csr-approver-29567096-xvkcr\" (UID: \"f85cd1ac-f48f-46a3-81e3-f82b73719cb1\") " pod="openshift-infra/auto-csr-approver-29567096-xvkcr"
Mar 20 16:56:00 crc kubenswrapper[4730]: I0320 16:56:00.384228    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxdp8\" (UniqueName: \"kubernetes.io/projected/f85cd1ac-f48f-46a3-81e3-f82b73719cb1-kube-api-access-zxdp8\") pod \"auto-csr-approver-29567096-xvkcr\" (UID: \"f85cd1ac-f48f-46a3-81e3-f82b73719cb1\") " pod="openshift-infra/auto-csr-approver-29567096-xvkcr"
Mar 20 16:56:00 crc kubenswrapper[4730]: I0320 16:56:00.498762    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxdp8\" (UniqueName: \"kubernetes.io/projected/f85cd1ac-f48f-46a3-81e3-f82b73719cb1-kube-api-access-zxdp8\") pod \"auto-csr-approver-29567096-xvkcr\" (UID: \"f85cd1ac-f48f-46a3-81e3-f82b73719cb1\") " pod="openshift-infra/auto-csr-approver-29567096-xvkcr"
Mar 20 16:56:00 crc kubenswrapper[4730]: I0320 16:56:00.760610    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567096-xvkcr"
Mar 20 16:56:01 crc kubenswrapper[4730]: I0320 16:56:01.223486    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567096-xvkcr"]
Mar 20 16:56:01 crc kubenswrapper[4730]: I0320 16:56:01.645832    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567096-xvkcr" event={"ID":"f85cd1ac-f48f-46a3-81e3-f82b73719cb1","Type":"ContainerStarted","Data":"fc1a75c2059b3b96b5c3d8397b9f0be76a34830abeeee52f031a757c8923ae9f"}
Mar 20 16:56:02 crc kubenswrapper[4730]: I0320 16:56:02.656338    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567096-xvkcr" event={"ID":"f85cd1ac-f48f-46a3-81e3-f82b73719cb1","Type":"ContainerStarted","Data":"ad05ce67547cd2c7c1cc69cf885883fc4049286efb953f0b2d8378cbd56d924f"}
Mar 20 16:56:02 crc kubenswrapper[4730]: I0320 16:56:02.682316    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567096-xvkcr" podStartSLOduration=1.574278129 podStartE2EDuration="2.682289862s" podCreationTimestamp="2026-03-20 16:56:00 +0000 UTC" firstStartedPulling="2026-03-20 16:56:01.244894208 +0000 UTC m=+4620.458265577" lastFinishedPulling="2026-03-20 16:56:02.352905941 +0000 UTC m=+4621.566277310" observedRunningTime="2026-03-20 16:56:02.670707783 +0000 UTC m=+4621.884079142" watchObservedRunningTime="2026-03-20 16:56:02.682289862 +0000 UTC m=+4621.895661241"
Mar 20 16:56:03 crc kubenswrapper[4730]: I0320 16:56:03.669001    4730 generic.go:334] "Generic (PLEG): container finished" podID="f85cd1ac-f48f-46a3-81e3-f82b73719cb1" containerID="ad05ce67547cd2c7c1cc69cf885883fc4049286efb953f0b2d8378cbd56d924f" exitCode=0
Mar 20 16:56:03 crc kubenswrapper[4730]: I0320 16:56:03.669424    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567096-xvkcr" event={"ID":"f85cd1ac-f48f-46a3-81e3-f82b73719cb1","Type":"ContainerDied","Data":"ad05ce67547cd2c7c1cc69cf885883fc4049286efb953f0b2d8378cbd56d924f"}
Mar 20 16:56:05 crc kubenswrapper[4730]: I0320 16:56:05.127085    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567096-xvkcr"
Mar 20 16:56:05 crc kubenswrapper[4730]: I0320 16:56:05.308032    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxdp8\" (UniqueName: \"kubernetes.io/projected/f85cd1ac-f48f-46a3-81e3-f82b73719cb1-kube-api-access-zxdp8\") pod \"f85cd1ac-f48f-46a3-81e3-f82b73719cb1\" (UID: \"f85cd1ac-f48f-46a3-81e3-f82b73719cb1\") "
Mar 20 16:56:05 crc kubenswrapper[4730]: I0320 16:56:05.334409    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f85cd1ac-f48f-46a3-81e3-f82b73719cb1-kube-api-access-zxdp8" (OuterVolumeSpecName: "kube-api-access-zxdp8") pod "f85cd1ac-f48f-46a3-81e3-f82b73719cb1" (UID: "f85cd1ac-f48f-46a3-81e3-f82b73719cb1"). InnerVolumeSpecName "kube-api-access-zxdp8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:56:05 crc kubenswrapper[4730]: I0320 16:56:05.410843    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxdp8\" (UniqueName: \"kubernetes.io/projected/f85cd1ac-f48f-46a3-81e3-f82b73719cb1-kube-api-access-zxdp8\") on node \"crc\" DevicePath \"\""
Mar 20 16:56:05 crc kubenswrapper[4730]: I0320 16:56:05.691024    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567096-xvkcr" event={"ID":"f85cd1ac-f48f-46a3-81e3-f82b73719cb1","Type":"ContainerDied","Data":"fc1a75c2059b3b96b5c3d8397b9f0be76a34830abeeee52f031a757c8923ae9f"}
Mar 20 16:56:05 crc kubenswrapper[4730]: I0320 16:56:05.691058    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc1a75c2059b3b96b5c3d8397b9f0be76a34830abeeee52f031a757c8923ae9f"
Mar 20 16:56:05 crc kubenswrapper[4730]: I0320 16:56:05.691127    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567096-xvkcr"
Mar 20 16:56:05 crc kubenswrapper[4730]: I0320 16:56:05.753593    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567090-775ds"]
Mar 20 16:56:05 crc kubenswrapper[4730]: I0320 16:56:05.762485    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567090-775ds"]
Mar 20 16:56:07 crc kubenswrapper[4730]: I0320 16:56:07.559556    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="347b4579-1fb8-49d0-97df-83dafdafae60" path="/var/lib/kubelet/pods/347b4579-1fb8-49d0-97df-83dafdafae60/volumes"
Mar 20 16:56:08 crc kubenswrapper[4730]: I0320 16:56:08.850922    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jch7c"]
Mar 20 16:56:08 crc kubenswrapper[4730]: E0320 16:56:08.852089    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85cd1ac-f48f-46a3-81e3-f82b73719cb1" containerName="oc"
Mar 20 16:56:08 crc kubenswrapper[4730]: I0320 16:56:08.852106    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85cd1ac-f48f-46a3-81e3-f82b73719cb1" containerName="oc"
Mar 20 16:56:08 crc kubenswrapper[4730]: I0320 16:56:08.852446    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85cd1ac-f48f-46a3-81e3-f82b73719cb1" containerName="oc"
Mar 20 16:56:08 crc kubenswrapper[4730]: I0320 16:56:08.854347    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jch7c"
Mar 20 16:56:08 crc kubenswrapper[4730]: I0320 16:56:08.862704    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jch7c"]
Mar 20 16:56:08 crc kubenswrapper[4730]: I0320 16:56:08.993576    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdxsm\" (UniqueName: \"kubernetes.io/projected/8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202-kube-api-access-rdxsm\") pod \"redhat-marketplace-jch7c\" (UID: \"8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202\") " pod="openshift-marketplace/redhat-marketplace-jch7c"
Mar 20 16:56:08 crc kubenswrapper[4730]: I0320 16:56:08.993681    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202-utilities\") pod \"redhat-marketplace-jch7c\" (UID: \"8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202\") " pod="openshift-marketplace/redhat-marketplace-jch7c"
Mar 20 16:56:08 crc kubenswrapper[4730]: I0320 16:56:08.993752    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202-catalog-content\") pod \"redhat-marketplace-jch7c\" (UID: \"8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202\") " pod="openshift-marketplace/redhat-marketplace-jch7c"
Mar 20 16:56:09 crc kubenswrapper[4730]: I0320 16:56:09.096094    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202-catalog-content\") pod \"redhat-marketplace-jch7c\" (UID: \"8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202\") " pod="openshift-marketplace/redhat-marketplace-jch7c"
Mar 20 16:56:09 crc kubenswrapper[4730]: I0320 16:56:09.096204    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdxsm\" (UniqueName: \"kubernetes.io/projected/8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202-kube-api-access-rdxsm\") pod \"redhat-marketplace-jch7c\" (UID: \"8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202\") " pod="openshift-marketplace/redhat-marketplace-jch7c"
Mar 20 16:56:09 crc kubenswrapper[4730]: I0320 16:56:09.096336    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202-utilities\") pod \"redhat-marketplace-jch7c\" (UID: \"8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202\") " pod="openshift-marketplace/redhat-marketplace-jch7c"
Mar 20 16:56:09 crc kubenswrapper[4730]: I0320 16:56:09.096927    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202-catalog-content\") pod \"redhat-marketplace-jch7c\" (UID: \"8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202\") " pod="openshift-marketplace/redhat-marketplace-jch7c"
Mar 20 16:56:09 crc kubenswrapper[4730]: I0320 16:56:09.096966    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202-utilities\") pod \"redhat-marketplace-jch7c\" (UID: \"8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202\") " pod="openshift-marketplace/redhat-marketplace-jch7c"
Mar 20 16:56:09 crc kubenswrapper[4730]: I0320 16:56:09.149655    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdxsm\" (UniqueName: \"kubernetes.io/projected/8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202-kube-api-access-rdxsm\") pod \"redhat-marketplace-jch7c\" (UID: \"8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202\") " pod="openshift-marketplace/redhat-marketplace-jch7c"
Mar 20 16:56:09 crc kubenswrapper[4730]: I0320 16:56:09.188613    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jch7c"
Mar 20 16:56:09 crc kubenswrapper[4730]: I0320 16:56:09.710949    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jch7c"]
Mar 20 16:56:09 crc kubenswrapper[4730]: I0320 16:56:09.730582    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jch7c" event={"ID":"8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202","Type":"ContainerStarted","Data":"76c65f9a3b0e0d371d9088499eb8c2a8e4892c80999af850b2c400a7c2669c0c"}
Mar 20 16:56:10 crc kubenswrapper[4730]: I0320 16:56:10.744582    4730 generic.go:334] "Generic (PLEG): container finished" podID="8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202" containerID="4294471d41f464796f65094c95dd7a1856ebd4574db318d0dc4012ad1596dd16" exitCode=0
Mar 20 16:56:10 crc kubenswrapper[4730]: I0320 16:56:10.744673    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jch7c" event={"ID":"8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202","Type":"ContainerDied","Data":"4294471d41f464796f65094c95dd7a1856ebd4574db318d0dc4012ad1596dd16"}
Mar 20 16:56:12 crc kubenswrapper[4730]: I0320 16:56:12.765726    4730 generic.go:334] "Generic (PLEG): container finished" podID="8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202" containerID="c19f34ef26b5cfdd9a433322da9f8c1935c8cb8dfc879427725cad25c242ae4b" exitCode=0
Mar 20 16:56:12 crc kubenswrapper[4730]: I0320 16:56:12.765944    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jch7c" event={"ID":"8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202","Type":"ContainerDied","Data":"c19f34ef26b5cfdd9a433322da9f8c1935c8cb8dfc879427725cad25c242ae4b"}
Mar 20 16:56:12 crc kubenswrapper[4730]: I0320 16:56:12.880616    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:56:12 crc kubenswrapper[4730]: I0320 16:56:12.881044    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:56:14 crc kubenswrapper[4730]: I0320 16:56:14.817513    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jch7c" event={"ID":"8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202","Type":"ContainerStarted","Data":"0accd96625b6010b8f1ed489158a726d6acd23541ee8c36820a6f7d75e099aa5"}
Mar 20 16:56:14 crc kubenswrapper[4730]: I0320 16:56:14.837601    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jch7c" podStartSLOduration=4.244348059 podStartE2EDuration="6.837580074s" podCreationTimestamp="2026-03-20 16:56:08 +0000 UTC" firstStartedPulling="2026-03-20 16:56:10.747952259 +0000 UTC m=+4629.961323628" lastFinishedPulling="2026-03-20 16:56:13.341184234 +0000 UTC m=+4632.554555643" observedRunningTime="2026-03-20 16:56:14.834184028 +0000 UTC m=+4634.047555407" watchObservedRunningTime="2026-03-20 16:56:14.837580074 +0000 UTC m=+4634.050951443"
Mar 20 16:56:19 crc kubenswrapper[4730]: I0320 16:56:19.189273    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jch7c"
Mar 20 16:56:19 crc kubenswrapper[4730]: I0320 16:56:19.189927    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jch7c"
Mar 20 16:56:19 crc kubenswrapper[4730]: I0320 16:56:19.235666    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jch7c"
Mar 20 16:56:19 crc kubenswrapper[4730]: I0320 16:56:19.949444    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jch7c"
Mar 20 16:56:20 crc kubenswrapper[4730]: I0320 16:56:20.027428    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jch7c"]
Mar 20 16:56:21 crc kubenswrapper[4730]: I0320 16:56:21.898574    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jch7c" podUID="8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202" containerName="registry-server" containerID="cri-o://0accd96625b6010b8f1ed489158a726d6acd23541ee8c36820a6f7d75e099aa5" gracePeriod=2
Mar 20 16:56:22 crc kubenswrapper[4730]: I0320 16:56:22.909664    4730 generic.go:334] "Generic (PLEG): container finished" podID="8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202" containerID="0accd96625b6010b8f1ed489158a726d6acd23541ee8c36820a6f7d75e099aa5" exitCode=0
Mar 20 16:56:22 crc kubenswrapper[4730]: I0320 16:56:22.909700    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jch7c" event={"ID":"8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202","Type":"ContainerDied","Data":"0accd96625b6010b8f1ed489158a726d6acd23541ee8c36820a6f7d75e099aa5"}
Mar 20 16:56:23 crc kubenswrapper[4730]: I0320 16:56:23.235354    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jch7c"
Mar 20 16:56:23 crc kubenswrapper[4730]: I0320 16:56:23.432909    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202-utilities\") pod \"8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202\" (UID: \"8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202\") "
Mar 20 16:56:23 crc kubenswrapper[4730]: I0320 16:56:23.433025    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202-catalog-content\") pod \"8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202\" (UID: \"8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202\") "
Mar 20 16:56:23 crc kubenswrapper[4730]: I0320 16:56:23.433145    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdxsm\" (UniqueName: \"kubernetes.io/projected/8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202-kube-api-access-rdxsm\") pod \"8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202\" (UID: \"8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202\") "
Mar 20 16:56:23 crc kubenswrapper[4730]: I0320 16:56:23.435018    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202-utilities" (OuterVolumeSpecName: "utilities") pod "8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202" (UID: "8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:56:23 crc kubenswrapper[4730]: I0320 16:56:23.438518    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202-kube-api-access-rdxsm" (OuterVolumeSpecName: "kube-api-access-rdxsm") pod "8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202" (UID: "8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202"). InnerVolumeSpecName "kube-api-access-rdxsm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:56:23 crc kubenswrapper[4730]: I0320 16:56:23.458645    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202" (UID: "8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:56:23 crc kubenswrapper[4730]: I0320 16:56:23.534859    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdxsm\" (UniqueName: \"kubernetes.io/projected/8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202-kube-api-access-rdxsm\") on node \"crc\" DevicePath \"\""
Mar 20 16:56:23 crc kubenswrapper[4730]: I0320 16:56:23.534894    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:56:23 crc kubenswrapper[4730]: I0320 16:56:23.534908    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 16:56:23 crc kubenswrapper[4730]: I0320 16:56:23.926479    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jch7c" event={"ID":"8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202","Type":"ContainerDied","Data":"76c65f9a3b0e0d371d9088499eb8c2a8e4892c80999af850b2c400a7c2669c0c"}
Mar 20 16:56:23 crc kubenswrapper[4730]: I0320 16:56:23.926882    4730 scope.go:117] "RemoveContainer" containerID="0accd96625b6010b8f1ed489158a726d6acd23541ee8c36820a6f7d75e099aa5"
Mar 20 16:56:23 crc kubenswrapper[4730]: I0320 16:56:23.926660    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jch7c"
Mar 20 16:56:23 crc kubenswrapper[4730]: I0320 16:56:23.980407    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jch7c"]
Mar 20 16:56:23 crc kubenswrapper[4730]: I0320 16:56:23.980918    4730 scope.go:117] "RemoveContainer" containerID="c19f34ef26b5cfdd9a433322da9f8c1935c8cb8dfc879427725cad25c242ae4b"
Mar 20 16:56:24 crc kubenswrapper[4730]: I0320 16:56:24.003318    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jch7c"]
Mar 20 16:56:24 crc kubenswrapper[4730]: I0320 16:56:24.013973    4730 scope.go:117] "RemoveContainer" containerID="4294471d41f464796f65094c95dd7a1856ebd4574db318d0dc4012ad1596dd16"
Mar 20 16:56:25 crc kubenswrapper[4730]: I0320 16:56:25.549697    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202" path="/var/lib/kubelet/pods/8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202/volumes"
Mar 20 16:56:42 crc kubenswrapper[4730]: I0320 16:56:42.880154    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 16:56:42 crc kubenswrapper[4730]: I0320 16:56:42.880685    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 16:56:42 crc kubenswrapper[4730]: I0320 16:56:42.880730    4730 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf"
Mar 20 16:56:42 crc kubenswrapper[4730]: I0320 16:56:42.881517    4730 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"96fa1a39363ce580a7d7ba62fa4f551be70b7f98a5cbd49713d8269774ccd8b9"} pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 16:56:42 crc kubenswrapper[4730]: I0320 16:56:42.881568    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" containerID="cri-o://96fa1a39363ce580a7d7ba62fa4f551be70b7f98a5cbd49713d8269774ccd8b9" gracePeriod=600
Mar 20 16:56:43 crc kubenswrapper[4730]: E0320 16:56:43.006923    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:56:43 crc kubenswrapper[4730]: I0320 16:56:43.114490    4730 generic.go:334] "Generic (PLEG): container finished" podID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerID="96fa1a39363ce580a7d7ba62fa4f551be70b7f98a5cbd49713d8269774ccd8b9" exitCode=0
Mar 20 16:56:43 crc kubenswrapper[4730]: I0320 16:56:43.114541    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerDied","Data":"96fa1a39363ce580a7d7ba62fa4f551be70b7f98a5cbd49713d8269774ccd8b9"}
Mar 20 16:56:43 crc kubenswrapper[4730]: I0320 16:56:43.114585    4730 scope.go:117] "RemoveContainer" containerID="1911f50d0c8e9a77c325e3c9daab7021dcd8581ac14223ba2443e24f00e39603"
Mar 20 16:56:43 crc kubenswrapper[4730]: I0320 16:56:43.114996    4730 scope.go:117] "RemoveContainer" containerID="96fa1a39363ce580a7d7ba62fa4f551be70b7f98a5cbd49713d8269774ccd8b9"
Mar 20 16:56:43 crc kubenswrapper[4730]: E0320 16:56:43.115312    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:56:51 crc kubenswrapper[4730]: I0320 16:56:51.911527    4730 scope.go:117] "RemoveContainer" containerID="1b59cf8fa45b394473fc97dca5b148ed197f64170983d43d46fd520ecdca6208"
Mar 20 16:56:55 crc kubenswrapper[4730]: I0320 16:56:55.534572    4730 scope.go:117] "RemoveContainer" containerID="96fa1a39363ce580a7d7ba62fa4f551be70b7f98a5cbd49713d8269774ccd8b9"
Mar 20 16:56:55 crc kubenswrapper[4730]: E0320 16:56:55.535452    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:57:07 crc kubenswrapper[4730]: I0320 16:57:07.533773    4730 scope.go:117] "RemoveContainer" containerID="96fa1a39363ce580a7d7ba62fa4f551be70b7f98a5cbd49713d8269774ccd8b9"
Mar 20 16:57:07 crc kubenswrapper[4730]: E0320 16:57:07.534514    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:57:19 crc kubenswrapper[4730]: I0320 16:57:19.533407    4730 scope.go:117] "RemoveContainer" containerID="96fa1a39363ce580a7d7ba62fa4f551be70b7f98a5cbd49713d8269774ccd8b9"
Mar 20 16:57:19 crc kubenswrapper[4730]: E0320 16:57:19.534349    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:57:33 crc kubenswrapper[4730]: I0320 16:57:33.533506    4730 scope.go:117] "RemoveContainer" containerID="96fa1a39363ce580a7d7ba62fa4f551be70b7f98a5cbd49713d8269774ccd8b9"
Mar 20 16:57:33 crc kubenswrapper[4730]: E0320 16:57:33.534306    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:57:45 crc kubenswrapper[4730]: I0320 16:57:45.533753    4730 scope.go:117] "RemoveContainer" containerID="96fa1a39363ce580a7d7ba62fa4f551be70b7f98a5cbd49713d8269774ccd8b9"
Mar 20 16:57:45 crc kubenswrapper[4730]: E0320 16:57:45.534946    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:57:59 crc kubenswrapper[4730]: I0320 16:57:59.534063    4730 scope.go:117] "RemoveContainer" containerID="96fa1a39363ce580a7d7ba62fa4f551be70b7f98a5cbd49713d8269774ccd8b9"
Mar 20 16:57:59 crc kubenswrapper[4730]: E0320 16:57:59.535109    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:58:00 crc kubenswrapper[4730]: I0320 16:58:00.151496    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567098-4wf7k"]
Mar 20 16:58:00 crc kubenswrapper[4730]: E0320 16:58:00.151961    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202" containerName="extract-utilities"
Mar 20 16:58:00 crc kubenswrapper[4730]: I0320 16:58:00.151983    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202" containerName="extract-utilities"
Mar 20 16:58:00 crc kubenswrapper[4730]: E0320 16:58:00.152016    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202" containerName="extract-content"
Mar 20 16:58:00 crc kubenswrapper[4730]: I0320 16:58:00.152025    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202" containerName="extract-content"
Mar 20 16:58:00 crc kubenswrapper[4730]: E0320 16:58:00.152047    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202" containerName="registry-server"
Mar 20 16:58:00 crc kubenswrapper[4730]: I0320 16:58:00.152059    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202" containerName="registry-server"
Mar 20 16:58:00 crc kubenswrapper[4730]: I0320 16:58:00.152298    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e8e0ea6-d9fa-4ce9-b5a9-f9ae60c75202" containerName="registry-server"
Mar 20 16:58:00 crc kubenswrapper[4730]: I0320 16:58:00.153096    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567098-4wf7k"
Mar 20 16:58:00 crc kubenswrapper[4730]: I0320 16:58:00.157749    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 16:58:00 crc kubenswrapper[4730]: I0320 16:58:00.158608    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 16:58:00 crc kubenswrapper[4730]: I0320 16:58:00.158652    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 16:58:00 crc kubenswrapper[4730]: I0320 16:58:00.170262    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567098-4wf7k"]
Mar 20 16:58:00 crc kubenswrapper[4730]: I0320 16:58:00.289498    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgfzx\" (UniqueName: \"kubernetes.io/projected/a8d7ea06-69cc-41b0-afc3-fb5f3e55049b-kube-api-access-bgfzx\") pod \"auto-csr-approver-29567098-4wf7k\" (UID: \"a8d7ea06-69cc-41b0-afc3-fb5f3e55049b\") " pod="openshift-infra/auto-csr-approver-29567098-4wf7k"
Mar 20 16:58:00 crc kubenswrapper[4730]: I0320 16:58:00.392188    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgfzx\" (UniqueName: \"kubernetes.io/projected/a8d7ea06-69cc-41b0-afc3-fb5f3e55049b-kube-api-access-bgfzx\") pod \"auto-csr-approver-29567098-4wf7k\" (UID: \"a8d7ea06-69cc-41b0-afc3-fb5f3e55049b\") " pod="openshift-infra/auto-csr-approver-29567098-4wf7k"
Mar 20 16:58:00 crc kubenswrapper[4730]: I0320 16:58:00.414976    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgfzx\" (UniqueName: \"kubernetes.io/projected/a8d7ea06-69cc-41b0-afc3-fb5f3e55049b-kube-api-access-bgfzx\") pod \"auto-csr-approver-29567098-4wf7k\" (UID: \"a8d7ea06-69cc-41b0-afc3-fb5f3e55049b\") " pod="openshift-infra/auto-csr-approver-29567098-4wf7k"
Mar 20 16:58:00 crc kubenswrapper[4730]: I0320 16:58:00.480604    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567098-4wf7k"
Mar 20 16:58:00 crc kubenswrapper[4730]: I0320 16:58:00.959186    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567098-4wf7k"]
Mar 20 16:58:00 crc kubenswrapper[4730]: I0320 16:58:00.962441    4730 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 16:58:01 crc kubenswrapper[4730]: I0320 16:58:01.958377    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567098-4wf7k" event={"ID":"a8d7ea06-69cc-41b0-afc3-fb5f3e55049b","Type":"ContainerStarted","Data":"fb0cbff14033ddab7ef9c034ebc0a695ee494411468e24a895e3a77bb50c4501"}
Mar 20 16:58:02 crc kubenswrapper[4730]: I0320 16:58:02.972097    4730 generic.go:334] "Generic (PLEG): container finished" podID="a8d7ea06-69cc-41b0-afc3-fb5f3e55049b" containerID="7caef3ce3ae643ea42e6304dc53d81d43ac8e7cc2d51fc8c56b8771cdad2f656" exitCode=0
Mar 20 16:58:02 crc kubenswrapper[4730]: I0320 16:58:02.972198    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567098-4wf7k" event={"ID":"a8d7ea06-69cc-41b0-afc3-fb5f3e55049b","Type":"ContainerDied","Data":"7caef3ce3ae643ea42e6304dc53d81d43ac8e7cc2d51fc8c56b8771cdad2f656"}
Mar 20 16:58:04 crc kubenswrapper[4730]: I0320 16:58:04.378314    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567098-4wf7k"
Mar 20 16:58:04 crc kubenswrapper[4730]: I0320 16:58:04.488611    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgfzx\" (UniqueName: \"kubernetes.io/projected/a8d7ea06-69cc-41b0-afc3-fb5f3e55049b-kube-api-access-bgfzx\") pod \"a8d7ea06-69cc-41b0-afc3-fb5f3e55049b\" (UID: \"a8d7ea06-69cc-41b0-afc3-fb5f3e55049b\") "
Mar 20 16:58:04 crc kubenswrapper[4730]: I0320 16:58:04.498697    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8d7ea06-69cc-41b0-afc3-fb5f3e55049b-kube-api-access-bgfzx" (OuterVolumeSpecName: "kube-api-access-bgfzx") pod "a8d7ea06-69cc-41b0-afc3-fb5f3e55049b" (UID: "a8d7ea06-69cc-41b0-afc3-fb5f3e55049b"). InnerVolumeSpecName "kube-api-access-bgfzx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:58:04 crc kubenswrapper[4730]: I0320 16:58:04.591602    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgfzx\" (UniqueName: \"kubernetes.io/projected/a8d7ea06-69cc-41b0-afc3-fb5f3e55049b-kube-api-access-bgfzx\") on node \"crc\" DevicePath \"\""
Mar 20 16:58:04 crc kubenswrapper[4730]: I0320 16:58:04.991918    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567098-4wf7k" event={"ID":"a8d7ea06-69cc-41b0-afc3-fb5f3e55049b","Type":"ContainerDied","Data":"fb0cbff14033ddab7ef9c034ebc0a695ee494411468e24a895e3a77bb50c4501"}
Mar 20 16:58:04 crc kubenswrapper[4730]: I0320 16:58:04.992211    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb0cbff14033ddab7ef9c034ebc0a695ee494411468e24a895e3a77bb50c4501"
Mar 20 16:58:04 crc kubenswrapper[4730]: I0320 16:58:04.991947    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567098-4wf7k"
Mar 20 16:58:05 crc kubenswrapper[4730]: I0320 16:58:05.446929    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567092-v8dfz"]
Mar 20 16:58:05 crc kubenswrapper[4730]: I0320 16:58:05.456518    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567092-v8dfz"]
Mar 20 16:58:05 crc kubenswrapper[4730]: I0320 16:58:05.546537    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31e0f6b1-5405-420b-941e-4e711281673f" path="/var/lib/kubelet/pods/31e0f6b1-5405-420b-941e-4e711281673f/volumes"
Mar 20 16:58:14 crc kubenswrapper[4730]: I0320 16:58:14.532696    4730 scope.go:117] "RemoveContainer" containerID="96fa1a39363ce580a7d7ba62fa4f551be70b7f98a5cbd49713d8269774ccd8b9"
Mar 20 16:58:14 crc kubenswrapper[4730]: E0320 16:58:14.533346    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:58:27 crc kubenswrapper[4730]: I0320 16:58:27.533839    4730 scope.go:117] "RemoveContainer" containerID="96fa1a39363ce580a7d7ba62fa4f551be70b7f98a5cbd49713d8269774ccd8b9"
Mar 20 16:58:27 crc kubenswrapper[4730]: E0320 16:58:27.534649    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:58:39 crc kubenswrapper[4730]: I0320 16:58:39.533373    4730 scope.go:117] "RemoveContainer" containerID="96fa1a39363ce580a7d7ba62fa4f551be70b7f98a5cbd49713d8269774ccd8b9"
Mar 20 16:58:39 crc kubenswrapper[4730]: E0320 16:58:39.535325    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:58:50 crc kubenswrapper[4730]: I0320 16:58:50.351489    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cft2p"]
Mar 20 16:58:50 crc kubenswrapper[4730]: E0320 16:58:50.353527    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8d7ea06-69cc-41b0-afc3-fb5f3e55049b" containerName="oc"
Mar 20 16:58:50 crc kubenswrapper[4730]: I0320 16:58:50.353550    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8d7ea06-69cc-41b0-afc3-fb5f3e55049b" containerName="oc"
Mar 20 16:58:50 crc kubenswrapper[4730]: I0320 16:58:50.353762    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8d7ea06-69cc-41b0-afc3-fb5f3e55049b" containerName="oc"
Mar 20 16:58:50 crc kubenswrapper[4730]: I0320 16:58:50.355176    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cft2p"
Mar 20 16:58:50 crc kubenswrapper[4730]: I0320 16:58:50.379102    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cft2p"]
Mar 20 16:58:50 crc kubenswrapper[4730]: I0320 16:58:50.481720    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a16e21d-182d-4c13-9089-49aceb2bf64e-catalog-content\") pod \"redhat-operators-cft2p\" (UID: \"7a16e21d-182d-4c13-9089-49aceb2bf64e\") " pod="openshift-marketplace/redhat-operators-cft2p"
Mar 20 16:58:50 crc kubenswrapper[4730]: I0320 16:58:50.482499    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a16e21d-182d-4c13-9089-49aceb2bf64e-utilities\") pod \"redhat-operators-cft2p\" (UID: \"7a16e21d-182d-4c13-9089-49aceb2bf64e\") " pod="openshift-marketplace/redhat-operators-cft2p"
Mar 20 16:58:50 crc kubenswrapper[4730]: I0320 16:58:50.482630    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlklg\" (UniqueName: \"kubernetes.io/projected/7a16e21d-182d-4c13-9089-49aceb2bf64e-kube-api-access-jlklg\") pod \"redhat-operators-cft2p\" (UID: \"7a16e21d-182d-4c13-9089-49aceb2bf64e\") " pod="openshift-marketplace/redhat-operators-cft2p"
Mar 20 16:58:50 crc kubenswrapper[4730]: I0320 16:58:50.586522    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a16e21d-182d-4c13-9089-49aceb2bf64e-utilities\") pod \"redhat-operators-cft2p\" (UID: \"7a16e21d-182d-4c13-9089-49aceb2bf64e\") " pod="openshift-marketplace/redhat-operators-cft2p"
Mar 20 16:58:50 crc kubenswrapper[4730]: I0320 16:58:50.586970    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlklg\" (UniqueName: \"kubernetes.io/projected/7a16e21d-182d-4c13-9089-49aceb2bf64e-kube-api-access-jlklg\") pod \"redhat-operators-cft2p\" (UID: \"7a16e21d-182d-4c13-9089-49aceb2bf64e\") " pod="openshift-marketplace/redhat-operators-cft2p"
Mar 20 16:58:50 crc kubenswrapper[4730]: I0320 16:58:50.587232    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a16e21d-182d-4c13-9089-49aceb2bf64e-catalog-content\") pod \"redhat-operators-cft2p\" (UID: \"7a16e21d-182d-4c13-9089-49aceb2bf64e\") " pod="openshift-marketplace/redhat-operators-cft2p"
Mar 20 16:58:50 crc kubenswrapper[4730]: I0320 16:58:50.587233    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a16e21d-182d-4c13-9089-49aceb2bf64e-utilities\") pod \"redhat-operators-cft2p\" (UID: \"7a16e21d-182d-4c13-9089-49aceb2bf64e\") " pod="openshift-marketplace/redhat-operators-cft2p"
Mar 20 16:58:50 crc kubenswrapper[4730]: I0320 16:58:50.587825    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a16e21d-182d-4c13-9089-49aceb2bf64e-catalog-content\") pod \"redhat-operators-cft2p\" (UID: \"7a16e21d-182d-4c13-9089-49aceb2bf64e\") " pod="openshift-marketplace/redhat-operators-cft2p"
Mar 20 16:58:50 crc kubenswrapper[4730]: I0320 16:58:50.615955    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlklg\" (UniqueName: \"kubernetes.io/projected/7a16e21d-182d-4c13-9089-49aceb2bf64e-kube-api-access-jlklg\") pod \"redhat-operators-cft2p\" (UID: \"7a16e21d-182d-4c13-9089-49aceb2bf64e\") " pod="openshift-marketplace/redhat-operators-cft2p"
Mar 20 16:58:50 crc kubenswrapper[4730]: I0320 16:58:50.691298    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cft2p"
Mar 20 16:58:51 crc kubenswrapper[4730]: I0320 16:58:51.209638    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cft2p"]
Mar 20 16:58:51 crc kubenswrapper[4730]: I0320 16:58:51.486108    4730 generic.go:334] "Generic (PLEG): container finished" podID="7a16e21d-182d-4c13-9089-49aceb2bf64e" containerID="1ecc4f0c91882002fa7f0d51f0cc4607743264255f04b66356208c63914a4c04" exitCode=0
Mar 20 16:58:51 crc kubenswrapper[4730]: I0320 16:58:51.486228    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cft2p" event={"ID":"7a16e21d-182d-4c13-9089-49aceb2bf64e","Type":"ContainerDied","Data":"1ecc4f0c91882002fa7f0d51f0cc4607743264255f04b66356208c63914a4c04"}
Mar 20 16:58:51 crc kubenswrapper[4730]: I0320 16:58:51.486445    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cft2p" event={"ID":"7a16e21d-182d-4c13-9089-49aceb2bf64e","Type":"ContainerStarted","Data":"3b704111a62172b53d35ce955ecd8ad0d92ece2f9021e12eea289f8e3c5e40aa"}
Mar 20 16:58:52 crc kubenswrapper[4730]: I0320 16:58:52.029845    4730 scope.go:117] "RemoveContainer" containerID="ab6be6d2c86b64d8fa302dbb64f113f58f10be72cfc3ad77f609318442cc34dd"
Mar 20 16:58:52 crc kubenswrapper[4730]: I0320 16:58:52.536879    4730 scope.go:117] "RemoveContainer" containerID="96fa1a39363ce580a7d7ba62fa4f551be70b7f98a5cbd49713d8269774ccd8b9"
Mar 20 16:58:52 crc kubenswrapper[4730]: E0320 16:58:52.537632    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:58:52 crc kubenswrapper[4730]: I0320 16:58:52.543643    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cft2p" event={"ID":"7a16e21d-182d-4c13-9089-49aceb2bf64e","Type":"ContainerStarted","Data":"fa40729ee31ad53435997da4a543c920ca7bb80d17bc2b12c39cd8960d138b6c"}
Mar 20 16:58:57 crc kubenswrapper[4730]: I0320 16:58:57.595681    4730 generic.go:334] "Generic (PLEG): container finished" podID="7a16e21d-182d-4c13-9089-49aceb2bf64e" containerID="fa40729ee31ad53435997da4a543c920ca7bb80d17bc2b12c39cd8960d138b6c" exitCode=0
Mar 20 16:58:57 crc kubenswrapper[4730]: I0320 16:58:57.595758    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cft2p" event={"ID":"7a16e21d-182d-4c13-9089-49aceb2bf64e","Type":"ContainerDied","Data":"fa40729ee31ad53435997da4a543c920ca7bb80d17bc2b12c39cd8960d138b6c"}
Mar 20 16:58:59 crc kubenswrapper[4730]: I0320 16:58:59.623672    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cft2p" event={"ID":"7a16e21d-182d-4c13-9089-49aceb2bf64e","Type":"ContainerStarted","Data":"7bb7ee354edd6a6ac56e90b66e1c20c758e33122d0f1e93edf3cfc7f257c473d"}
Mar 20 16:58:59 crc kubenswrapper[4730]: I0320 16:58:59.646371    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cft2p" podStartSLOduration=1.8380339719999998 podStartE2EDuration="9.646355885s" podCreationTimestamp="2026-03-20 16:58:50 +0000 UTC" firstStartedPulling="2026-03-20 16:58:51.488937472 +0000 UTC m=+4790.702308841" lastFinishedPulling="2026-03-20 16:58:59.297259385 +0000 UTC m=+4798.510630754" observedRunningTime="2026-03-20 16:58:59.642545307 +0000 UTC m=+4798.855916676" watchObservedRunningTime="2026-03-20 16:58:59.646355885 +0000 UTC m=+4798.859727254"
Mar 20 16:59:00 crc kubenswrapper[4730]: I0320 16:59:00.691684    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cft2p"
Mar 20 16:59:00 crc kubenswrapper[4730]: I0320 16:59:00.691763    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cft2p"
Mar 20 16:59:01 crc kubenswrapper[4730]: I0320 16:59:01.746373    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cft2p" podUID="7a16e21d-182d-4c13-9089-49aceb2bf64e" containerName="registry-server" probeResult="failure" output=<
Mar 20 16:59:01 crc kubenswrapper[4730]:         timeout: failed to connect service ":50051" within 1s
Mar 20 16:59:01 crc kubenswrapper[4730]:  >
Mar 20 16:59:06 crc kubenswrapper[4730]: I0320 16:59:06.533884    4730 scope.go:117] "RemoveContainer" containerID="96fa1a39363ce580a7d7ba62fa4f551be70b7f98a5cbd49713d8269774ccd8b9"
Mar 20 16:59:06 crc kubenswrapper[4730]: E0320 16:59:06.535192    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:59:11 crc kubenswrapper[4730]: I0320 16:59:11.758126    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cft2p" podUID="7a16e21d-182d-4c13-9089-49aceb2bf64e" containerName="registry-server" probeResult="failure" output=<
Mar 20 16:59:11 crc kubenswrapper[4730]:         timeout: failed to connect service ":50051" within 1s
Mar 20 16:59:11 crc kubenswrapper[4730]:  >
Mar 20 16:59:18 crc kubenswrapper[4730]: I0320 16:59:18.534225    4730 scope.go:117] "RemoveContainer" containerID="96fa1a39363ce580a7d7ba62fa4f551be70b7f98a5cbd49713d8269774ccd8b9"
Mar 20 16:59:18 crc kubenswrapper[4730]: E0320 16:59:18.535121    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:59:21 crc kubenswrapper[4730]: I0320 16:59:21.746099    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cft2p" podUID="7a16e21d-182d-4c13-9089-49aceb2bf64e" containerName="registry-server" probeResult="failure" output=<
Mar 20 16:59:21 crc kubenswrapper[4730]:         timeout: failed to connect service ":50051" within 1s
Mar 20 16:59:21 crc kubenswrapper[4730]:  >
Mar 20 16:59:30 crc kubenswrapper[4730]: I0320 16:59:30.533956    4730 scope.go:117] "RemoveContainer" containerID="96fa1a39363ce580a7d7ba62fa4f551be70b7f98a5cbd49713d8269774ccd8b9"
Mar 20 16:59:30 crc kubenswrapper[4730]: E0320 16:59:30.534983    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:59:30 crc kubenswrapper[4730]: I0320 16:59:30.770688    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cft2p"
Mar 20 16:59:30 crc kubenswrapper[4730]: I0320 16:59:30.858233    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cft2p"
Mar 20 16:59:31 crc kubenswrapper[4730]: I0320 16:59:31.010119    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cft2p"]
Mar 20 16:59:32 crc kubenswrapper[4730]: I0320 16:59:32.011429    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cft2p" podUID="7a16e21d-182d-4c13-9089-49aceb2bf64e" containerName="registry-server" containerID="cri-o://7bb7ee354edd6a6ac56e90b66e1c20c758e33122d0f1e93edf3cfc7f257c473d" gracePeriod=2
Mar 20 16:59:32 crc kubenswrapper[4730]: I0320 16:59:32.599050    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cft2p"
Mar 20 16:59:32 crc kubenswrapper[4730]: I0320 16:59:32.768035    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a16e21d-182d-4c13-9089-49aceb2bf64e-utilities\") pod \"7a16e21d-182d-4c13-9089-49aceb2bf64e\" (UID: \"7a16e21d-182d-4c13-9089-49aceb2bf64e\") "
Mar 20 16:59:32 crc kubenswrapper[4730]: I0320 16:59:32.768553    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a16e21d-182d-4c13-9089-49aceb2bf64e-catalog-content\") pod \"7a16e21d-182d-4c13-9089-49aceb2bf64e\" (UID: \"7a16e21d-182d-4c13-9089-49aceb2bf64e\") "
Mar 20 16:59:32 crc kubenswrapper[4730]: I0320 16:59:32.768658    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlklg\" (UniqueName: \"kubernetes.io/projected/7a16e21d-182d-4c13-9089-49aceb2bf64e-kube-api-access-jlklg\") pod \"7a16e21d-182d-4c13-9089-49aceb2bf64e\" (UID: \"7a16e21d-182d-4c13-9089-49aceb2bf64e\") "
Mar 20 16:59:32 crc kubenswrapper[4730]: I0320 16:59:32.768960    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a16e21d-182d-4c13-9089-49aceb2bf64e-utilities" (OuterVolumeSpecName: "utilities") pod "7a16e21d-182d-4c13-9089-49aceb2bf64e" (UID: "7a16e21d-182d-4c13-9089-49aceb2bf64e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:59:32 crc kubenswrapper[4730]: I0320 16:59:32.769330    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a16e21d-182d-4c13-9089-49aceb2bf64e-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:59:32 crc kubenswrapper[4730]: I0320 16:59:32.779518    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a16e21d-182d-4c13-9089-49aceb2bf64e-kube-api-access-jlklg" (OuterVolumeSpecName: "kube-api-access-jlklg") pod "7a16e21d-182d-4c13-9089-49aceb2bf64e" (UID: "7a16e21d-182d-4c13-9089-49aceb2bf64e"). InnerVolumeSpecName "kube-api-access-jlklg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:59:32 crc kubenswrapper[4730]: I0320 16:59:32.874587    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlklg\" (UniqueName: \"kubernetes.io/projected/7a16e21d-182d-4c13-9089-49aceb2bf64e-kube-api-access-jlklg\") on node \"crc\" DevicePath \"\""
Mar 20 16:59:32 crc kubenswrapper[4730]: I0320 16:59:32.907481    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a16e21d-182d-4c13-9089-49aceb2bf64e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a16e21d-182d-4c13-9089-49aceb2bf64e" (UID: "7a16e21d-182d-4c13-9089-49aceb2bf64e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:59:32 crc kubenswrapper[4730]: I0320 16:59:32.976754    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a16e21d-182d-4c13-9089-49aceb2bf64e-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 16:59:33 crc kubenswrapper[4730]: I0320 16:59:33.022102    4730 generic.go:334] "Generic (PLEG): container finished" podID="7a16e21d-182d-4c13-9089-49aceb2bf64e" containerID="7bb7ee354edd6a6ac56e90b66e1c20c758e33122d0f1e93edf3cfc7f257c473d" exitCode=0
Mar 20 16:59:33 crc kubenswrapper[4730]: I0320 16:59:33.022141    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cft2p" event={"ID":"7a16e21d-182d-4c13-9089-49aceb2bf64e","Type":"ContainerDied","Data":"7bb7ee354edd6a6ac56e90b66e1c20c758e33122d0f1e93edf3cfc7f257c473d"}
Mar 20 16:59:33 crc kubenswrapper[4730]: I0320 16:59:33.022170    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cft2p"
Mar 20 16:59:33 crc kubenswrapper[4730]: I0320 16:59:33.022187    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cft2p" event={"ID":"7a16e21d-182d-4c13-9089-49aceb2bf64e","Type":"ContainerDied","Data":"3b704111a62172b53d35ce955ecd8ad0d92ece2f9021e12eea289f8e3c5e40aa"}
Mar 20 16:59:33 crc kubenswrapper[4730]: I0320 16:59:33.022204    4730 scope.go:117] "RemoveContainer" containerID="7bb7ee354edd6a6ac56e90b66e1c20c758e33122d0f1e93edf3cfc7f257c473d"
Mar 20 16:59:33 crc kubenswrapper[4730]: I0320 16:59:33.056129    4730 scope.go:117] "RemoveContainer" containerID="fa40729ee31ad53435997da4a543c920ca7bb80d17bc2b12c39cd8960d138b6c"
Mar 20 16:59:33 crc kubenswrapper[4730]: I0320 16:59:33.069217    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cft2p"]
Mar 20 16:59:33 crc kubenswrapper[4730]: I0320 16:59:33.078973    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cft2p"]
Mar 20 16:59:33 crc kubenswrapper[4730]: I0320 16:59:33.084336    4730 scope.go:117] "RemoveContainer" containerID="1ecc4f0c91882002fa7f0d51f0cc4607743264255f04b66356208c63914a4c04"
Mar 20 16:59:33 crc kubenswrapper[4730]: I0320 16:59:33.127387    4730 scope.go:117] "RemoveContainer" containerID="7bb7ee354edd6a6ac56e90b66e1c20c758e33122d0f1e93edf3cfc7f257c473d"
Mar 20 16:59:33 crc kubenswrapper[4730]: E0320 16:59:33.127753    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bb7ee354edd6a6ac56e90b66e1c20c758e33122d0f1e93edf3cfc7f257c473d\": container with ID starting with 7bb7ee354edd6a6ac56e90b66e1c20c758e33122d0f1e93edf3cfc7f257c473d not found: ID does not exist" containerID="7bb7ee354edd6a6ac56e90b66e1c20c758e33122d0f1e93edf3cfc7f257c473d"
Mar 20 16:59:33 crc kubenswrapper[4730]: I0320 16:59:33.127785    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bb7ee354edd6a6ac56e90b66e1c20c758e33122d0f1e93edf3cfc7f257c473d"} err="failed to get container status \"7bb7ee354edd6a6ac56e90b66e1c20c758e33122d0f1e93edf3cfc7f257c473d\": rpc error: code = NotFound desc = could not find container \"7bb7ee354edd6a6ac56e90b66e1c20c758e33122d0f1e93edf3cfc7f257c473d\": container with ID starting with 7bb7ee354edd6a6ac56e90b66e1c20c758e33122d0f1e93edf3cfc7f257c473d not found: ID does not exist"
Mar 20 16:59:33 crc kubenswrapper[4730]: I0320 16:59:33.127809    4730 scope.go:117] "RemoveContainer" containerID="fa40729ee31ad53435997da4a543c920ca7bb80d17bc2b12c39cd8960d138b6c"
Mar 20 16:59:33 crc kubenswrapper[4730]: E0320 16:59:33.128242    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa40729ee31ad53435997da4a543c920ca7bb80d17bc2b12c39cd8960d138b6c\": container with ID starting with fa40729ee31ad53435997da4a543c920ca7bb80d17bc2b12c39cd8960d138b6c not found: ID does not exist" containerID="fa40729ee31ad53435997da4a543c920ca7bb80d17bc2b12c39cd8960d138b6c"
Mar 20 16:59:33 crc kubenswrapper[4730]: I0320 16:59:33.128315    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa40729ee31ad53435997da4a543c920ca7bb80d17bc2b12c39cd8960d138b6c"} err="failed to get container status \"fa40729ee31ad53435997da4a543c920ca7bb80d17bc2b12c39cd8960d138b6c\": rpc error: code = NotFound desc = could not find container \"fa40729ee31ad53435997da4a543c920ca7bb80d17bc2b12c39cd8960d138b6c\": container with ID starting with fa40729ee31ad53435997da4a543c920ca7bb80d17bc2b12c39cd8960d138b6c not found: ID does not exist"
Mar 20 16:59:33 crc kubenswrapper[4730]: I0320 16:59:33.128369    4730 scope.go:117] "RemoveContainer" containerID="1ecc4f0c91882002fa7f0d51f0cc4607743264255f04b66356208c63914a4c04"
Mar 20 16:59:33 crc kubenswrapper[4730]: E0320 16:59:33.128728    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ecc4f0c91882002fa7f0d51f0cc4607743264255f04b66356208c63914a4c04\": container with ID starting with 1ecc4f0c91882002fa7f0d51f0cc4607743264255f04b66356208c63914a4c04 not found: ID does not exist" containerID="1ecc4f0c91882002fa7f0d51f0cc4607743264255f04b66356208c63914a4c04"
Mar 20 16:59:33 crc kubenswrapper[4730]: I0320 16:59:33.128776    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ecc4f0c91882002fa7f0d51f0cc4607743264255f04b66356208c63914a4c04"} err="failed to get container status \"1ecc4f0c91882002fa7f0d51f0cc4607743264255f04b66356208c63914a4c04\": rpc error: code = NotFound desc = could not find container \"1ecc4f0c91882002fa7f0d51f0cc4607743264255f04b66356208c63914a4c04\": container with ID starting with 1ecc4f0c91882002fa7f0d51f0cc4607743264255f04b66356208c63914a4c04 not found: ID does not exist"
Mar 20 16:59:33 crc kubenswrapper[4730]: I0320 16:59:33.547108    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a16e21d-182d-4c13-9089-49aceb2bf64e" path="/var/lib/kubelet/pods/7a16e21d-182d-4c13-9089-49aceb2bf64e/volumes"
Mar 20 16:59:39 crc kubenswrapper[4730]: I0320 16:59:39.158325    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r6zkc"]
Mar 20 16:59:39 crc kubenswrapper[4730]: E0320 16:59:39.159533    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a16e21d-182d-4c13-9089-49aceb2bf64e" containerName="extract-content"
Mar 20 16:59:39 crc kubenswrapper[4730]: I0320 16:59:39.159557    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a16e21d-182d-4c13-9089-49aceb2bf64e" containerName="extract-content"
Mar 20 16:59:39 crc kubenswrapper[4730]: E0320 16:59:39.159576    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a16e21d-182d-4c13-9089-49aceb2bf64e" containerName="extract-utilities"
Mar 20 16:59:39 crc kubenswrapper[4730]: I0320 16:59:39.159589    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a16e21d-182d-4c13-9089-49aceb2bf64e" containerName="extract-utilities"
Mar 20 16:59:39 crc kubenswrapper[4730]: E0320 16:59:39.159640    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a16e21d-182d-4c13-9089-49aceb2bf64e" containerName="registry-server"
Mar 20 16:59:39 crc kubenswrapper[4730]: I0320 16:59:39.159652    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a16e21d-182d-4c13-9089-49aceb2bf64e" containerName="registry-server"
Mar 20 16:59:39 crc kubenswrapper[4730]: I0320 16:59:39.159970    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a16e21d-182d-4c13-9089-49aceb2bf64e" containerName="registry-server"
Mar 20 16:59:39 crc kubenswrapper[4730]: I0320 16:59:39.162841    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r6zkc"
Mar 20 16:59:39 crc kubenswrapper[4730]: I0320 16:59:39.195295    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r6zkc"]
Mar 20 16:59:39 crc kubenswrapper[4730]: I0320 16:59:39.326441    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94ht5\" (UniqueName: \"kubernetes.io/projected/b1f9169d-1b10-4fa9-a78c-8d0629059181-kube-api-access-94ht5\") pod \"certified-operators-r6zkc\" (UID: \"b1f9169d-1b10-4fa9-a78c-8d0629059181\") " pod="openshift-marketplace/certified-operators-r6zkc"
Mar 20 16:59:39 crc kubenswrapper[4730]: I0320 16:59:39.326501    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1f9169d-1b10-4fa9-a78c-8d0629059181-utilities\") pod \"certified-operators-r6zkc\" (UID: \"b1f9169d-1b10-4fa9-a78c-8d0629059181\") " pod="openshift-marketplace/certified-operators-r6zkc"
Mar 20 16:59:39 crc kubenswrapper[4730]: I0320 16:59:39.326641    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1f9169d-1b10-4fa9-a78c-8d0629059181-catalog-content\") pod \"certified-operators-r6zkc\" (UID: \"b1f9169d-1b10-4fa9-a78c-8d0629059181\") " pod="openshift-marketplace/certified-operators-r6zkc"
Mar 20 16:59:39 crc kubenswrapper[4730]: I0320 16:59:39.428046    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1f9169d-1b10-4fa9-a78c-8d0629059181-catalog-content\") pod \"certified-operators-r6zkc\" (UID: \"b1f9169d-1b10-4fa9-a78c-8d0629059181\") " pod="openshift-marketplace/certified-operators-r6zkc"
Mar 20 16:59:39 crc kubenswrapper[4730]: I0320 16:59:39.428115    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94ht5\" (UniqueName: \"kubernetes.io/projected/b1f9169d-1b10-4fa9-a78c-8d0629059181-kube-api-access-94ht5\") pod \"certified-operators-r6zkc\" (UID: \"b1f9169d-1b10-4fa9-a78c-8d0629059181\") " pod="openshift-marketplace/certified-operators-r6zkc"
Mar 20 16:59:39 crc kubenswrapper[4730]: I0320 16:59:39.428156    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1f9169d-1b10-4fa9-a78c-8d0629059181-utilities\") pod \"certified-operators-r6zkc\" (UID: \"b1f9169d-1b10-4fa9-a78c-8d0629059181\") " pod="openshift-marketplace/certified-operators-r6zkc"
Mar 20 16:59:39 crc kubenswrapper[4730]: I0320 16:59:39.428560    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1f9169d-1b10-4fa9-a78c-8d0629059181-catalog-content\") pod \"certified-operators-r6zkc\" (UID: \"b1f9169d-1b10-4fa9-a78c-8d0629059181\") " pod="openshift-marketplace/certified-operators-r6zkc"
Mar 20 16:59:39 crc kubenswrapper[4730]: I0320 16:59:39.428655    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1f9169d-1b10-4fa9-a78c-8d0629059181-utilities\") pod \"certified-operators-r6zkc\" (UID: \"b1f9169d-1b10-4fa9-a78c-8d0629059181\") " pod="openshift-marketplace/certified-operators-r6zkc"
Mar 20 16:59:39 crc kubenswrapper[4730]: I0320 16:59:39.448416    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94ht5\" (UniqueName: \"kubernetes.io/projected/b1f9169d-1b10-4fa9-a78c-8d0629059181-kube-api-access-94ht5\") pod \"certified-operators-r6zkc\" (UID: \"b1f9169d-1b10-4fa9-a78c-8d0629059181\") " pod="openshift-marketplace/certified-operators-r6zkc"
Mar 20 16:59:39 crc kubenswrapper[4730]: I0320 16:59:39.495381    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r6zkc"
Mar 20 16:59:40 crc kubenswrapper[4730]: I0320 16:59:40.026454    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r6zkc"]
Mar 20 16:59:40 crc kubenswrapper[4730]: I0320 16:59:40.111126    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r6zkc" event={"ID":"b1f9169d-1b10-4fa9-a78c-8d0629059181","Type":"ContainerStarted","Data":"ddb7ee25b4b232823e91d59af40a6a8a917740f268d360194245e482dd0fbdce"}
Mar 20 16:59:41 crc kubenswrapper[4730]: I0320 16:59:41.121821    4730 generic.go:334] "Generic (PLEG): container finished" podID="b1f9169d-1b10-4fa9-a78c-8d0629059181" containerID="e93f916c76f0c714a78d5f1db0c26a9a99a0af0d21ca7e5208e0a5cdff340fcc" exitCode=0
Mar 20 16:59:41 crc kubenswrapper[4730]: I0320 16:59:41.123299    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r6zkc" event={"ID":"b1f9169d-1b10-4fa9-a78c-8d0629059181","Type":"ContainerDied","Data":"e93f916c76f0c714a78d5f1db0c26a9a99a0af0d21ca7e5208e0a5cdff340fcc"}
Mar 20 16:59:42 crc kubenswrapper[4730]: I0320 16:59:42.135403    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r6zkc" event={"ID":"b1f9169d-1b10-4fa9-a78c-8d0629059181","Type":"ContainerStarted","Data":"fc7a56139b1dc20f843850c6ed730e68fb5324820a378dd38eb40e9f321e26b3"}
Mar 20 16:59:43 crc kubenswrapper[4730]: I0320 16:59:43.533640    4730 scope.go:117] "RemoveContainer" containerID="96fa1a39363ce580a7d7ba62fa4f551be70b7f98a5cbd49713d8269774ccd8b9"
Mar 20 16:59:43 crc kubenswrapper[4730]: E0320 16:59:43.534730    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 16:59:44 crc kubenswrapper[4730]: I0320 16:59:44.157007    4730 generic.go:334] "Generic (PLEG): container finished" podID="b1f9169d-1b10-4fa9-a78c-8d0629059181" containerID="fc7a56139b1dc20f843850c6ed730e68fb5324820a378dd38eb40e9f321e26b3" exitCode=0
Mar 20 16:59:44 crc kubenswrapper[4730]: I0320 16:59:44.157083    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r6zkc" event={"ID":"b1f9169d-1b10-4fa9-a78c-8d0629059181","Type":"ContainerDied","Data":"fc7a56139b1dc20f843850c6ed730e68fb5324820a378dd38eb40e9f321e26b3"}
Mar 20 16:59:45 crc kubenswrapper[4730]: I0320 16:59:45.171970    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r6zkc" event={"ID":"b1f9169d-1b10-4fa9-a78c-8d0629059181","Type":"ContainerStarted","Data":"f9cb933b2bbced58f72d1e703311eab42c9f076e0b3d7d1b5a991d19fbebe31b"}
Mar 20 16:59:45 crc kubenswrapper[4730]: I0320 16:59:45.206708    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r6zkc" podStartSLOduration=2.7460172099999998 podStartE2EDuration="6.20666459s" podCreationTimestamp="2026-03-20 16:59:39 +0000 UTC" firstStartedPulling="2026-03-20 16:59:41.123444726 +0000 UTC m=+4840.336816095" lastFinishedPulling="2026-03-20 16:59:44.584092116 +0000 UTC m=+4843.797463475" observedRunningTime="2026-03-20 16:59:45.196578094 +0000 UTC m=+4844.409949483" watchObservedRunningTime="2026-03-20 16:59:45.20666459 +0000 UTC m=+4844.420035959"
Mar 20 16:59:49 crc kubenswrapper[4730]: I0320 16:59:49.495877    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r6zkc"
Mar 20 16:59:49 crc kubenswrapper[4730]: I0320 16:59:49.496542    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-r6zkc"
Mar 20 16:59:49 crc kubenswrapper[4730]: I0320 16:59:49.553997    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r6zkc"
Mar 20 16:59:50 crc kubenswrapper[4730]: I0320 16:59:50.318938    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r6zkc"
Mar 20 16:59:50 crc kubenswrapper[4730]: I0320 16:59:50.603760    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r6zkc"]
Mar 20 16:59:52 crc kubenswrapper[4730]: I0320 16:59:52.247945    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-r6zkc" podUID="b1f9169d-1b10-4fa9-a78c-8d0629059181" containerName="registry-server" containerID="cri-o://f9cb933b2bbced58f72d1e703311eab42c9f076e0b3d7d1b5a991d19fbebe31b" gracePeriod=2
Mar 20 16:59:52 crc kubenswrapper[4730]: I0320 16:59:52.788360    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r6zkc"
Mar 20 16:59:52 crc kubenswrapper[4730]: I0320 16:59:52.834944    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94ht5\" (UniqueName: \"kubernetes.io/projected/b1f9169d-1b10-4fa9-a78c-8d0629059181-kube-api-access-94ht5\") pod \"b1f9169d-1b10-4fa9-a78c-8d0629059181\" (UID: \"b1f9169d-1b10-4fa9-a78c-8d0629059181\") "
Mar 20 16:59:52 crc kubenswrapper[4730]: I0320 16:59:52.835088    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1f9169d-1b10-4fa9-a78c-8d0629059181-catalog-content\") pod \"b1f9169d-1b10-4fa9-a78c-8d0629059181\" (UID: \"b1f9169d-1b10-4fa9-a78c-8d0629059181\") "
Mar 20 16:59:52 crc kubenswrapper[4730]: I0320 16:59:52.835185    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1f9169d-1b10-4fa9-a78c-8d0629059181-utilities\") pod \"b1f9169d-1b10-4fa9-a78c-8d0629059181\" (UID: \"b1f9169d-1b10-4fa9-a78c-8d0629059181\") "
Mar 20 16:59:52 crc kubenswrapper[4730]: I0320 16:59:52.837284    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1f9169d-1b10-4fa9-a78c-8d0629059181-utilities" (OuterVolumeSpecName: "utilities") pod "b1f9169d-1b10-4fa9-a78c-8d0629059181" (UID: "b1f9169d-1b10-4fa9-a78c-8d0629059181"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:59:52 crc kubenswrapper[4730]: I0320 16:59:52.846186    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1f9169d-1b10-4fa9-a78c-8d0629059181-kube-api-access-94ht5" (OuterVolumeSpecName: "kube-api-access-94ht5") pod "b1f9169d-1b10-4fa9-a78c-8d0629059181" (UID: "b1f9169d-1b10-4fa9-a78c-8d0629059181"). InnerVolumeSpecName "kube-api-access-94ht5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 16:59:52 crc kubenswrapper[4730]: I0320 16:59:52.925075    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1f9169d-1b10-4fa9-a78c-8d0629059181-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1f9169d-1b10-4fa9-a78c-8d0629059181" (UID: "b1f9169d-1b10-4fa9-a78c-8d0629059181"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 16:59:52 crc kubenswrapper[4730]: I0320 16:59:52.939799    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1f9169d-1b10-4fa9-a78c-8d0629059181-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 16:59:52 crc kubenswrapper[4730]: I0320 16:59:52.939843    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94ht5\" (UniqueName: \"kubernetes.io/projected/b1f9169d-1b10-4fa9-a78c-8d0629059181-kube-api-access-94ht5\") on node \"crc\" DevicePath \"\""
Mar 20 16:59:52 crc kubenswrapper[4730]: I0320 16:59:52.939858    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1f9169d-1b10-4fa9-a78c-8d0629059181-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 16:59:53 crc kubenswrapper[4730]: I0320 16:59:53.260427    4730 generic.go:334] "Generic (PLEG): container finished" podID="b1f9169d-1b10-4fa9-a78c-8d0629059181" containerID="f9cb933b2bbced58f72d1e703311eab42c9f076e0b3d7d1b5a991d19fbebe31b" exitCode=0
Mar 20 16:59:53 crc kubenswrapper[4730]: I0320 16:59:53.260503    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r6zkc" event={"ID":"b1f9169d-1b10-4fa9-a78c-8d0629059181","Type":"ContainerDied","Data":"f9cb933b2bbced58f72d1e703311eab42c9f076e0b3d7d1b5a991d19fbebe31b"}
Mar 20 16:59:53 crc kubenswrapper[4730]: I0320 16:59:53.260532    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r6zkc"
Mar 20 16:59:53 crc kubenswrapper[4730]: I0320 16:59:53.260564    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r6zkc" event={"ID":"b1f9169d-1b10-4fa9-a78c-8d0629059181","Type":"ContainerDied","Data":"ddb7ee25b4b232823e91d59af40a6a8a917740f268d360194245e482dd0fbdce"}
Mar 20 16:59:53 crc kubenswrapper[4730]: I0320 16:59:53.260595    4730 scope.go:117] "RemoveContainer" containerID="f9cb933b2bbced58f72d1e703311eab42c9f076e0b3d7d1b5a991d19fbebe31b"
Mar 20 16:59:53 crc kubenswrapper[4730]: I0320 16:59:53.281561    4730 scope.go:117] "RemoveContainer" containerID="fc7a56139b1dc20f843850c6ed730e68fb5324820a378dd38eb40e9f321e26b3"
Mar 20 16:59:53 crc kubenswrapper[4730]: I0320 16:59:53.318978    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r6zkc"]
Mar 20 16:59:53 crc kubenswrapper[4730]: I0320 16:59:53.330242    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-r6zkc"]
Mar 20 16:59:53 crc kubenswrapper[4730]: I0320 16:59:53.332949    4730 scope.go:117] "RemoveContainer" containerID="e93f916c76f0c714a78d5f1db0c26a9a99a0af0d21ca7e5208e0a5cdff340fcc"
Mar 20 16:59:53 crc kubenswrapper[4730]: I0320 16:59:53.377150    4730 scope.go:117] "RemoveContainer" containerID="f9cb933b2bbced58f72d1e703311eab42c9f076e0b3d7d1b5a991d19fbebe31b"
Mar 20 16:59:53 crc kubenswrapper[4730]: E0320 16:59:53.377761    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9cb933b2bbced58f72d1e703311eab42c9f076e0b3d7d1b5a991d19fbebe31b\": container with ID starting with f9cb933b2bbced58f72d1e703311eab42c9f076e0b3d7d1b5a991d19fbebe31b not found: ID does not exist" containerID="f9cb933b2bbced58f72d1e703311eab42c9f076e0b3d7d1b5a991d19fbebe31b"
Mar 20 16:59:53 crc kubenswrapper[4730]: I0320 16:59:53.377803    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9cb933b2bbced58f72d1e703311eab42c9f076e0b3d7d1b5a991d19fbebe31b"} err="failed to get container status \"f9cb933b2bbced58f72d1e703311eab42c9f076e0b3d7d1b5a991d19fbebe31b\": rpc error: code = NotFound desc = could not find container \"f9cb933b2bbced58f72d1e703311eab42c9f076e0b3d7d1b5a991d19fbebe31b\": container with ID starting with f9cb933b2bbced58f72d1e703311eab42c9f076e0b3d7d1b5a991d19fbebe31b not found: ID does not exist"
Mar 20 16:59:53 crc kubenswrapper[4730]: I0320 16:59:53.377828    4730 scope.go:117] "RemoveContainer" containerID="fc7a56139b1dc20f843850c6ed730e68fb5324820a378dd38eb40e9f321e26b3"
Mar 20 16:59:53 crc kubenswrapper[4730]: E0320 16:59:53.378242    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc7a56139b1dc20f843850c6ed730e68fb5324820a378dd38eb40e9f321e26b3\": container with ID starting with fc7a56139b1dc20f843850c6ed730e68fb5324820a378dd38eb40e9f321e26b3 not found: ID does not exist" containerID="fc7a56139b1dc20f843850c6ed730e68fb5324820a378dd38eb40e9f321e26b3"
Mar 20 16:59:53 crc kubenswrapper[4730]: I0320 16:59:53.378310    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc7a56139b1dc20f843850c6ed730e68fb5324820a378dd38eb40e9f321e26b3"} err="failed to get container status \"fc7a56139b1dc20f843850c6ed730e68fb5324820a378dd38eb40e9f321e26b3\": rpc error: code = NotFound desc = could not find container \"fc7a56139b1dc20f843850c6ed730e68fb5324820a378dd38eb40e9f321e26b3\": container with ID starting with fc7a56139b1dc20f843850c6ed730e68fb5324820a378dd38eb40e9f321e26b3 not found: ID does not exist"
Mar 20 16:59:53 crc kubenswrapper[4730]: I0320 16:59:53.378336    4730 scope.go:117] "RemoveContainer" containerID="e93f916c76f0c714a78d5f1db0c26a9a99a0af0d21ca7e5208e0a5cdff340fcc"
Mar 20 16:59:53 crc kubenswrapper[4730]: E0320 16:59:53.378726    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e93f916c76f0c714a78d5f1db0c26a9a99a0af0d21ca7e5208e0a5cdff340fcc\": container with ID starting with e93f916c76f0c714a78d5f1db0c26a9a99a0af0d21ca7e5208e0a5cdff340fcc not found: ID does not exist" containerID="e93f916c76f0c714a78d5f1db0c26a9a99a0af0d21ca7e5208e0a5cdff340fcc"
Mar 20 16:59:53 crc kubenswrapper[4730]: I0320 16:59:53.378755    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e93f916c76f0c714a78d5f1db0c26a9a99a0af0d21ca7e5208e0a5cdff340fcc"} err="failed to get container status \"e93f916c76f0c714a78d5f1db0c26a9a99a0af0d21ca7e5208e0a5cdff340fcc\": rpc error: code = NotFound desc = could not find container \"e93f916c76f0c714a78d5f1db0c26a9a99a0af0d21ca7e5208e0a5cdff340fcc\": container with ID starting with e93f916c76f0c714a78d5f1db0c26a9a99a0af0d21ca7e5208e0a5cdff340fcc not found: ID does not exist"
Mar 20 16:59:53 crc kubenswrapper[4730]: I0320 16:59:53.546636    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1f9169d-1b10-4fa9-a78c-8d0629059181" path="/var/lib/kubelet/pods/b1f9169d-1b10-4fa9-a78c-8d0629059181/volumes"
Mar 20 16:59:55 crc kubenswrapper[4730]: I0320 16:59:55.533922    4730 scope.go:117] "RemoveContainer" containerID="96fa1a39363ce580a7d7ba62fa4f551be70b7f98a5cbd49713d8269774ccd8b9"
Mar 20 16:59:55 crc kubenswrapper[4730]: E0320 16:59:55.534880    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 17:00:00 crc kubenswrapper[4730]: I0320 17:00:00.171400    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567100-vgj2v"]
Mar 20 17:00:00 crc kubenswrapper[4730]: E0320 17:00:00.172138    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1f9169d-1b10-4fa9-a78c-8d0629059181" containerName="registry-server"
Mar 20 17:00:00 crc kubenswrapper[4730]: I0320 17:00:00.172153    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1f9169d-1b10-4fa9-a78c-8d0629059181" containerName="registry-server"
Mar 20 17:00:00 crc kubenswrapper[4730]: E0320 17:00:00.172191    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1f9169d-1b10-4fa9-a78c-8d0629059181" containerName="extract-utilities"
Mar 20 17:00:00 crc kubenswrapper[4730]: I0320 17:00:00.172199    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1f9169d-1b10-4fa9-a78c-8d0629059181" containerName="extract-utilities"
Mar 20 17:00:00 crc kubenswrapper[4730]: E0320 17:00:00.172214    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1f9169d-1b10-4fa9-a78c-8d0629059181" containerName="extract-content"
Mar 20 17:00:00 crc kubenswrapper[4730]: I0320 17:00:00.172222    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1f9169d-1b10-4fa9-a78c-8d0629059181" containerName="extract-content"
Mar 20 17:00:00 crc kubenswrapper[4730]: I0320 17:00:00.172465    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1f9169d-1b10-4fa9-a78c-8d0629059181" containerName="registry-server"
Mar 20 17:00:00 crc kubenswrapper[4730]: I0320 17:00:00.173241    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567100-vgj2v"
Mar 20 17:00:00 crc kubenswrapper[4730]: I0320 17:00:00.176222    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 17:00:00 crc kubenswrapper[4730]: I0320 17:00:00.176380    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 17:00:00 crc kubenswrapper[4730]: I0320 17:00:00.178989    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 17:00:00 crc kubenswrapper[4730]: I0320 17:00:00.192826    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567100-q7w74"]
Mar 20 17:00:00 crc kubenswrapper[4730]: I0320 17:00:00.194589    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567100-q7w74"
Mar 20 17:00:00 crc kubenswrapper[4730]: I0320 17:00:00.197452    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 20 17:00:00 crc kubenswrapper[4730]: I0320 17:00:00.199547    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 20 17:00:00 crc kubenswrapper[4730]: I0320 17:00:00.203345    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567100-vgj2v"]
Mar 20 17:00:00 crc kubenswrapper[4730]: I0320 17:00:00.205074    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvkzv\" (UniqueName: \"kubernetes.io/projected/1293fe12-0f59-44fb-b726-9d72c790dabd-kube-api-access-rvkzv\") pod \"auto-csr-approver-29567100-vgj2v\" (UID: \"1293fe12-0f59-44fb-b726-9d72c790dabd\") " pod="openshift-infra/auto-csr-approver-29567100-vgj2v"
Mar 20 17:00:00 crc kubenswrapper[4730]: I0320 17:00:00.229913    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567100-q7w74"]
Mar 20 17:00:00 crc kubenswrapper[4730]: I0320 17:00:00.307348    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85790076-e9c5-4a47-9a91-23e3371238e0-config-volume\") pod \"collect-profiles-29567100-q7w74\" (UID: \"85790076-e9c5-4a47-9a91-23e3371238e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567100-q7w74"
Mar 20 17:00:00 crc kubenswrapper[4730]: I0320 17:00:00.307662    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85790076-e9c5-4a47-9a91-23e3371238e0-secret-volume\") pod \"collect-profiles-29567100-q7w74\" (UID: \"85790076-e9c5-4a47-9a91-23e3371238e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567100-q7w74"
Mar 20 17:00:00 crc kubenswrapper[4730]: I0320 17:00:00.307729    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvkzv\" (UniqueName: \"kubernetes.io/projected/1293fe12-0f59-44fb-b726-9d72c790dabd-kube-api-access-rvkzv\") pod \"auto-csr-approver-29567100-vgj2v\" (UID: \"1293fe12-0f59-44fb-b726-9d72c790dabd\") " pod="openshift-infra/auto-csr-approver-29567100-vgj2v"
Mar 20 17:00:00 crc kubenswrapper[4730]: I0320 17:00:00.307796    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt95r\" (UniqueName: \"kubernetes.io/projected/85790076-e9c5-4a47-9a91-23e3371238e0-kube-api-access-jt95r\") pod \"collect-profiles-29567100-q7w74\" (UID: \"85790076-e9c5-4a47-9a91-23e3371238e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567100-q7w74"
Mar 20 17:00:00 crc kubenswrapper[4730]: I0320 17:00:00.330166    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvkzv\" (UniqueName: \"kubernetes.io/projected/1293fe12-0f59-44fb-b726-9d72c790dabd-kube-api-access-rvkzv\") pod \"auto-csr-approver-29567100-vgj2v\" (UID: \"1293fe12-0f59-44fb-b726-9d72c790dabd\") " pod="openshift-infra/auto-csr-approver-29567100-vgj2v"
Mar 20 17:00:00 crc kubenswrapper[4730]: I0320 17:00:00.410068    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85790076-e9c5-4a47-9a91-23e3371238e0-secret-volume\") pod \"collect-profiles-29567100-q7w74\" (UID: \"85790076-e9c5-4a47-9a91-23e3371238e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567100-q7w74"
Mar 20 17:00:00 crc kubenswrapper[4730]: I0320 17:00:00.410202    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt95r\" (UniqueName: \"kubernetes.io/projected/85790076-e9c5-4a47-9a91-23e3371238e0-kube-api-access-jt95r\") pod \"collect-profiles-29567100-q7w74\" (UID: \"85790076-e9c5-4a47-9a91-23e3371238e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567100-q7w74"
Mar 20 17:00:00 crc kubenswrapper[4730]: I0320 17:00:00.410407    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85790076-e9c5-4a47-9a91-23e3371238e0-config-volume\") pod \"collect-profiles-29567100-q7w74\" (UID: \"85790076-e9c5-4a47-9a91-23e3371238e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567100-q7w74"
Mar 20 17:00:00 crc kubenswrapper[4730]: I0320 17:00:00.411991    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85790076-e9c5-4a47-9a91-23e3371238e0-config-volume\") pod \"collect-profiles-29567100-q7w74\" (UID: \"85790076-e9c5-4a47-9a91-23e3371238e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567100-q7w74"
Mar 20 17:00:00 crc kubenswrapper[4730]: I0320 17:00:00.414597    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85790076-e9c5-4a47-9a91-23e3371238e0-secret-volume\") pod \"collect-profiles-29567100-q7w74\" (UID: \"85790076-e9c5-4a47-9a91-23e3371238e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567100-q7w74"
Mar 20 17:00:00 crc kubenswrapper[4730]: I0320 17:00:00.436048    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt95r\" (UniqueName: \"kubernetes.io/projected/85790076-e9c5-4a47-9a91-23e3371238e0-kube-api-access-jt95r\") pod \"collect-profiles-29567100-q7w74\" (UID: \"85790076-e9c5-4a47-9a91-23e3371238e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567100-q7w74"
Mar 20 17:00:00 crc kubenswrapper[4730]: I0320 17:00:00.499594    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567100-vgj2v"
Mar 20 17:00:00 crc kubenswrapper[4730]: I0320 17:00:00.523388    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567100-q7w74"
Mar 20 17:00:00 crc kubenswrapper[4730]: I0320 17:00:00.989623    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567100-vgj2v"]
Mar 20 17:00:01 crc kubenswrapper[4730]: W0320 17:00:01.075544    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85790076_e9c5_4a47_9a91_23e3371238e0.slice/crio-53df402fb46fe245b14079dfe719c4b3cda288137d5920adfef0e389c3a057f1 WatchSource:0}: Error finding container 53df402fb46fe245b14079dfe719c4b3cda288137d5920adfef0e389c3a057f1: Status 404 returned error can't find the container with id 53df402fb46fe245b14079dfe719c4b3cda288137d5920adfef0e389c3a057f1
Mar 20 17:00:01 crc kubenswrapper[4730]: I0320 17:00:01.075956    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567100-q7w74"]
Mar 20 17:00:01 crc kubenswrapper[4730]: I0320 17:00:01.359511    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567100-q7w74" event={"ID":"85790076-e9c5-4a47-9a91-23e3371238e0","Type":"ContainerStarted","Data":"98a696c1bfd3c064c3dfdf3b917dd5513f01acf718b3d9511f52bd8373893cf2"}
Mar 20 17:00:01 crc kubenswrapper[4730]: I0320 17:00:01.359562    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567100-q7w74" event={"ID":"85790076-e9c5-4a47-9a91-23e3371238e0","Type":"ContainerStarted","Data":"53df402fb46fe245b14079dfe719c4b3cda288137d5920adfef0e389c3a057f1"}
Mar 20 17:00:01 crc kubenswrapper[4730]: I0320 17:00:01.361747    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567100-vgj2v" event={"ID":"1293fe12-0f59-44fb-b726-9d72c790dabd","Type":"ContainerStarted","Data":"e1c9976bc45f87566b1b1016223c485c739f36677650be579d93b3378ac982b4"}
Mar 20 17:00:01 crc kubenswrapper[4730]: I0320 17:00:01.383495    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567100-q7w74" podStartSLOduration=1.383475043 podStartE2EDuration="1.383475043s" podCreationTimestamp="2026-03-20 17:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:00:01.379726607 +0000 UTC m=+4860.593097986" watchObservedRunningTime="2026-03-20 17:00:01.383475043 +0000 UTC m=+4860.596846412"
Mar 20 17:00:02 crc kubenswrapper[4730]: I0320 17:00:02.373316    4730 generic.go:334] "Generic (PLEG): container finished" podID="85790076-e9c5-4a47-9a91-23e3371238e0" containerID="98a696c1bfd3c064c3dfdf3b917dd5513f01acf718b3d9511f52bd8373893cf2" exitCode=0
Mar 20 17:00:02 crc kubenswrapper[4730]: I0320 17:00:02.373361    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567100-q7w74" event={"ID":"85790076-e9c5-4a47-9a91-23e3371238e0","Type":"ContainerDied","Data":"98a696c1bfd3c064c3dfdf3b917dd5513f01acf718b3d9511f52bd8373893cf2"}
Mar 20 17:00:03 crc kubenswrapper[4730]: I0320 17:00:03.846467    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567100-q7w74"
Mar 20 17:00:03 crc kubenswrapper[4730]: I0320 17:00:03.891589    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt95r\" (UniqueName: \"kubernetes.io/projected/85790076-e9c5-4a47-9a91-23e3371238e0-kube-api-access-jt95r\") pod \"85790076-e9c5-4a47-9a91-23e3371238e0\" (UID: \"85790076-e9c5-4a47-9a91-23e3371238e0\") "
Mar 20 17:00:03 crc kubenswrapper[4730]: I0320 17:00:03.891762    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85790076-e9c5-4a47-9a91-23e3371238e0-config-volume\") pod \"85790076-e9c5-4a47-9a91-23e3371238e0\" (UID: \"85790076-e9c5-4a47-9a91-23e3371238e0\") "
Mar 20 17:00:03 crc kubenswrapper[4730]: I0320 17:00:03.891838    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85790076-e9c5-4a47-9a91-23e3371238e0-secret-volume\") pod \"85790076-e9c5-4a47-9a91-23e3371238e0\" (UID: \"85790076-e9c5-4a47-9a91-23e3371238e0\") "
Mar 20 17:00:03 crc kubenswrapper[4730]: I0320 17:00:03.892150    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85790076-e9c5-4a47-9a91-23e3371238e0-config-volume" (OuterVolumeSpecName: "config-volume") pod "85790076-e9c5-4a47-9a91-23e3371238e0" (UID: "85790076-e9c5-4a47-9a91-23e3371238e0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:00:03 crc kubenswrapper[4730]: I0320 17:00:03.892319    4730 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85790076-e9c5-4a47-9a91-23e3371238e0-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 17:00:03 crc kubenswrapper[4730]: I0320 17:00:03.905476    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85790076-e9c5-4a47-9a91-23e3371238e0-kube-api-access-jt95r" (OuterVolumeSpecName: "kube-api-access-jt95r") pod "85790076-e9c5-4a47-9a91-23e3371238e0" (UID: "85790076-e9c5-4a47-9a91-23e3371238e0"). InnerVolumeSpecName "kube-api-access-jt95r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:00:03 crc kubenswrapper[4730]: I0320 17:00:03.905839    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85790076-e9c5-4a47-9a91-23e3371238e0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "85790076-e9c5-4a47-9a91-23e3371238e0" (UID: "85790076-e9c5-4a47-9a91-23e3371238e0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:00:03 crc kubenswrapper[4730]: I0320 17:00:03.994701    4730 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85790076-e9c5-4a47-9a91-23e3371238e0-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 20 17:00:03 crc kubenswrapper[4730]: I0320 17:00:03.994732    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt95r\" (UniqueName: \"kubernetes.io/projected/85790076-e9c5-4a47-9a91-23e3371238e0-kube-api-access-jt95r\") on node \"crc\" DevicePath \"\""
Mar 20 17:00:04 crc kubenswrapper[4730]: I0320 17:00:04.398220    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567100-q7w74" event={"ID":"85790076-e9c5-4a47-9a91-23e3371238e0","Type":"ContainerDied","Data":"53df402fb46fe245b14079dfe719c4b3cda288137d5920adfef0e389c3a057f1"}
Mar 20 17:00:04 crc kubenswrapper[4730]: I0320 17:00:04.398272    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53df402fb46fe245b14079dfe719c4b3cda288137d5920adfef0e389c3a057f1"
Mar 20 17:00:04 crc kubenswrapper[4730]: I0320 17:00:04.398330    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567100-q7w74"
Mar 20 17:00:04 crc kubenswrapper[4730]: I0320 17:00:04.471393    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567055-r9s4z"]
Mar 20 17:00:04 crc kubenswrapper[4730]: I0320 17:00:04.482136    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567055-r9s4z"]
Mar 20 17:00:05 crc kubenswrapper[4730]: I0320 17:00:05.554696    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5af1b002-c577-4334-8304-5f44a67a5119" path="/var/lib/kubelet/pods/5af1b002-c577-4334-8304-5f44a67a5119/volumes"
Mar 20 17:00:09 crc kubenswrapper[4730]: I0320 17:00:09.533487    4730 scope.go:117] "RemoveContainer" containerID="96fa1a39363ce580a7d7ba62fa4f551be70b7f98a5cbd49713d8269774ccd8b9"
Mar 20 17:00:09 crc kubenswrapper[4730]: E0320 17:00:09.534363    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 17:00:13 crc kubenswrapper[4730]: I0320 17:00:13.541788    4730 generic.go:334] "Generic (PLEG): container finished" podID="1293fe12-0f59-44fb-b726-9d72c790dabd" containerID="f0d02d12d3b8583d27ef06f8d4e4230e6d9bdedae9fb10c5b6dcf9c218e3e2d5" exitCode=0
Mar 20 17:00:13 crc kubenswrapper[4730]: I0320 17:00:13.575923    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567100-vgj2v" event={"ID":"1293fe12-0f59-44fb-b726-9d72c790dabd","Type":"ContainerDied","Data":"f0d02d12d3b8583d27ef06f8d4e4230e6d9bdedae9fb10c5b6dcf9c218e3e2d5"}
Mar 20 17:00:14 crc kubenswrapper[4730]: I0320 17:00:14.973467    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567100-vgj2v"
Mar 20 17:00:15 crc kubenswrapper[4730]: I0320 17:00:15.050871    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvkzv\" (UniqueName: \"kubernetes.io/projected/1293fe12-0f59-44fb-b726-9d72c790dabd-kube-api-access-rvkzv\") pod \"1293fe12-0f59-44fb-b726-9d72c790dabd\" (UID: \"1293fe12-0f59-44fb-b726-9d72c790dabd\") "
Mar 20 17:00:15 crc kubenswrapper[4730]: I0320 17:00:15.057455    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1293fe12-0f59-44fb-b726-9d72c790dabd-kube-api-access-rvkzv" (OuterVolumeSpecName: "kube-api-access-rvkzv") pod "1293fe12-0f59-44fb-b726-9d72c790dabd" (UID: "1293fe12-0f59-44fb-b726-9d72c790dabd"). InnerVolumeSpecName "kube-api-access-rvkzv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:00:15 crc kubenswrapper[4730]: I0320 17:00:15.153364    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvkzv\" (UniqueName: \"kubernetes.io/projected/1293fe12-0f59-44fb-b726-9d72c790dabd-kube-api-access-rvkzv\") on node \"crc\" DevicePath \"\""
Mar 20 17:00:15 crc kubenswrapper[4730]: I0320 17:00:15.566322    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567100-vgj2v" event={"ID":"1293fe12-0f59-44fb-b726-9d72c790dabd","Type":"ContainerDied","Data":"e1c9976bc45f87566b1b1016223c485c739f36677650be579d93b3378ac982b4"}
Mar 20 17:00:15 crc kubenswrapper[4730]: I0320 17:00:15.566397    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1c9976bc45f87566b1b1016223c485c739f36677650be579d93b3378ac982b4"
Mar 20 17:00:15 crc kubenswrapper[4730]: I0320 17:00:15.566468    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567100-vgj2v"
Mar 20 17:00:16 crc kubenswrapper[4730]: I0320 17:00:16.051864    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567094-5zljx"]
Mar 20 17:00:16 crc kubenswrapper[4730]: I0320 17:00:16.060308    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567094-5zljx"]
Mar 20 17:00:17 crc kubenswrapper[4730]: I0320 17:00:17.560113    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a009be79-bc2f-45ca-94b9-f0da37a6abdc" path="/var/lib/kubelet/pods/a009be79-bc2f-45ca-94b9-f0da37a6abdc/volumes"
Mar 20 17:00:24 crc kubenswrapper[4730]: I0320 17:00:24.533385    4730 scope.go:117] "RemoveContainer" containerID="96fa1a39363ce580a7d7ba62fa4f551be70b7f98a5cbd49713d8269774ccd8b9"
Mar 20 17:00:24 crc kubenswrapper[4730]: E0320 17:00:24.534290    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 17:00:38 crc kubenswrapper[4730]: I0320 17:00:38.533932    4730 scope.go:117] "RemoveContainer" containerID="96fa1a39363ce580a7d7ba62fa4f551be70b7f98a5cbd49713d8269774ccd8b9"
Mar 20 17:00:38 crc kubenswrapper[4730]: E0320 17:00:38.534988    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 17:00:52 crc kubenswrapper[4730]: I0320 17:00:52.167679    4730 scope.go:117] "RemoveContainer" containerID="e6a0ef74485773b0b9248ebe8daaaea6116dc164de300c92368c1b1d44b5c372"
Mar 20 17:00:52 crc kubenswrapper[4730]: I0320 17:00:52.203283    4730 scope.go:117] "RemoveContainer" containerID="bfe54023e94fe2434c4e1c76acbd4f6a1cd0b3f0a5a82a87fd8b931a2f1901c9"
Mar 20 17:00:52 crc kubenswrapper[4730]: I0320 17:00:52.533054    4730 scope.go:117] "RemoveContainer" containerID="96fa1a39363ce580a7d7ba62fa4f551be70b7f98a5cbd49713d8269774ccd8b9"
Mar 20 17:00:52 crc kubenswrapper[4730]: E0320 17:00:52.533681    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 17:01:00 crc kubenswrapper[4730]: I0320 17:01:00.187926    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29567101-lzlmb"]
Mar 20 17:01:00 crc kubenswrapper[4730]: E0320 17:01:00.189023    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85790076-e9c5-4a47-9a91-23e3371238e0" containerName="collect-profiles"
Mar 20 17:01:00 crc kubenswrapper[4730]: I0320 17:01:00.189039    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="85790076-e9c5-4a47-9a91-23e3371238e0" containerName="collect-profiles"
Mar 20 17:01:00 crc kubenswrapper[4730]: E0320 17:01:00.189051    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1293fe12-0f59-44fb-b726-9d72c790dabd" containerName="oc"
Mar 20 17:01:00 crc kubenswrapper[4730]: I0320 17:01:00.189059    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="1293fe12-0f59-44fb-b726-9d72c790dabd" containerName="oc"
Mar 20 17:01:00 crc kubenswrapper[4730]: I0320 17:01:00.189344    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="85790076-e9c5-4a47-9a91-23e3371238e0" containerName="collect-profiles"
Mar 20 17:01:00 crc kubenswrapper[4730]: I0320 17:01:00.189356    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="1293fe12-0f59-44fb-b726-9d72c790dabd" containerName="oc"
Mar 20 17:01:00 crc kubenswrapper[4730]: I0320 17:01:00.190163    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29567101-lzlmb"
Mar 20 17:01:00 crc kubenswrapper[4730]: I0320 17:01:00.199890    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29567101-lzlmb"]
Mar 20 17:01:00 crc kubenswrapper[4730]: I0320 17:01:00.336946    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a113133-c537-41c7-a14e-614fb8bcd24f-config-data\") pod \"keystone-cron-29567101-lzlmb\" (UID: \"8a113133-c537-41c7-a14e-614fb8bcd24f\") " pod="openstack/keystone-cron-29567101-lzlmb"
Mar 20 17:01:00 crc kubenswrapper[4730]: I0320 17:01:00.337429    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a113133-c537-41c7-a14e-614fb8bcd24f-combined-ca-bundle\") pod \"keystone-cron-29567101-lzlmb\" (UID: \"8a113133-c537-41c7-a14e-614fb8bcd24f\") " pod="openstack/keystone-cron-29567101-lzlmb"
Mar 20 17:01:00 crc kubenswrapper[4730]: I0320 17:01:00.337474    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8a113133-c537-41c7-a14e-614fb8bcd24f-fernet-keys\") pod \"keystone-cron-29567101-lzlmb\" (UID: \"8a113133-c537-41c7-a14e-614fb8bcd24f\") " pod="openstack/keystone-cron-29567101-lzlmb"
Mar 20 17:01:00 crc kubenswrapper[4730]: I0320 17:01:00.337530    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25frt\" (UniqueName: \"kubernetes.io/projected/8a113133-c537-41c7-a14e-614fb8bcd24f-kube-api-access-25frt\") pod \"keystone-cron-29567101-lzlmb\" (UID: \"8a113133-c537-41c7-a14e-614fb8bcd24f\") " pod="openstack/keystone-cron-29567101-lzlmb"
Mar 20 17:01:00 crc kubenswrapper[4730]: I0320 17:01:00.439710    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a113133-c537-41c7-a14e-614fb8bcd24f-combined-ca-bundle\") pod \"keystone-cron-29567101-lzlmb\" (UID: \"8a113133-c537-41c7-a14e-614fb8bcd24f\") " pod="openstack/keystone-cron-29567101-lzlmb"
Mar 20 17:01:00 crc kubenswrapper[4730]: I0320 17:01:00.439868    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8a113133-c537-41c7-a14e-614fb8bcd24f-fernet-keys\") pod \"keystone-cron-29567101-lzlmb\" (UID: \"8a113133-c537-41c7-a14e-614fb8bcd24f\") " pod="openstack/keystone-cron-29567101-lzlmb"
Mar 20 17:01:00 crc kubenswrapper[4730]: I0320 17:01:00.440013    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25frt\" (UniqueName: \"kubernetes.io/projected/8a113133-c537-41c7-a14e-614fb8bcd24f-kube-api-access-25frt\") pod \"keystone-cron-29567101-lzlmb\" (UID: \"8a113133-c537-41c7-a14e-614fb8bcd24f\") " pod="openstack/keystone-cron-29567101-lzlmb"
Mar 20 17:01:00 crc kubenswrapper[4730]: I0320 17:01:00.440290    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a113133-c537-41c7-a14e-614fb8bcd24f-config-data\") pod \"keystone-cron-29567101-lzlmb\" (UID: \"8a113133-c537-41c7-a14e-614fb8bcd24f\") " pod="openstack/keystone-cron-29567101-lzlmb"
Mar 20 17:01:00 crc kubenswrapper[4730]: I0320 17:01:00.447298    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a113133-c537-41c7-a14e-614fb8bcd24f-config-data\") pod \"keystone-cron-29567101-lzlmb\" (UID: \"8a113133-c537-41c7-a14e-614fb8bcd24f\") " pod="openstack/keystone-cron-29567101-lzlmb"
Mar 20 17:01:00 crc kubenswrapper[4730]: I0320 17:01:00.447808    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8a113133-c537-41c7-a14e-614fb8bcd24f-fernet-keys\") pod \"keystone-cron-29567101-lzlmb\" (UID: \"8a113133-c537-41c7-a14e-614fb8bcd24f\") " pod="openstack/keystone-cron-29567101-lzlmb"
Mar 20 17:01:00 crc kubenswrapper[4730]: I0320 17:01:00.454188    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a113133-c537-41c7-a14e-614fb8bcd24f-combined-ca-bundle\") pod \"keystone-cron-29567101-lzlmb\" (UID: \"8a113133-c537-41c7-a14e-614fb8bcd24f\") " pod="openstack/keystone-cron-29567101-lzlmb"
Mar 20 17:01:00 crc kubenswrapper[4730]: I0320 17:01:00.460669    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25frt\" (UniqueName: \"kubernetes.io/projected/8a113133-c537-41c7-a14e-614fb8bcd24f-kube-api-access-25frt\") pod \"keystone-cron-29567101-lzlmb\" (UID: \"8a113133-c537-41c7-a14e-614fb8bcd24f\") " pod="openstack/keystone-cron-29567101-lzlmb"
Mar 20 17:01:00 crc kubenswrapper[4730]: I0320 17:01:00.518689    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29567101-lzlmb"
Mar 20 17:01:01 crc kubenswrapper[4730]: I0320 17:01:01.013505    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29567101-lzlmb"]
Mar 20 17:01:01 crc kubenswrapper[4730]: I0320 17:01:01.640109    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567101-lzlmb" event={"ID":"8a113133-c537-41c7-a14e-614fb8bcd24f","Type":"ContainerStarted","Data":"0978ba5bfa0ab31eba2979df1d165332fe49e303739053270ed423cb2aa3e09c"}
Mar 20 17:01:01 crc kubenswrapper[4730]: I0320 17:01:01.640506    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567101-lzlmb" event={"ID":"8a113133-c537-41c7-a14e-614fb8bcd24f","Type":"ContainerStarted","Data":"21e481ed3b6f703dfa039524e2814e97aab2752e3bd7ea80b43efd84a998db93"}
Mar 20 17:01:01 crc kubenswrapper[4730]: I0320 17:01:01.666507    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29567101-lzlmb" podStartSLOduration=1.666490099 podStartE2EDuration="1.666490099s" podCreationTimestamp="2026-03-20 17:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:01:01.656416232 +0000 UTC m=+4920.869787621" watchObservedRunningTime="2026-03-20 17:01:01.666490099 +0000 UTC m=+4920.879861468"
Mar 20 17:01:05 crc kubenswrapper[4730]: I0320 17:01:05.687298    4730 generic.go:334] "Generic (PLEG): container finished" podID="8a113133-c537-41c7-a14e-614fb8bcd24f" containerID="0978ba5bfa0ab31eba2979df1d165332fe49e303739053270ed423cb2aa3e09c" exitCode=0
Mar 20 17:01:05 crc kubenswrapper[4730]: I0320 17:01:05.687350    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567101-lzlmb" event={"ID":"8a113133-c537-41c7-a14e-614fb8bcd24f","Type":"ContainerDied","Data":"0978ba5bfa0ab31eba2979df1d165332fe49e303739053270ed423cb2aa3e09c"}
Mar 20 17:01:06 crc kubenswrapper[4730]: I0320 17:01:06.534176    4730 scope.go:117] "RemoveContainer" containerID="96fa1a39363ce580a7d7ba62fa4f551be70b7f98a5cbd49713d8269774ccd8b9"
Mar 20 17:01:06 crc kubenswrapper[4730]: E0320 17:01:06.535439    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 17:01:07 crc kubenswrapper[4730]: I0320 17:01:07.522519    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29567101-lzlmb"
Mar 20 17:01:07 crc kubenswrapper[4730]: I0320 17:01:07.706787    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567101-lzlmb" event={"ID":"8a113133-c537-41c7-a14e-614fb8bcd24f","Type":"ContainerDied","Data":"21e481ed3b6f703dfa039524e2814e97aab2752e3bd7ea80b43efd84a998db93"}
Mar 20 17:01:07 crc kubenswrapper[4730]: I0320 17:01:07.706824    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21e481ed3b6f703dfa039524e2814e97aab2752e3bd7ea80b43efd84a998db93"
Mar 20 17:01:07 crc kubenswrapper[4730]: I0320 17:01:07.706870    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29567101-lzlmb"
Mar 20 17:01:07 crc kubenswrapper[4730]: I0320 17:01:07.710452    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8a113133-c537-41c7-a14e-614fb8bcd24f-fernet-keys\") pod \"8a113133-c537-41c7-a14e-614fb8bcd24f\" (UID: \"8a113133-c537-41c7-a14e-614fb8bcd24f\") "
Mar 20 17:01:07 crc kubenswrapper[4730]: I0320 17:01:07.710512    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a113133-c537-41c7-a14e-614fb8bcd24f-combined-ca-bundle\") pod \"8a113133-c537-41c7-a14e-614fb8bcd24f\" (UID: \"8a113133-c537-41c7-a14e-614fb8bcd24f\") "
Mar 20 17:01:07 crc kubenswrapper[4730]: I0320 17:01:07.710583    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25frt\" (UniqueName: \"kubernetes.io/projected/8a113133-c537-41c7-a14e-614fb8bcd24f-kube-api-access-25frt\") pod \"8a113133-c537-41c7-a14e-614fb8bcd24f\" (UID: \"8a113133-c537-41c7-a14e-614fb8bcd24f\") "
Mar 20 17:01:07 crc kubenswrapper[4730]: I0320 17:01:07.710657    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a113133-c537-41c7-a14e-614fb8bcd24f-config-data\") pod \"8a113133-c537-41c7-a14e-614fb8bcd24f\" (UID: \"8a113133-c537-41c7-a14e-614fb8bcd24f\") "
Mar 20 17:01:07 crc kubenswrapper[4730]: I0320 17:01:07.723468    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a113133-c537-41c7-a14e-614fb8bcd24f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8a113133-c537-41c7-a14e-614fb8bcd24f" (UID: "8a113133-c537-41c7-a14e-614fb8bcd24f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:01:07 crc kubenswrapper[4730]: I0320 17:01:07.723521    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a113133-c537-41c7-a14e-614fb8bcd24f-kube-api-access-25frt" (OuterVolumeSpecName: "kube-api-access-25frt") pod "8a113133-c537-41c7-a14e-614fb8bcd24f" (UID: "8a113133-c537-41c7-a14e-614fb8bcd24f"). InnerVolumeSpecName "kube-api-access-25frt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:01:07 crc kubenswrapper[4730]: I0320 17:01:07.762763    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a113133-c537-41c7-a14e-614fb8bcd24f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a113133-c537-41c7-a14e-614fb8bcd24f" (UID: "8a113133-c537-41c7-a14e-614fb8bcd24f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:01:07 crc kubenswrapper[4730]: I0320 17:01:07.778962    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a113133-c537-41c7-a14e-614fb8bcd24f-config-data" (OuterVolumeSpecName: "config-data") pod "8a113133-c537-41c7-a14e-614fb8bcd24f" (UID: "8a113133-c537-41c7-a14e-614fb8bcd24f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:01:07 crc kubenswrapper[4730]: I0320 17:01:07.812834    4730 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8a113133-c537-41c7-a14e-614fb8bcd24f-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 20 17:01:07 crc kubenswrapper[4730]: I0320 17:01:07.812876    4730 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a113133-c537-41c7-a14e-614fb8bcd24f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:01:07 crc kubenswrapper[4730]: I0320 17:01:07.812894    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25frt\" (UniqueName: \"kubernetes.io/projected/8a113133-c537-41c7-a14e-614fb8bcd24f-kube-api-access-25frt\") on node \"crc\" DevicePath \"\""
Mar 20 17:01:07 crc kubenswrapper[4730]: I0320 17:01:07.812906    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a113133-c537-41c7-a14e-614fb8bcd24f-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 17:01:21 crc kubenswrapper[4730]: I0320 17:01:21.541414    4730 scope.go:117] "RemoveContainer" containerID="96fa1a39363ce580a7d7ba62fa4f551be70b7f98a5cbd49713d8269774ccd8b9"
Mar 20 17:01:21 crc kubenswrapper[4730]: E0320 17:01:21.542498    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 17:01:34 crc kubenswrapper[4730]: I0320 17:01:34.533185    4730 scope.go:117] "RemoveContainer" containerID="96fa1a39363ce580a7d7ba62fa4f551be70b7f98a5cbd49713d8269774ccd8b9"
Mar 20 17:01:34 crc kubenswrapper[4730]: E0320 17:01:34.534268    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 17:01:45 crc kubenswrapper[4730]: I0320 17:01:45.533692    4730 scope.go:117] "RemoveContainer" containerID="96fa1a39363ce580a7d7ba62fa4f551be70b7f98a5cbd49713d8269774ccd8b9"
Mar 20 17:01:46 crc kubenswrapper[4730]: I0320 17:01:46.148753    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerStarted","Data":"c3ac4abf290606f9ec67064e3bf0182c8ecb8c9be4ecf85a1bb60feb02fd27e6"}
Mar 20 17:01:54 crc kubenswrapper[4730]: I0320 17:01:54.240841    4730 generic.go:334] "Generic (PLEG): container finished" podID="c69a80b5-69a7-48c5-8ad4-5063b6cb4676" containerID="7d54b7219e1263f587317fcdebaae6f3c46012a7941ad45c24813ffa14627f5b" exitCode=1
Mar 20 17:01:54 crc kubenswrapper[4730]: I0320 17:01:54.240925    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"c69a80b5-69a7-48c5-8ad4-5063b6cb4676","Type":"ContainerDied","Data":"7d54b7219e1263f587317fcdebaae6f3c46012a7941ad45c24813ffa14627f5b"}
Mar 20 17:01:55 crc kubenswrapper[4730]: I0320 17:01:55.774022    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Mar 20 17:01:55 crc kubenswrapper[4730]: I0320 17:01:55.847673    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-test-operator-ephemeral-workdir\") pod \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") "
Mar 20 17:01:55 crc kubenswrapper[4730]: I0320 17:01:55.847732    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-ca-certs\") pod \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") "
Mar 20 17:01:55 crc kubenswrapper[4730]: I0320 17:01:55.847756    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75t8r\" (UniqueName: \"kubernetes.io/projected/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-kube-api-access-75t8r\") pod \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") "
Mar 20 17:01:55 crc kubenswrapper[4730]: I0320 17:01:55.847793    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-openstack-config-secret\") pod \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") "
Mar 20 17:01:55 crc kubenswrapper[4730]: I0320 17:01:55.847870    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-config-data\") pod \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") "
Mar 20 17:01:55 crc kubenswrapper[4730]: I0320 17:01:55.847952    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") "
Mar 20 17:01:55 crc kubenswrapper[4730]: I0320 17:01:55.847991    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-ssh-key\") pod \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") "
Mar 20 17:01:55 crc kubenswrapper[4730]: I0320 17:01:55.848023    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-openstack-config\") pod \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") "
Mar 20 17:01:55 crc kubenswrapper[4730]: I0320 17:01:55.848119    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-test-operator-ephemeral-temporary\") pod \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\" (UID: \"c69a80b5-69a7-48c5-8ad4-5063b6cb4676\") "
Mar 20 17:01:55 crc kubenswrapper[4730]: I0320 17:01:55.849081    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "c69a80b5-69a7-48c5-8ad4-5063b6cb4676" (UID: "c69a80b5-69a7-48c5-8ad4-5063b6cb4676"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:01:55 crc kubenswrapper[4730]: I0320 17:01:55.849981    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-config-data" (OuterVolumeSpecName: "config-data") pod "c69a80b5-69a7-48c5-8ad4-5063b6cb4676" (UID: "c69a80b5-69a7-48c5-8ad4-5063b6cb4676"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:01:55 crc kubenswrapper[4730]: I0320 17:01:55.856845    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "c69a80b5-69a7-48c5-8ad4-5063b6cb4676" (UID: "c69a80b5-69a7-48c5-8ad4-5063b6cb4676"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:01:55 crc kubenswrapper[4730]: I0320 17:01:55.882162    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-kube-api-access-75t8r" (OuterVolumeSpecName: "kube-api-access-75t8r") pod "c69a80b5-69a7-48c5-8ad4-5063b6cb4676" (UID: "c69a80b5-69a7-48c5-8ad4-5063b6cb4676"). InnerVolumeSpecName "kube-api-access-75t8r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:01:55 crc kubenswrapper[4730]: I0320 17:01:55.890435    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "test-operator-logs") pod "c69a80b5-69a7-48c5-8ad4-5063b6cb4676" (UID: "c69a80b5-69a7-48c5-8ad4-5063b6cb4676"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 20 17:01:55 crc kubenswrapper[4730]: I0320 17:01:55.907415    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "c69a80b5-69a7-48c5-8ad4-5063b6cb4676" (UID: "c69a80b5-69a7-48c5-8ad4-5063b6cb4676"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:01:55 crc kubenswrapper[4730]: I0320 17:01:55.919574    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "c69a80b5-69a7-48c5-8ad4-5063b6cb4676" (UID: "c69a80b5-69a7-48c5-8ad4-5063b6cb4676"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:01:55 crc kubenswrapper[4730]: I0320 17:01:55.954424    4730 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Mar 20 17:01:55 crc kubenswrapper[4730]: I0320 17:01:55.954461    4730 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\""
Mar 20 17:01:55 crc kubenswrapper[4730]: I0320 17:01:55.954476    4730 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\""
Mar 20 17:01:55 crc kubenswrapper[4730]: I0320 17:01:55.954489    4730 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-ca-certs\") on node \"crc\" DevicePath \"\""
Mar 20 17:01:55 crc kubenswrapper[4730]: I0320 17:01:55.954500    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75t8r\" (UniqueName: \"kubernetes.io/projected/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-kube-api-access-75t8r\") on node \"crc\" DevicePath \"\""
Mar 20 17:01:55 crc kubenswrapper[4730]: I0320 17:01:55.954510    4730 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Mar 20 17:01:55 crc kubenswrapper[4730]: I0320 17:01:55.954523    4730 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 17:01:55 crc kubenswrapper[4730]: I0320 17:01:55.955457    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c69a80b5-69a7-48c5-8ad4-5063b6cb4676" (UID: "c69a80b5-69a7-48c5-8ad4-5063b6cb4676"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:01:56 crc kubenswrapper[4730]: I0320 17:01:56.027707    4730 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Mar 20 17:01:56 crc kubenswrapper[4730]: I0320 17:01:56.050276    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "c69a80b5-69a7-48c5-8ad4-5063b6cb4676" (UID: "c69a80b5-69a7-48c5-8ad4-5063b6cb4676"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:01:56 crc kubenswrapper[4730]: I0320 17:01:56.056101    4730 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Mar 20 17:01:56 crc kubenswrapper[4730]: I0320 17:01:56.056128    4730 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-ssh-key\") on node \"crc\" DevicePath \"\""
Mar 20 17:01:56 crc kubenswrapper[4730]: I0320 17:01:56.056139    4730 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c69a80b5-69a7-48c5-8ad4-5063b6cb4676-openstack-config\") on node \"crc\" DevicePath \"\""
Mar 20 17:01:56 crc kubenswrapper[4730]: I0320 17:01:56.269702    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"c69a80b5-69a7-48c5-8ad4-5063b6cb4676","Type":"ContainerDied","Data":"3a3b9dc78f4221095ec1a260d19a50071b9bafd10f8f90a8b372cb1bb88e13e5"}
Mar 20 17:01:56 crc kubenswrapper[4730]: I0320 17:01:56.269761    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a3b9dc78f4221095ec1a260d19a50071b9bafd10f8f90a8b372cb1bb88e13e5"
Mar 20 17:01:56 crc kubenswrapper[4730]: I0320 17:01:56.269803    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Mar 20 17:02:00 crc kubenswrapper[4730]: I0320 17:02:00.167875    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567102-k48jl"]
Mar 20 17:02:00 crc kubenswrapper[4730]: E0320 17:02:00.169925    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a113133-c537-41c7-a14e-614fb8bcd24f" containerName="keystone-cron"
Mar 20 17:02:00 crc kubenswrapper[4730]: I0320 17:02:00.170032    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a113133-c537-41c7-a14e-614fb8bcd24f" containerName="keystone-cron"
Mar 20 17:02:00 crc kubenswrapper[4730]: E0320 17:02:00.170138    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c69a80b5-69a7-48c5-8ad4-5063b6cb4676" containerName="tempest-tests-tempest-tests-runner"
Mar 20 17:02:00 crc kubenswrapper[4730]: I0320 17:02:00.170219    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="c69a80b5-69a7-48c5-8ad4-5063b6cb4676" containerName="tempest-tests-tempest-tests-runner"
Mar 20 17:02:00 crc kubenswrapper[4730]: I0320 17:02:00.170574    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a113133-c537-41c7-a14e-614fb8bcd24f" containerName="keystone-cron"
Mar 20 17:02:00 crc kubenswrapper[4730]: I0320 17:02:00.170813    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="c69a80b5-69a7-48c5-8ad4-5063b6cb4676" containerName="tempest-tests-tempest-tests-runner"
Mar 20 17:02:00 crc kubenswrapper[4730]: I0320 17:02:00.171721    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567102-k48jl"
Mar 20 17:02:00 crc kubenswrapper[4730]: I0320 17:02:00.173740    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 17:02:00 crc kubenswrapper[4730]: I0320 17:02:00.173990    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 17:02:00 crc kubenswrapper[4730]: I0320 17:02:00.174842    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 17:02:00 crc kubenswrapper[4730]: I0320 17:02:00.180465    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567102-k48jl"]
Mar 20 17:02:00 crc kubenswrapper[4730]: I0320 17:02:00.288787    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z57r\" (UniqueName: \"kubernetes.io/projected/d317b26b-912f-4276-a234-084782092ff3-kube-api-access-9z57r\") pod \"auto-csr-approver-29567102-k48jl\" (UID: \"d317b26b-912f-4276-a234-084782092ff3\") " pod="openshift-infra/auto-csr-approver-29567102-k48jl"
Mar 20 17:02:00 crc kubenswrapper[4730]: I0320 17:02:00.317358    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Mar 20 17:02:00 crc kubenswrapper[4730]: I0320 17:02:00.319568    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 20 17:02:00 crc kubenswrapper[4730]: I0320 17:02:00.321867    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-gh48g"
Mar 20 17:02:00 crc kubenswrapper[4730]: I0320 17:02:00.326213    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Mar 20 17:02:00 crc kubenswrapper[4730]: I0320 17:02:00.391373    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z57r\" (UniqueName: \"kubernetes.io/projected/d317b26b-912f-4276-a234-084782092ff3-kube-api-access-9z57r\") pod \"auto-csr-approver-29567102-k48jl\" (UID: \"d317b26b-912f-4276-a234-084782092ff3\") " pod="openshift-infra/auto-csr-approver-29567102-k48jl"
Mar 20 17:02:00 crc kubenswrapper[4730]: I0320 17:02:00.412282    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z57r\" (UniqueName: \"kubernetes.io/projected/d317b26b-912f-4276-a234-084782092ff3-kube-api-access-9z57r\") pod \"auto-csr-approver-29567102-k48jl\" (UID: \"d317b26b-912f-4276-a234-084782092ff3\") " pod="openshift-infra/auto-csr-approver-29567102-k48jl"
Mar 20 17:02:00 crc kubenswrapper[4730]: I0320 17:02:00.489819    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567102-k48jl"
Mar 20 17:02:00 crc kubenswrapper[4730]: I0320 17:02:00.494206    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d66zz\" (UniqueName: \"kubernetes.io/projected/d79eb29a-c814-4aa0-a268-2069d58b08d2-kube-api-access-d66zz\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d79eb29a-c814-4aa0-a268-2069d58b08d2\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 20 17:02:00 crc kubenswrapper[4730]: I0320 17:02:00.494379    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d79eb29a-c814-4aa0-a268-2069d58b08d2\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 20 17:02:00 crc kubenswrapper[4730]: I0320 17:02:00.595859    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d66zz\" (UniqueName: \"kubernetes.io/projected/d79eb29a-c814-4aa0-a268-2069d58b08d2-kube-api-access-d66zz\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d79eb29a-c814-4aa0-a268-2069d58b08d2\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 20 17:02:00 crc kubenswrapper[4730]: I0320 17:02:00.595902    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d79eb29a-c814-4aa0-a268-2069d58b08d2\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 20 17:02:00 crc kubenswrapper[4730]: I0320 17:02:00.596262    4730 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d79eb29a-c814-4aa0-a268-2069d58b08d2\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 20 17:02:00 crc kubenswrapper[4730]: I0320 17:02:00.619938    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d66zz\" (UniqueName: \"kubernetes.io/projected/d79eb29a-c814-4aa0-a268-2069d58b08d2-kube-api-access-d66zz\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d79eb29a-c814-4aa0-a268-2069d58b08d2\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 20 17:02:00 crc kubenswrapper[4730]: I0320 17:02:00.631576    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d79eb29a-c814-4aa0-a268-2069d58b08d2\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 20 17:02:00 crc kubenswrapper[4730]: I0320 17:02:00.642706    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 20 17:02:01 crc kubenswrapper[4730]: I0320 17:02:01.089706    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567102-k48jl"]
Mar 20 17:02:01 crc kubenswrapper[4730]: W0320 17:02:01.194779    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd79eb29a_c814_4aa0_a268_2069d58b08d2.slice/crio-bda47bb83d0a47e9e8896f47119001302843c3eb091e25a7eb760efbbe20544e WatchSource:0}: Error finding container bda47bb83d0a47e9e8896f47119001302843c3eb091e25a7eb760efbbe20544e: Status 404 returned error can't find the container with id bda47bb83d0a47e9e8896f47119001302843c3eb091e25a7eb760efbbe20544e
Mar 20 17:02:01 crc kubenswrapper[4730]: I0320 17:02:01.197213    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Mar 20 17:02:01 crc kubenswrapper[4730]: I0320 17:02:01.321820    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567102-k48jl" event={"ID":"d317b26b-912f-4276-a234-084782092ff3","Type":"ContainerStarted","Data":"e866ddd76cac6a16b70501572318fe6f314ed313536675c55b5aa885d38c9e42"}
Mar 20 17:02:01 crc kubenswrapper[4730]: I0320 17:02:01.323380    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"d79eb29a-c814-4aa0-a268-2069d58b08d2","Type":"ContainerStarted","Data":"bda47bb83d0a47e9e8896f47119001302843c3eb091e25a7eb760efbbe20544e"}
Mar 20 17:02:03 crc kubenswrapper[4730]: I0320 17:02:03.354560    4730 generic.go:334] "Generic (PLEG): container finished" podID="d317b26b-912f-4276-a234-084782092ff3" containerID="bf551572383a97d7725c248e69428cf0db8c3b25722e05b3bf7b441d82bc6b56" exitCode=0
Mar 20 17:02:03 crc kubenswrapper[4730]: I0320 17:02:03.354634    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567102-k48jl" event={"ID":"d317b26b-912f-4276-a234-084782092ff3","Type":"ContainerDied","Data":"bf551572383a97d7725c248e69428cf0db8c3b25722e05b3bf7b441d82bc6b56"}
Mar 20 17:02:03 crc kubenswrapper[4730]: I0320 17:02:03.356828    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"d79eb29a-c814-4aa0-a268-2069d58b08d2","Type":"ContainerStarted","Data":"f6443ae982c113e8299daeacfa8f2b42dffd4d724dbf09dadca8f0160bc9718a"}
Mar 20 17:02:03 crc kubenswrapper[4730]: I0320 17:02:03.403056    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.508226136 podStartE2EDuration="3.403035637s" podCreationTimestamp="2026-03-20 17:02:00 +0000 UTC" firstStartedPulling="2026-03-20 17:02:01.196761487 +0000 UTC m=+4980.410132856" lastFinishedPulling="2026-03-20 17:02:02.091570978 +0000 UTC m=+4981.304942357" observedRunningTime="2026-03-20 17:02:03.395860582 +0000 UTC m=+4982.609231961" watchObservedRunningTime="2026-03-20 17:02:03.403035637 +0000 UTC m=+4982.616407006"
Mar 20 17:02:04 crc kubenswrapper[4730]: I0320 17:02:04.763408    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567102-k48jl"
Mar 20 17:02:04 crc kubenswrapper[4730]: I0320 17:02:04.898719    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9z57r\" (UniqueName: \"kubernetes.io/projected/d317b26b-912f-4276-a234-084782092ff3-kube-api-access-9z57r\") pod \"d317b26b-912f-4276-a234-084782092ff3\" (UID: \"d317b26b-912f-4276-a234-084782092ff3\") "
Mar 20 17:02:04 crc kubenswrapper[4730]: I0320 17:02:04.907545    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d317b26b-912f-4276-a234-084782092ff3-kube-api-access-9z57r" (OuterVolumeSpecName: "kube-api-access-9z57r") pod "d317b26b-912f-4276-a234-084782092ff3" (UID: "d317b26b-912f-4276-a234-084782092ff3"). InnerVolumeSpecName "kube-api-access-9z57r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:02:05 crc kubenswrapper[4730]: I0320 17:02:05.002063    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9z57r\" (UniqueName: \"kubernetes.io/projected/d317b26b-912f-4276-a234-084782092ff3-kube-api-access-9z57r\") on node \"crc\" DevicePath \"\""
Mar 20 17:02:05 crc kubenswrapper[4730]: I0320 17:02:05.382548    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567102-k48jl" event={"ID":"d317b26b-912f-4276-a234-084782092ff3","Type":"ContainerDied","Data":"e866ddd76cac6a16b70501572318fe6f314ed313536675c55b5aa885d38c9e42"}
Mar 20 17:02:05 crc kubenswrapper[4730]: I0320 17:02:05.383048    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e866ddd76cac6a16b70501572318fe6f314ed313536675c55b5aa885d38c9e42"
Mar 20 17:02:05 crc kubenswrapper[4730]: I0320 17:02:05.382696    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567102-k48jl"
Mar 20 17:02:05 crc kubenswrapper[4730]: I0320 17:02:05.857552    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567096-xvkcr"]
Mar 20 17:02:05 crc kubenswrapper[4730]: I0320 17:02:05.869429    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567096-xvkcr"]
Mar 20 17:02:07 crc kubenswrapper[4730]: I0320 17:02:07.554521    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85cd1ac-f48f-46a3-81e3-f82b73719cb1" path="/var/lib/kubelet/pods/f85cd1ac-f48f-46a3-81e3-f82b73719cb1/volumes"
Mar 20 17:02:46 crc kubenswrapper[4730]: I0320 17:02:46.831374    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zs57x/must-gather-nxsd6"]
Mar 20 17:02:46 crc kubenswrapper[4730]: E0320 17:02:46.832278    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d317b26b-912f-4276-a234-084782092ff3" containerName="oc"
Mar 20 17:02:46 crc kubenswrapper[4730]: I0320 17:02:46.832292    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="d317b26b-912f-4276-a234-084782092ff3" containerName="oc"
Mar 20 17:02:46 crc kubenswrapper[4730]: I0320 17:02:46.832494    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="d317b26b-912f-4276-a234-084782092ff3" containerName="oc"
Mar 20 17:02:46 crc kubenswrapper[4730]: I0320 17:02:46.833506    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zs57x/must-gather-nxsd6"
Mar 20 17:02:46 crc kubenswrapper[4730]: I0320 17:02:46.835357    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-zs57x"/"default-dockercfg-kdxq2"
Mar 20 17:02:46 crc kubenswrapper[4730]: I0320 17:02:46.836378    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-zs57x"/"kube-root-ca.crt"
Mar 20 17:02:46 crc kubenswrapper[4730]: I0320 17:02:46.837924    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-zs57x"/"openshift-service-ca.crt"
Mar 20 17:02:46 crc kubenswrapper[4730]: I0320 17:02:46.853795    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zs57x/must-gather-nxsd6"]
Mar 20 17:02:46 crc kubenswrapper[4730]: I0320 17:02:46.927780    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrzpq\" (UniqueName: \"kubernetes.io/projected/4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d-kube-api-access-rrzpq\") pod \"must-gather-nxsd6\" (UID: \"4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d\") " pod="openshift-must-gather-zs57x/must-gather-nxsd6"
Mar 20 17:02:46 crc kubenswrapper[4730]: I0320 17:02:46.927908    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d-must-gather-output\") pod \"must-gather-nxsd6\" (UID: \"4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d\") " pod="openshift-must-gather-zs57x/must-gather-nxsd6"
Mar 20 17:02:47 crc kubenswrapper[4730]: I0320 17:02:47.029293    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrzpq\" (UniqueName: \"kubernetes.io/projected/4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d-kube-api-access-rrzpq\") pod \"must-gather-nxsd6\" (UID: \"4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d\") " pod="openshift-must-gather-zs57x/must-gather-nxsd6"
Mar 20 17:02:47 crc kubenswrapper[4730]: I0320 17:02:47.029421    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d-must-gather-output\") pod \"must-gather-nxsd6\" (UID: \"4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d\") " pod="openshift-must-gather-zs57x/must-gather-nxsd6"
Mar 20 17:02:47 crc kubenswrapper[4730]: I0320 17:02:47.029807    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d-must-gather-output\") pod \"must-gather-nxsd6\" (UID: \"4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d\") " pod="openshift-must-gather-zs57x/must-gather-nxsd6"
Mar 20 17:02:47 crc kubenswrapper[4730]: I0320 17:02:47.047561    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrzpq\" (UniqueName: \"kubernetes.io/projected/4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d-kube-api-access-rrzpq\") pod \"must-gather-nxsd6\" (UID: \"4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d\") " pod="openshift-must-gather-zs57x/must-gather-nxsd6"
Mar 20 17:02:47 crc kubenswrapper[4730]: I0320 17:02:47.152453    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zs57x/must-gather-nxsd6"
Mar 20 17:02:47 crc kubenswrapper[4730]: I0320 17:02:47.631366    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zs57x/must-gather-nxsd6"]
Mar 20 17:02:47 crc kubenswrapper[4730]: I0320 17:02:47.886238    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zs57x/must-gather-nxsd6" event={"ID":"4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d","Type":"ContainerStarted","Data":"3bc790ab8f6e59c3b86417b0a55bed11ef4d53e15381c3de1bb130ff16d58cbf"}
Mar 20 17:02:52 crc kubenswrapper[4730]: I0320 17:02:52.379240    4730 scope.go:117] "RemoveContainer" containerID="ad05ce67547cd2c7c1cc69cf885883fc4049286efb953f0b2d8378cbd56d924f"
Mar 20 17:02:54 crc kubenswrapper[4730]: I0320 17:02:54.991238    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zs57x/must-gather-nxsd6" event={"ID":"4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d","Type":"ContainerStarted","Data":"5f9bd62b83471aba96f37df030e3e1c5b574f8d8437b38bdf436e67eecd3f71d"}
Mar 20 17:02:54 crc kubenswrapper[4730]: I0320 17:02:54.991798    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zs57x/must-gather-nxsd6" event={"ID":"4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d","Type":"ContainerStarted","Data":"5d6c90f5aec42032ef5d1044f85f5311f53e8138aa56ffcadeca5419bfb41e2e"}
Mar 20 17:02:55 crc kubenswrapper[4730]: I0320 17:02:55.006859    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zs57x/must-gather-nxsd6" podStartSLOduration=2.2706341119999998 podStartE2EDuration="9.006843952s" podCreationTimestamp="2026-03-20 17:02:46 +0000 UTC" firstStartedPulling="2026-03-20 17:02:47.635884781 +0000 UTC m=+5026.849256150" lastFinishedPulling="2026-03-20 17:02:54.372094611 +0000 UTC m=+5033.585465990" observedRunningTime="2026-03-20 17:02:55.004183856 +0000 UTC m=+5034.217555225" watchObservedRunningTime="2026-03-20 17:02:55.006843952 +0000 UTC m=+5034.220215321"
Mar 20 17:02:58 crc kubenswrapper[4730]: I0320 17:02:58.756608    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zs57x/crc-debug-xtxgl"]
Mar 20 17:02:58 crc kubenswrapper[4730]: I0320 17:02:58.758432    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zs57x/crc-debug-xtxgl"
Mar 20 17:02:58 crc kubenswrapper[4730]: I0320 17:02:58.845691    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd7a63f3-7b9b-489f-b957-6d5e10689cae-host\") pod \"crc-debug-xtxgl\" (UID: \"bd7a63f3-7b9b-489f-b957-6d5e10689cae\") " pod="openshift-must-gather-zs57x/crc-debug-xtxgl"
Mar 20 17:02:58 crc kubenswrapper[4730]: I0320 17:02:58.845843    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bj9r\" (UniqueName: \"kubernetes.io/projected/bd7a63f3-7b9b-489f-b957-6d5e10689cae-kube-api-access-8bj9r\") pod \"crc-debug-xtxgl\" (UID: \"bd7a63f3-7b9b-489f-b957-6d5e10689cae\") " pod="openshift-must-gather-zs57x/crc-debug-xtxgl"
Mar 20 17:02:58 crc kubenswrapper[4730]: I0320 17:02:58.947219    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd7a63f3-7b9b-489f-b957-6d5e10689cae-host\") pod \"crc-debug-xtxgl\" (UID: \"bd7a63f3-7b9b-489f-b957-6d5e10689cae\") " pod="openshift-must-gather-zs57x/crc-debug-xtxgl"
Mar 20 17:02:58 crc kubenswrapper[4730]: I0320 17:02:58.947369    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd7a63f3-7b9b-489f-b957-6d5e10689cae-host\") pod \"crc-debug-xtxgl\" (UID: \"bd7a63f3-7b9b-489f-b957-6d5e10689cae\") " pod="openshift-must-gather-zs57x/crc-debug-xtxgl"
Mar 20 17:02:58 crc kubenswrapper[4730]: I0320 17:02:58.947386    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bj9r\" (UniqueName: \"kubernetes.io/projected/bd7a63f3-7b9b-489f-b957-6d5e10689cae-kube-api-access-8bj9r\") pod \"crc-debug-xtxgl\" (UID: \"bd7a63f3-7b9b-489f-b957-6d5e10689cae\") " pod="openshift-must-gather-zs57x/crc-debug-xtxgl"
Mar 20 17:02:58 crc kubenswrapper[4730]: I0320 17:02:58.970361    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bj9r\" (UniqueName: \"kubernetes.io/projected/bd7a63f3-7b9b-489f-b957-6d5e10689cae-kube-api-access-8bj9r\") pod \"crc-debug-xtxgl\" (UID: \"bd7a63f3-7b9b-489f-b957-6d5e10689cae\") " pod="openshift-must-gather-zs57x/crc-debug-xtxgl"
Mar 20 17:02:59 crc kubenswrapper[4730]: I0320 17:02:59.086586    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zs57x/crc-debug-xtxgl"
Mar 20 17:03:00 crc kubenswrapper[4730]: I0320 17:03:00.043908    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zs57x/crc-debug-xtxgl" event={"ID":"bd7a63f3-7b9b-489f-b957-6d5e10689cae","Type":"ContainerStarted","Data":"6c374a1f1385b9e2b38e32d1f04faa659c4af4e23397e6bc76bac1e6d675b60a"}
Mar 20 17:03:10 crc kubenswrapper[4730]: I0320 17:03:10.142185    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zs57x/crc-debug-xtxgl" event={"ID":"bd7a63f3-7b9b-489f-b957-6d5e10689cae","Type":"ContainerStarted","Data":"7cfa3c4c40647b5e6e0d0655d8ae502be7742a62b0dcf48099fd5c14403c3160"}
Mar 20 17:03:10 crc kubenswrapper[4730]: I0320 17:03:10.164541    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zs57x/crc-debug-xtxgl" podStartSLOduration=1.8728616009999999 podStartE2EDuration="12.164517308s" podCreationTimestamp="2026-03-20 17:02:58 +0000 UTC" firstStartedPulling="2026-03-20 17:02:59.151046066 +0000 UTC m=+5038.364417435" lastFinishedPulling="2026-03-20 17:03:09.442701773 +0000 UTC m=+5048.656073142" observedRunningTime="2026-03-20 17:03:10.157807467 +0000 UTC m=+5049.371178836" watchObservedRunningTime="2026-03-20 17:03:10.164517308 +0000 UTC m=+5049.377888687"
Mar 20 17:03:54 crc kubenswrapper[4730]: I0320 17:03:54.590896    4730 generic.go:334] "Generic (PLEG): container finished" podID="bd7a63f3-7b9b-489f-b957-6d5e10689cae" containerID="7cfa3c4c40647b5e6e0d0655d8ae502be7742a62b0dcf48099fd5c14403c3160" exitCode=0
Mar 20 17:03:54 crc kubenswrapper[4730]: I0320 17:03:54.590973    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zs57x/crc-debug-xtxgl" event={"ID":"bd7a63f3-7b9b-489f-b957-6d5e10689cae","Type":"ContainerDied","Data":"7cfa3c4c40647b5e6e0d0655d8ae502be7742a62b0dcf48099fd5c14403c3160"}
Mar 20 17:03:55 crc kubenswrapper[4730]: I0320 17:03:55.758761    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zs57x/crc-debug-xtxgl"
Mar 20 17:03:55 crc kubenswrapper[4730]: I0320 17:03:55.796923    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zs57x/crc-debug-xtxgl"]
Mar 20 17:03:55 crc kubenswrapper[4730]: I0320 17:03:55.809387    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zs57x/crc-debug-xtxgl"]
Mar 20 17:03:55 crc kubenswrapper[4730]: I0320 17:03:55.948566    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bj9r\" (UniqueName: \"kubernetes.io/projected/bd7a63f3-7b9b-489f-b957-6d5e10689cae-kube-api-access-8bj9r\") pod \"bd7a63f3-7b9b-489f-b957-6d5e10689cae\" (UID: \"bd7a63f3-7b9b-489f-b957-6d5e10689cae\") "
Mar 20 17:03:55 crc kubenswrapper[4730]: I0320 17:03:55.948701    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd7a63f3-7b9b-489f-b957-6d5e10689cae-host\") pod \"bd7a63f3-7b9b-489f-b957-6d5e10689cae\" (UID: \"bd7a63f3-7b9b-489f-b957-6d5e10689cae\") "
Mar 20 17:03:55 crc kubenswrapper[4730]: I0320 17:03:55.948758    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd7a63f3-7b9b-489f-b957-6d5e10689cae-host" (OuterVolumeSpecName: "host") pod "bd7a63f3-7b9b-489f-b957-6d5e10689cae" (UID: "bd7a63f3-7b9b-489f-b957-6d5e10689cae"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 17:03:55 crc kubenswrapper[4730]: I0320 17:03:55.949905    4730 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd7a63f3-7b9b-489f-b957-6d5e10689cae-host\") on node \"crc\" DevicePath \"\""
Mar 20 17:03:55 crc kubenswrapper[4730]: I0320 17:03:55.960631    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd7a63f3-7b9b-489f-b957-6d5e10689cae-kube-api-access-8bj9r" (OuterVolumeSpecName: "kube-api-access-8bj9r") pod "bd7a63f3-7b9b-489f-b957-6d5e10689cae" (UID: "bd7a63f3-7b9b-489f-b957-6d5e10689cae"). InnerVolumeSpecName "kube-api-access-8bj9r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:03:56 crc kubenswrapper[4730]: I0320 17:03:56.053060    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bj9r\" (UniqueName: \"kubernetes.io/projected/bd7a63f3-7b9b-489f-b957-6d5e10689cae-kube-api-access-8bj9r\") on node \"crc\" DevicePath \"\""
Mar 20 17:03:56 crc kubenswrapper[4730]: I0320 17:03:56.636636    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c374a1f1385b9e2b38e32d1f04faa659c4af4e23397e6bc76bac1e6d675b60a"
Mar 20 17:03:56 crc kubenswrapper[4730]: I0320 17:03:56.636792    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zs57x/crc-debug-xtxgl"
Mar 20 17:03:56 crc kubenswrapper[4730]: I0320 17:03:56.990548    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zs57x/crc-debug-r2nsh"]
Mar 20 17:03:56 crc kubenswrapper[4730]: E0320 17:03:56.990977    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd7a63f3-7b9b-489f-b957-6d5e10689cae" containerName="container-00"
Mar 20 17:03:56 crc kubenswrapper[4730]: I0320 17:03:56.990990    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd7a63f3-7b9b-489f-b957-6d5e10689cae" containerName="container-00"
Mar 20 17:03:56 crc kubenswrapper[4730]: I0320 17:03:56.991224    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd7a63f3-7b9b-489f-b957-6d5e10689cae" containerName="container-00"
Mar 20 17:03:56 crc kubenswrapper[4730]: I0320 17:03:56.991904    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zs57x/crc-debug-r2nsh"
Mar 20 17:03:57 crc kubenswrapper[4730]: I0320 17:03:57.078149    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j5cn\" (UniqueName: \"kubernetes.io/projected/9b604a34-be8f-425a-aef6-c3e9581035d7-kube-api-access-2j5cn\") pod \"crc-debug-r2nsh\" (UID: \"9b604a34-be8f-425a-aef6-c3e9581035d7\") " pod="openshift-must-gather-zs57x/crc-debug-r2nsh"
Mar 20 17:03:57 crc kubenswrapper[4730]: I0320 17:03:57.078407    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9b604a34-be8f-425a-aef6-c3e9581035d7-host\") pod \"crc-debug-r2nsh\" (UID: \"9b604a34-be8f-425a-aef6-c3e9581035d7\") " pod="openshift-must-gather-zs57x/crc-debug-r2nsh"
Mar 20 17:03:57 crc kubenswrapper[4730]: I0320 17:03:57.179367    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9b604a34-be8f-425a-aef6-c3e9581035d7-host\") pod \"crc-debug-r2nsh\" (UID: \"9b604a34-be8f-425a-aef6-c3e9581035d7\") " pod="openshift-must-gather-zs57x/crc-debug-r2nsh"
Mar 20 17:03:57 crc kubenswrapper[4730]: I0320 17:03:57.179510    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j5cn\" (UniqueName: \"kubernetes.io/projected/9b604a34-be8f-425a-aef6-c3e9581035d7-kube-api-access-2j5cn\") pod \"crc-debug-r2nsh\" (UID: \"9b604a34-be8f-425a-aef6-c3e9581035d7\") " pod="openshift-must-gather-zs57x/crc-debug-r2nsh"
Mar 20 17:03:57 crc kubenswrapper[4730]: I0320 17:03:57.179847    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9b604a34-be8f-425a-aef6-c3e9581035d7-host\") pod \"crc-debug-r2nsh\" (UID: \"9b604a34-be8f-425a-aef6-c3e9581035d7\") " pod="openshift-must-gather-zs57x/crc-debug-r2nsh"
Mar 20 17:03:57 crc kubenswrapper[4730]: I0320 17:03:57.200082    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j5cn\" (UniqueName: \"kubernetes.io/projected/9b604a34-be8f-425a-aef6-c3e9581035d7-kube-api-access-2j5cn\") pod \"crc-debug-r2nsh\" (UID: \"9b604a34-be8f-425a-aef6-c3e9581035d7\") " pod="openshift-must-gather-zs57x/crc-debug-r2nsh"
Mar 20 17:03:57 crc kubenswrapper[4730]: I0320 17:03:57.310183    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zs57x/crc-debug-r2nsh"
Mar 20 17:03:57 crc kubenswrapper[4730]: I0320 17:03:57.543283    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd7a63f3-7b9b-489f-b957-6d5e10689cae" path="/var/lib/kubelet/pods/bd7a63f3-7b9b-489f-b957-6d5e10689cae/volumes"
Mar 20 17:03:57 crc kubenswrapper[4730]: I0320 17:03:57.647208    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zs57x/crc-debug-r2nsh" event={"ID":"9b604a34-be8f-425a-aef6-c3e9581035d7","Type":"ContainerStarted","Data":"f2a5b11498b3565ac584ae0844e2ddcd32e123d95c3d6fe4c6c0d6653ce556a9"}
Mar 20 17:03:57 crc kubenswrapper[4730]: I0320 17:03:57.647281    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zs57x/crc-debug-r2nsh" event={"ID":"9b604a34-be8f-425a-aef6-c3e9581035d7","Type":"ContainerStarted","Data":"c74b2375ab93ace2ca445822106d4ea1bc9f5071d7fa4bd714394b0009c98bde"}
Mar 20 17:03:57 crc kubenswrapper[4730]: I0320 17:03:57.676529    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zs57x/crc-debug-r2nsh" podStartSLOduration=1.6765088110000002 podStartE2EDuration="1.676508811s" podCreationTimestamp="2026-03-20 17:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:03:57.660394692 +0000 UTC m=+5096.873766071" watchObservedRunningTime="2026-03-20 17:03:57.676508811 +0000 UTC m=+5096.889880180"
Mar 20 17:03:58 crc kubenswrapper[4730]: I0320 17:03:58.658630    4730 generic.go:334] "Generic (PLEG): container finished" podID="9b604a34-be8f-425a-aef6-c3e9581035d7" containerID="f2a5b11498b3565ac584ae0844e2ddcd32e123d95c3d6fe4c6c0d6653ce556a9" exitCode=0
Mar 20 17:03:58 crc kubenswrapper[4730]: I0320 17:03:58.658678    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zs57x/crc-debug-r2nsh" event={"ID":"9b604a34-be8f-425a-aef6-c3e9581035d7","Type":"ContainerDied","Data":"f2a5b11498b3565ac584ae0844e2ddcd32e123d95c3d6fe4c6c0d6653ce556a9"}
Mar 20 17:03:59 crc kubenswrapper[4730]: I0320 17:03:59.786845    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zs57x/crc-debug-r2nsh"
Mar 20 17:03:59 crc kubenswrapper[4730]: I0320 17:03:59.821835    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zs57x/crc-debug-r2nsh"]
Mar 20 17:03:59 crc kubenswrapper[4730]: I0320 17:03:59.831565    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zs57x/crc-debug-r2nsh"]
Mar 20 17:03:59 crc kubenswrapper[4730]: I0320 17:03:59.922768    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9b604a34-be8f-425a-aef6-c3e9581035d7-host\") pod \"9b604a34-be8f-425a-aef6-c3e9581035d7\" (UID: \"9b604a34-be8f-425a-aef6-c3e9581035d7\") "
Mar 20 17:03:59 crc kubenswrapper[4730]: I0320 17:03:59.922867    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b604a34-be8f-425a-aef6-c3e9581035d7-host" (OuterVolumeSpecName: "host") pod "9b604a34-be8f-425a-aef6-c3e9581035d7" (UID: "9b604a34-be8f-425a-aef6-c3e9581035d7"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 17:03:59 crc kubenswrapper[4730]: I0320 17:03:59.922902    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j5cn\" (UniqueName: \"kubernetes.io/projected/9b604a34-be8f-425a-aef6-c3e9581035d7-kube-api-access-2j5cn\") pod \"9b604a34-be8f-425a-aef6-c3e9581035d7\" (UID: \"9b604a34-be8f-425a-aef6-c3e9581035d7\") "
Mar 20 17:03:59 crc kubenswrapper[4730]: I0320 17:03:59.923745    4730 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9b604a34-be8f-425a-aef6-c3e9581035d7-host\") on node \"crc\" DevicePath \"\""
Mar 20 17:03:59 crc kubenswrapper[4730]: I0320 17:03:59.927967    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b604a34-be8f-425a-aef6-c3e9581035d7-kube-api-access-2j5cn" (OuterVolumeSpecName: "kube-api-access-2j5cn") pod "9b604a34-be8f-425a-aef6-c3e9581035d7" (UID: "9b604a34-be8f-425a-aef6-c3e9581035d7"). InnerVolumeSpecName "kube-api-access-2j5cn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:04:00 crc kubenswrapper[4730]: I0320 17:04:00.025432    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j5cn\" (UniqueName: \"kubernetes.io/projected/9b604a34-be8f-425a-aef6-c3e9581035d7-kube-api-access-2j5cn\") on node \"crc\" DevicePath \"\""
Mar 20 17:04:00 crc kubenswrapper[4730]: I0320 17:04:00.144845    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567104-xfdqf"]
Mar 20 17:04:00 crc kubenswrapper[4730]: E0320 17:04:00.145329    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b604a34-be8f-425a-aef6-c3e9581035d7" containerName="container-00"
Mar 20 17:04:00 crc kubenswrapper[4730]: I0320 17:04:00.145346    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b604a34-be8f-425a-aef6-c3e9581035d7" containerName="container-00"
Mar 20 17:04:00 crc kubenswrapper[4730]: I0320 17:04:00.145589    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b604a34-be8f-425a-aef6-c3e9581035d7" containerName="container-00"
Mar 20 17:04:00 crc kubenswrapper[4730]: I0320 17:04:00.146322    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567104-xfdqf"
Mar 20 17:04:00 crc kubenswrapper[4730]: I0320 17:04:00.148868    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 17:04:00 crc kubenswrapper[4730]: I0320 17:04:00.149380    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 17:04:00 crc kubenswrapper[4730]: I0320 17:04:00.149660    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 17:04:00 crc kubenswrapper[4730]: I0320 17:04:00.154683    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567104-xfdqf"]
Mar 20 17:04:00 crc kubenswrapper[4730]: I0320 17:04:00.230461    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxl6s\" (UniqueName: \"kubernetes.io/projected/cb4bb26e-8780-490e-b7d4-5068d41079d5-kube-api-access-fxl6s\") pod \"auto-csr-approver-29567104-xfdqf\" (UID: \"cb4bb26e-8780-490e-b7d4-5068d41079d5\") " pod="openshift-infra/auto-csr-approver-29567104-xfdqf"
Mar 20 17:04:00 crc kubenswrapper[4730]: I0320 17:04:00.331731    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxl6s\" (UniqueName: \"kubernetes.io/projected/cb4bb26e-8780-490e-b7d4-5068d41079d5-kube-api-access-fxl6s\") pod \"auto-csr-approver-29567104-xfdqf\" (UID: \"cb4bb26e-8780-490e-b7d4-5068d41079d5\") " pod="openshift-infra/auto-csr-approver-29567104-xfdqf"
Mar 20 17:04:00 crc kubenswrapper[4730]: I0320 17:04:00.351125    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxl6s\" (UniqueName: \"kubernetes.io/projected/cb4bb26e-8780-490e-b7d4-5068d41079d5-kube-api-access-fxl6s\") pod \"auto-csr-approver-29567104-xfdqf\" (UID: \"cb4bb26e-8780-490e-b7d4-5068d41079d5\") " pod="openshift-infra/auto-csr-approver-29567104-xfdqf"
Mar 20 17:04:00 crc kubenswrapper[4730]: I0320 17:04:00.499972    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567104-xfdqf"
Mar 20 17:04:00 crc kubenswrapper[4730]: I0320 17:04:00.703530    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c74b2375ab93ace2ca445822106d4ea1bc9f5071d7fa4bd714394b0009c98bde"
Mar 20 17:04:00 crc kubenswrapper[4730]: I0320 17:04:00.703633    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zs57x/crc-debug-r2nsh"
Mar 20 17:04:01 crc kubenswrapper[4730]: I0320 17:04:01.021045    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zs57x/crc-debug-v8sjq"]
Mar 20 17:04:01 crc kubenswrapper[4730]: I0320 17:04:01.024787    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zs57x/crc-debug-v8sjq"
Mar 20 17:04:01 crc kubenswrapper[4730]: I0320 17:04:01.062562    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567104-xfdqf"]
Mar 20 17:04:01 crc kubenswrapper[4730]: W0320 17:04:01.064219    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb4bb26e_8780_490e_b7d4_5068d41079d5.slice/crio-02919563296b440dbb8f5883abe6fd55e90bbc6eeb56409984dea21fb85e0519 WatchSource:0}: Error finding container 02919563296b440dbb8f5883abe6fd55e90bbc6eeb56409984dea21fb85e0519: Status 404 returned error can't find the container with id 02919563296b440dbb8f5883abe6fd55e90bbc6eeb56409984dea21fb85e0519
Mar 20 17:04:01 crc kubenswrapper[4730]: I0320 17:04:01.066651    4730 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 17:04:01 crc kubenswrapper[4730]: I0320 17:04:01.149127    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d071ff87-5bd9-451c-a9f5-e23dc9dda0f8-host\") pod \"crc-debug-v8sjq\" (UID: \"d071ff87-5bd9-451c-a9f5-e23dc9dda0f8\") " pod="openshift-must-gather-zs57x/crc-debug-v8sjq"
Mar 20 17:04:01 crc kubenswrapper[4730]: I0320 17:04:01.149211    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz2vg\" (UniqueName: \"kubernetes.io/projected/d071ff87-5bd9-451c-a9f5-e23dc9dda0f8-kube-api-access-fz2vg\") pod \"crc-debug-v8sjq\" (UID: \"d071ff87-5bd9-451c-a9f5-e23dc9dda0f8\") " pod="openshift-must-gather-zs57x/crc-debug-v8sjq"
Mar 20 17:04:01 crc kubenswrapper[4730]: I0320 17:04:01.250709    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz2vg\" (UniqueName: \"kubernetes.io/projected/d071ff87-5bd9-451c-a9f5-e23dc9dda0f8-kube-api-access-fz2vg\") pod \"crc-debug-v8sjq\" (UID: \"d071ff87-5bd9-451c-a9f5-e23dc9dda0f8\") " pod="openshift-must-gather-zs57x/crc-debug-v8sjq"
Mar 20 17:04:01 crc kubenswrapper[4730]: I0320 17:04:01.250890    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d071ff87-5bd9-451c-a9f5-e23dc9dda0f8-host\") pod \"crc-debug-v8sjq\" (UID: \"d071ff87-5bd9-451c-a9f5-e23dc9dda0f8\") " pod="openshift-must-gather-zs57x/crc-debug-v8sjq"
Mar 20 17:04:01 crc kubenswrapper[4730]: I0320 17:04:01.251009    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d071ff87-5bd9-451c-a9f5-e23dc9dda0f8-host\") pod \"crc-debug-v8sjq\" (UID: \"d071ff87-5bd9-451c-a9f5-e23dc9dda0f8\") " pod="openshift-must-gather-zs57x/crc-debug-v8sjq"
Mar 20 17:04:01 crc kubenswrapper[4730]: I0320 17:04:01.277815    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz2vg\" (UniqueName: \"kubernetes.io/projected/d071ff87-5bd9-451c-a9f5-e23dc9dda0f8-kube-api-access-fz2vg\") pod \"crc-debug-v8sjq\" (UID: \"d071ff87-5bd9-451c-a9f5-e23dc9dda0f8\") " pod="openshift-must-gather-zs57x/crc-debug-v8sjq"
Mar 20 17:04:01 crc kubenswrapper[4730]: I0320 17:04:01.338639    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zs57x/crc-debug-v8sjq"
Mar 20 17:04:01 crc kubenswrapper[4730]: W0320 17:04:01.380088    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd071ff87_5bd9_451c_a9f5_e23dc9dda0f8.slice/crio-77307fa29bb7e2a5f9baface29502c291c6437a1eda4d88b6c1a52cb6e7e8cde WatchSource:0}: Error finding container 77307fa29bb7e2a5f9baface29502c291c6437a1eda4d88b6c1a52cb6e7e8cde: Status 404 returned error can't find the container with id 77307fa29bb7e2a5f9baface29502c291c6437a1eda4d88b6c1a52cb6e7e8cde
Mar 20 17:04:01 crc kubenswrapper[4730]: I0320 17:04:01.555225    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b604a34-be8f-425a-aef6-c3e9581035d7" path="/var/lib/kubelet/pods/9b604a34-be8f-425a-aef6-c3e9581035d7/volumes"
Mar 20 17:04:01 crc kubenswrapper[4730]: I0320 17:04:01.713471    4730 generic.go:334] "Generic (PLEG): container finished" podID="d071ff87-5bd9-451c-a9f5-e23dc9dda0f8" containerID="1ed06aadad6b1efaa33beb6ea1265b5a0894dda501160b5c7cfc4bd8932d7469" exitCode=0
Mar 20 17:04:01 crc kubenswrapper[4730]: I0320 17:04:01.713565    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zs57x/crc-debug-v8sjq" event={"ID":"d071ff87-5bd9-451c-a9f5-e23dc9dda0f8","Type":"ContainerDied","Data":"1ed06aadad6b1efaa33beb6ea1265b5a0894dda501160b5c7cfc4bd8932d7469"}
Mar 20 17:04:01 crc kubenswrapper[4730]: I0320 17:04:01.713878    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zs57x/crc-debug-v8sjq" event={"ID":"d071ff87-5bd9-451c-a9f5-e23dc9dda0f8","Type":"ContainerStarted","Data":"77307fa29bb7e2a5f9baface29502c291c6437a1eda4d88b6c1a52cb6e7e8cde"}
Mar 20 17:04:01 crc kubenswrapper[4730]: I0320 17:04:01.714981    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567104-xfdqf" event={"ID":"cb4bb26e-8780-490e-b7d4-5068d41079d5","Type":"ContainerStarted","Data":"02919563296b440dbb8f5883abe6fd55e90bbc6eeb56409984dea21fb85e0519"}
Mar 20 17:04:01 crc kubenswrapper[4730]: I0320 17:04:01.758296    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zs57x/crc-debug-v8sjq"]
Mar 20 17:04:01 crc kubenswrapper[4730]: I0320 17:04:01.768297    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zs57x/crc-debug-v8sjq"]
Mar 20 17:04:02 crc kubenswrapper[4730]: I0320 17:04:02.726698    4730 generic.go:334] "Generic (PLEG): container finished" podID="cb4bb26e-8780-490e-b7d4-5068d41079d5" containerID="570fedc27114561def00c9fd45b45e729130f1ccdb2c50a188548ae1020d9f83" exitCode=0
Mar 20 17:04:02 crc kubenswrapper[4730]: I0320 17:04:02.728605    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567104-xfdqf" event={"ID":"cb4bb26e-8780-490e-b7d4-5068d41079d5","Type":"ContainerDied","Data":"570fedc27114561def00c9fd45b45e729130f1ccdb2c50a188548ae1020d9f83"}
Mar 20 17:04:02 crc kubenswrapper[4730]: I0320 17:04:02.874931    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zs57x/crc-debug-v8sjq"
Mar 20 17:04:03 crc kubenswrapper[4730]: I0320 17:04:03.001627    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d071ff87-5bd9-451c-a9f5-e23dc9dda0f8-host\") pod \"d071ff87-5bd9-451c-a9f5-e23dc9dda0f8\" (UID: \"d071ff87-5bd9-451c-a9f5-e23dc9dda0f8\") "
Mar 20 17:04:03 crc kubenswrapper[4730]: I0320 17:04:03.001821    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz2vg\" (UniqueName: \"kubernetes.io/projected/d071ff87-5bd9-451c-a9f5-e23dc9dda0f8-kube-api-access-fz2vg\") pod \"d071ff87-5bd9-451c-a9f5-e23dc9dda0f8\" (UID: \"d071ff87-5bd9-451c-a9f5-e23dc9dda0f8\") "
Mar 20 17:04:03 crc kubenswrapper[4730]: I0320 17:04:03.002151    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d071ff87-5bd9-451c-a9f5-e23dc9dda0f8-host" (OuterVolumeSpecName: "host") pod "d071ff87-5bd9-451c-a9f5-e23dc9dda0f8" (UID: "d071ff87-5bd9-451c-a9f5-e23dc9dda0f8"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 17:04:03 crc kubenswrapper[4730]: I0320 17:04:03.002519    4730 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d071ff87-5bd9-451c-a9f5-e23dc9dda0f8-host\") on node \"crc\" DevicePath \"\""
Mar 20 17:04:03 crc kubenswrapper[4730]: I0320 17:04:03.018200    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d071ff87-5bd9-451c-a9f5-e23dc9dda0f8-kube-api-access-fz2vg" (OuterVolumeSpecName: "kube-api-access-fz2vg") pod "d071ff87-5bd9-451c-a9f5-e23dc9dda0f8" (UID: "d071ff87-5bd9-451c-a9f5-e23dc9dda0f8"). InnerVolumeSpecName "kube-api-access-fz2vg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:04:03 crc kubenswrapper[4730]: I0320 17:04:03.105254    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fz2vg\" (UniqueName: \"kubernetes.io/projected/d071ff87-5bd9-451c-a9f5-e23dc9dda0f8-kube-api-access-fz2vg\") on node \"crc\" DevicePath \"\""
Mar 20 17:04:03 crc kubenswrapper[4730]: I0320 17:04:03.553868    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d071ff87-5bd9-451c-a9f5-e23dc9dda0f8" path="/var/lib/kubelet/pods/d071ff87-5bd9-451c-a9f5-e23dc9dda0f8/volumes"
Mar 20 17:04:03 crc kubenswrapper[4730]: I0320 17:04:03.740462    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zs57x/crc-debug-v8sjq"
Mar 20 17:04:03 crc kubenswrapper[4730]: I0320 17:04:03.741296    4730 scope.go:117] "RemoveContainer" containerID="1ed06aadad6b1efaa33beb6ea1265b5a0894dda501160b5c7cfc4bd8932d7469"
Mar 20 17:04:04 crc kubenswrapper[4730]: I0320 17:04:04.147953    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567104-xfdqf"
Mar 20 17:04:04 crc kubenswrapper[4730]: I0320 17:04:04.336503    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxl6s\" (UniqueName: \"kubernetes.io/projected/cb4bb26e-8780-490e-b7d4-5068d41079d5-kube-api-access-fxl6s\") pod \"cb4bb26e-8780-490e-b7d4-5068d41079d5\" (UID: \"cb4bb26e-8780-490e-b7d4-5068d41079d5\") "
Mar 20 17:04:04 crc kubenswrapper[4730]: I0320 17:04:04.342865    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb4bb26e-8780-490e-b7d4-5068d41079d5-kube-api-access-fxl6s" (OuterVolumeSpecName: "kube-api-access-fxl6s") pod "cb4bb26e-8780-490e-b7d4-5068d41079d5" (UID: "cb4bb26e-8780-490e-b7d4-5068d41079d5"). InnerVolumeSpecName "kube-api-access-fxl6s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:04:04 crc kubenswrapper[4730]: I0320 17:04:04.439750    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxl6s\" (UniqueName: \"kubernetes.io/projected/cb4bb26e-8780-490e-b7d4-5068d41079d5-kube-api-access-fxl6s\") on node \"crc\" DevicePath \"\""
Mar 20 17:04:04 crc kubenswrapper[4730]: I0320 17:04:04.764653    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567104-xfdqf" event={"ID":"cb4bb26e-8780-490e-b7d4-5068d41079d5","Type":"ContainerDied","Data":"02919563296b440dbb8f5883abe6fd55e90bbc6eeb56409984dea21fb85e0519"}
Mar 20 17:04:04 crc kubenswrapper[4730]: I0320 17:04:04.765047    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02919563296b440dbb8f5883abe6fd55e90bbc6eeb56409984dea21fb85e0519"
Mar 20 17:04:04 crc kubenswrapper[4730]: I0320 17:04:04.765134    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567104-xfdqf"
Mar 20 17:04:05 crc kubenswrapper[4730]: I0320 17:04:05.221006    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567098-4wf7k"]
Mar 20 17:04:05 crc kubenswrapper[4730]: I0320 17:04:05.232080    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567098-4wf7k"]
Mar 20 17:04:05 crc kubenswrapper[4730]: I0320 17:04:05.542891    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8d7ea06-69cc-41b0-afc3-fb5f3e55049b" path="/var/lib/kubelet/pods/a8d7ea06-69cc-41b0-afc3-fb5f3e55049b/volumes"
Mar 20 17:04:12 crc kubenswrapper[4730]: I0320 17:04:12.879836    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 17:04:12 crc kubenswrapper[4730]: I0320 17:04:12.880437    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 17:04:29 crc kubenswrapper[4730]: I0320 17:04:29.184191    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fcd2p"]
Mar 20 17:04:29 crc kubenswrapper[4730]: E0320 17:04:29.185005    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb4bb26e-8780-490e-b7d4-5068d41079d5" containerName="oc"
Mar 20 17:04:29 crc kubenswrapper[4730]: I0320 17:04:29.185016    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb4bb26e-8780-490e-b7d4-5068d41079d5" containerName="oc"
Mar 20 17:04:29 crc kubenswrapper[4730]: E0320 17:04:29.185048    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d071ff87-5bd9-451c-a9f5-e23dc9dda0f8" containerName="container-00"
Mar 20 17:04:29 crc kubenswrapper[4730]: I0320 17:04:29.185053    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="d071ff87-5bd9-451c-a9f5-e23dc9dda0f8" containerName="container-00"
Mar 20 17:04:29 crc kubenswrapper[4730]: I0320 17:04:29.185326    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="d071ff87-5bd9-451c-a9f5-e23dc9dda0f8" containerName="container-00"
Mar 20 17:04:29 crc kubenswrapper[4730]: I0320 17:04:29.185346    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb4bb26e-8780-490e-b7d4-5068d41079d5" containerName="oc"
Mar 20 17:04:29 crc kubenswrapper[4730]: I0320 17:04:29.186750    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fcd2p"
Mar 20 17:04:29 crc kubenswrapper[4730]: I0320 17:04:29.213755    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fcd2p"]
Mar 20 17:04:29 crc kubenswrapper[4730]: I0320 17:04:29.386488    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfe373a9-f4c2-49e5-a90e-9e50cae59b9d-utilities\") pod \"community-operators-fcd2p\" (UID: \"bfe373a9-f4c2-49e5-a90e-9e50cae59b9d\") " pod="openshift-marketplace/community-operators-fcd2p"
Mar 20 17:04:29 crc kubenswrapper[4730]: I0320 17:04:29.386547    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw8xq\" (UniqueName: \"kubernetes.io/projected/bfe373a9-f4c2-49e5-a90e-9e50cae59b9d-kube-api-access-sw8xq\") pod \"community-operators-fcd2p\" (UID: \"bfe373a9-f4c2-49e5-a90e-9e50cae59b9d\") " pod="openshift-marketplace/community-operators-fcd2p"
Mar 20 17:04:29 crc kubenswrapper[4730]: I0320 17:04:29.386645    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfe373a9-f4c2-49e5-a90e-9e50cae59b9d-catalog-content\") pod \"community-operators-fcd2p\" (UID: \"bfe373a9-f4c2-49e5-a90e-9e50cae59b9d\") " pod="openshift-marketplace/community-operators-fcd2p"
Mar 20 17:04:29 crc kubenswrapper[4730]: I0320 17:04:29.488430    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfe373a9-f4c2-49e5-a90e-9e50cae59b9d-catalog-content\") pod \"community-operators-fcd2p\" (UID: \"bfe373a9-f4c2-49e5-a90e-9e50cae59b9d\") " pod="openshift-marketplace/community-operators-fcd2p"
Mar 20 17:04:29 crc kubenswrapper[4730]: I0320 17:04:29.488543    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfe373a9-f4c2-49e5-a90e-9e50cae59b9d-utilities\") pod \"community-operators-fcd2p\" (UID: \"bfe373a9-f4c2-49e5-a90e-9e50cae59b9d\") " pod="openshift-marketplace/community-operators-fcd2p"
Mar 20 17:04:29 crc kubenswrapper[4730]: I0320 17:04:29.488583    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw8xq\" (UniqueName: \"kubernetes.io/projected/bfe373a9-f4c2-49e5-a90e-9e50cae59b9d-kube-api-access-sw8xq\") pod \"community-operators-fcd2p\" (UID: \"bfe373a9-f4c2-49e5-a90e-9e50cae59b9d\") " pod="openshift-marketplace/community-operators-fcd2p"
Mar 20 17:04:29 crc kubenswrapper[4730]: I0320 17:04:29.489084    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfe373a9-f4c2-49e5-a90e-9e50cae59b9d-catalog-content\") pod \"community-operators-fcd2p\" (UID: \"bfe373a9-f4c2-49e5-a90e-9e50cae59b9d\") " pod="openshift-marketplace/community-operators-fcd2p"
Mar 20 17:04:29 crc kubenswrapper[4730]: I0320 17:04:29.489099    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfe373a9-f4c2-49e5-a90e-9e50cae59b9d-utilities\") pod \"community-operators-fcd2p\" (UID: \"bfe373a9-f4c2-49e5-a90e-9e50cae59b9d\") " pod="openshift-marketplace/community-operators-fcd2p"
Mar 20 17:04:29 crc kubenswrapper[4730]: I0320 17:04:29.515675    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw8xq\" (UniqueName: \"kubernetes.io/projected/bfe373a9-f4c2-49e5-a90e-9e50cae59b9d-kube-api-access-sw8xq\") pod \"community-operators-fcd2p\" (UID: \"bfe373a9-f4c2-49e5-a90e-9e50cae59b9d\") " pod="openshift-marketplace/community-operators-fcd2p"
Mar 20 17:04:29 crc kubenswrapper[4730]: I0320 17:04:29.804342    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fcd2p"
Mar 20 17:04:30 crc kubenswrapper[4730]: I0320 17:04:30.267555    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fcd2p"]
Mar 20 17:04:31 crc kubenswrapper[4730]: I0320 17:04:31.051094    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcd2p" event={"ID":"bfe373a9-f4c2-49e5-a90e-9e50cae59b9d","Type":"ContainerStarted","Data":"8cabdbdc57eeb19551cd55bb8a1050f15d88aad3e277347b9124a8a7e866177e"}
Mar 20 17:04:32 crc kubenswrapper[4730]: I0320 17:04:32.060708    4730 generic.go:334] "Generic (PLEG): container finished" podID="bfe373a9-f4c2-49e5-a90e-9e50cae59b9d" containerID="59aa36d9c92972c8823e6afd5423c8f1fd3bae685e916c413b590cf24dcc5d7f" exitCode=0
Mar 20 17:04:32 crc kubenswrapper[4730]: I0320 17:04:32.060818    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcd2p" event={"ID":"bfe373a9-f4c2-49e5-a90e-9e50cae59b9d","Type":"ContainerDied","Data":"59aa36d9c92972c8823e6afd5423c8f1fd3bae685e916c413b590cf24dcc5d7f"}
Mar 20 17:04:34 crc kubenswrapper[4730]: I0320 17:04:34.082157    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcd2p" event={"ID":"bfe373a9-f4c2-49e5-a90e-9e50cae59b9d","Type":"ContainerStarted","Data":"96aab3d66b9ef5da28cba835dbe221b8f36436bf1099b8ede616d1131a49af7b"}
Mar 20 17:04:35 crc kubenswrapper[4730]: I0320 17:04:35.093376    4730 generic.go:334] "Generic (PLEG): container finished" podID="bfe373a9-f4c2-49e5-a90e-9e50cae59b9d" containerID="96aab3d66b9ef5da28cba835dbe221b8f36436bf1099b8ede616d1131a49af7b" exitCode=0
Mar 20 17:04:35 crc kubenswrapper[4730]: I0320 17:04:35.093430    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcd2p" event={"ID":"bfe373a9-f4c2-49e5-a90e-9e50cae59b9d","Type":"ContainerDied","Data":"96aab3d66b9ef5da28cba835dbe221b8f36436bf1099b8ede616d1131a49af7b"}
Mar 20 17:04:36 crc kubenswrapper[4730]: I0320 17:04:36.105565    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcd2p" event={"ID":"bfe373a9-f4c2-49e5-a90e-9e50cae59b9d","Type":"ContainerStarted","Data":"537fe8bc77ad4b9ecb53781873569e461fd7df07fd39d0f5686d297ff949b58f"}
Mar 20 17:04:36 crc kubenswrapper[4730]: I0320 17:04:36.129238    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fcd2p" podStartSLOduration=3.497064389 podStartE2EDuration="7.129215173s" podCreationTimestamp="2026-03-20 17:04:29 +0000 UTC" firstStartedPulling="2026-03-20 17:04:32.062453298 +0000 UTC m=+5131.275824667" lastFinishedPulling="2026-03-20 17:04:35.694604042 +0000 UTC m=+5134.907975451" observedRunningTime="2026-03-20 17:04:36.121997597 +0000 UTC m=+5135.335369006" watchObservedRunningTime="2026-03-20 17:04:36.129215173 +0000 UTC m=+5135.342586572"
Mar 20 17:04:39 crc kubenswrapper[4730]: I0320 17:04:39.806079    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fcd2p"
Mar 20 17:04:39 crc kubenswrapper[4730]: I0320 17:04:39.807223    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fcd2p"
Mar 20 17:04:39 crc kubenswrapper[4730]: I0320 17:04:39.894184    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fcd2p"
Mar 20 17:04:40 crc kubenswrapper[4730]: I0320 17:04:40.248858    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fcd2p"
Mar 20 17:04:40 crc kubenswrapper[4730]: I0320 17:04:40.331474    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fcd2p"]
Mar 20 17:04:42 crc kubenswrapper[4730]: I0320 17:04:42.179473    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fcd2p" podUID="bfe373a9-f4c2-49e5-a90e-9e50cae59b9d" containerName="registry-server" containerID="cri-o://537fe8bc77ad4b9ecb53781873569e461fd7df07fd39d0f5686d297ff949b58f" gracePeriod=2
Mar 20 17:04:42 crc kubenswrapper[4730]: I0320 17:04:42.651074    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fcd2p"
Mar 20 17:04:42 crc kubenswrapper[4730]: I0320 17:04:42.785999    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfe373a9-f4c2-49e5-a90e-9e50cae59b9d-catalog-content\") pod \"bfe373a9-f4c2-49e5-a90e-9e50cae59b9d\" (UID: \"bfe373a9-f4c2-49e5-a90e-9e50cae59b9d\") "
Mar 20 17:04:42 crc kubenswrapper[4730]: I0320 17:04:42.786170    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sw8xq\" (UniqueName: \"kubernetes.io/projected/bfe373a9-f4c2-49e5-a90e-9e50cae59b9d-kube-api-access-sw8xq\") pod \"bfe373a9-f4c2-49e5-a90e-9e50cae59b9d\" (UID: \"bfe373a9-f4c2-49e5-a90e-9e50cae59b9d\") "
Mar 20 17:04:42 crc kubenswrapper[4730]: I0320 17:04:42.786237    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfe373a9-f4c2-49e5-a90e-9e50cae59b9d-utilities\") pod \"bfe373a9-f4c2-49e5-a90e-9e50cae59b9d\" (UID: \"bfe373a9-f4c2-49e5-a90e-9e50cae59b9d\") "
Mar 20 17:04:42 crc kubenswrapper[4730]: I0320 17:04:42.788301    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfe373a9-f4c2-49e5-a90e-9e50cae59b9d-utilities" (OuterVolumeSpecName: "utilities") pod "bfe373a9-f4c2-49e5-a90e-9e50cae59b9d" (UID: "bfe373a9-f4c2-49e5-a90e-9e50cae59b9d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:04:42 crc kubenswrapper[4730]: I0320 17:04:42.795529    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfe373a9-f4c2-49e5-a90e-9e50cae59b9d-kube-api-access-sw8xq" (OuterVolumeSpecName: "kube-api-access-sw8xq") pod "bfe373a9-f4c2-49e5-a90e-9e50cae59b9d" (UID: "bfe373a9-f4c2-49e5-a90e-9e50cae59b9d"). InnerVolumeSpecName "kube-api-access-sw8xq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:04:42 crc kubenswrapper[4730]: I0320 17:04:42.866415    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfe373a9-f4c2-49e5-a90e-9e50cae59b9d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bfe373a9-f4c2-49e5-a90e-9e50cae59b9d" (UID: "bfe373a9-f4c2-49e5-a90e-9e50cae59b9d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:04:42 crc kubenswrapper[4730]: I0320 17:04:42.879961    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 17:04:42 crc kubenswrapper[4730]: I0320 17:04:42.880025    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 17:04:42 crc kubenswrapper[4730]: I0320 17:04:42.888824    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfe373a9-f4c2-49e5-a90e-9e50cae59b9d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 17:04:42 crc kubenswrapper[4730]: I0320 17:04:42.888879    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sw8xq\" (UniqueName: \"kubernetes.io/projected/bfe373a9-f4c2-49e5-a90e-9e50cae59b9d-kube-api-access-sw8xq\") on node \"crc\" DevicePath \"\""
Mar 20 17:04:42 crc kubenswrapper[4730]: I0320 17:04:42.888896    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfe373a9-f4c2-49e5-a90e-9e50cae59b9d-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 17:04:43 crc kubenswrapper[4730]: I0320 17:04:43.192895    4730 generic.go:334] "Generic (PLEG): container finished" podID="bfe373a9-f4c2-49e5-a90e-9e50cae59b9d" containerID="537fe8bc77ad4b9ecb53781873569e461fd7df07fd39d0f5686d297ff949b58f" exitCode=0
Mar 20 17:04:43 crc kubenswrapper[4730]: I0320 17:04:43.192940    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcd2p" event={"ID":"bfe373a9-f4c2-49e5-a90e-9e50cae59b9d","Type":"ContainerDied","Data":"537fe8bc77ad4b9ecb53781873569e461fd7df07fd39d0f5686d297ff949b58f"}
Mar 20 17:04:43 crc kubenswrapper[4730]: I0320 17:04:43.192972    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcd2p" event={"ID":"bfe373a9-f4c2-49e5-a90e-9e50cae59b9d","Type":"ContainerDied","Data":"8cabdbdc57eeb19551cd55bb8a1050f15d88aad3e277347b9124a8a7e866177e"}
Mar 20 17:04:43 crc kubenswrapper[4730]: I0320 17:04:43.192989    4730 scope.go:117] "RemoveContainer" containerID="537fe8bc77ad4b9ecb53781873569e461fd7df07fd39d0f5686d297ff949b58f"
Mar 20 17:04:43 crc kubenswrapper[4730]: I0320 17:04:43.193031    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fcd2p"
Mar 20 17:04:43 crc kubenswrapper[4730]: I0320 17:04:43.229332    4730 scope.go:117] "RemoveContainer" containerID="96aab3d66b9ef5da28cba835dbe221b8f36436bf1099b8ede616d1131a49af7b"
Mar 20 17:04:43 crc kubenswrapper[4730]: I0320 17:04:43.251918    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fcd2p"]
Mar 20 17:04:43 crc kubenswrapper[4730]: I0320 17:04:43.260816    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fcd2p"]
Mar 20 17:04:43 crc kubenswrapper[4730]: I0320 17:04:43.420954    4730 scope.go:117] "RemoveContainer" containerID="59aa36d9c92972c8823e6afd5423c8f1fd3bae685e916c413b590cf24dcc5d7f"
Mar 20 17:04:43 crc kubenswrapper[4730]: I0320 17:04:43.467058    4730 scope.go:117] "RemoveContainer" containerID="537fe8bc77ad4b9ecb53781873569e461fd7df07fd39d0f5686d297ff949b58f"
Mar 20 17:04:43 crc kubenswrapper[4730]: E0320 17:04:43.467733    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"537fe8bc77ad4b9ecb53781873569e461fd7df07fd39d0f5686d297ff949b58f\": container with ID starting with 537fe8bc77ad4b9ecb53781873569e461fd7df07fd39d0f5686d297ff949b58f not found: ID does not exist" containerID="537fe8bc77ad4b9ecb53781873569e461fd7df07fd39d0f5686d297ff949b58f"
Mar 20 17:04:43 crc kubenswrapper[4730]: I0320 17:04:43.467795    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"537fe8bc77ad4b9ecb53781873569e461fd7df07fd39d0f5686d297ff949b58f"} err="failed to get container status \"537fe8bc77ad4b9ecb53781873569e461fd7df07fd39d0f5686d297ff949b58f\": rpc error: code = NotFound desc = could not find container \"537fe8bc77ad4b9ecb53781873569e461fd7df07fd39d0f5686d297ff949b58f\": container with ID starting with 537fe8bc77ad4b9ecb53781873569e461fd7df07fd39d0f5686d297ff949b58f not found: ID does not exist"
Mar 20 17:04:43 crc kubenswrapper[4730]: I0320 17:04:43.467834    4730 scope.go:117] "RemoveContainer" containerID="96aab3d66b9ef5da28cba835dbe221b8f36436bf1099b8ede616d1131a49af7b"
Mar 20 17:04:43 crc kubenswrapper[4730]: E0320 17:04:43.468355    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96aab3d66b9ef5da28cba835dbe221b8f36436bf1099b8ede616d1131a49af7b\": container with ID starting with 96aab3d66b9ef5da28cba835dbe221b8f36436bf1099b8ede616d1131a49af7b not found: ID does not exist" containerID="96aab3d66b9ef5da28cba835dbe221b8f36436bf1099b8ede616d1131a49af7b"
Mar 20 17:04:43 crc kubenswrapper[4730]: I0320 17:04:43.468429    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96aab3d66b9ef5da28cba835dbe221b8f36436bf1099b8ede616d1131a49af7b"} err="failed to get container status \"96aab3d66b9ef5da28cba835dbe221b8f36436bf1099b8ede616d1131a49af7b\": rpc error: code = NotFound desc = could not find container \"96aab3d66b9ef5da28cba835dbe221b8f36436bf1099b8ede616d1131a49af7b\": container with ID starting with 96aab3d66b9ef5da28cba835dbe221b8f36436bf1099b8ede616d1131a49af7b not found: ID does not exist"
Mar 20 17:04:43 crc kubenswrapper[4730]: I0320 17:04:43.468491    4730 scope.go:117] "RemoveContainer" containerID="59aa36d9c92972c8823e6afd5423c8f1fd3bae685e916c413b590cf24dcc5d7f"
Mar 20 17:04:43 crc kubenswrapper[4730]: E0320 17:04:43.470595    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59aa36d9c92972c8823e6afd5423c8f1fd3bae685e916c413b590cf24dcc5d7f\": container with ID starting with 59aa36d9c92972c8823e6afd5423c8f1fd3bae685e916c413b590cf24dcc5d7f not found: ID does not exist" containerID="59aa36d9c92972c8823e6afd5423c8f1fd3bae685e916c413b590cf24dcc5d7f"
Mar 20 17:04:43 crc kubenswrapper[4730]: I0320 17:04:43.470641    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59aa36d9c92972c8823e6afd5423c8f1fd3bae685e916c413b590cf24dcc5d7f"} err="failed to get container status \"59aa36d9c92972c8823e6afd5423c8f1fd3bae685e916c413b590cf24dcc5d7f\": rpc error: code = NotFound desc = could not find container \"59aa36d9c92972c8823e6afd5423c8f1fd3bae685e916c413b590cf24dcc5d7f\": container with ID starting with 59aa36d9c92972c8823e6afd5423c8f1fd3bae685e916c413b590cf24dcc5d7f not found: ID does not exist"
Mar 20 17:04:43 crc kubenswrapper[4730]: I0320 17:04:43.550159    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfe373a9-f4c2-49e5-a90e-9e50cae59b9d" path="/var/lib/kubelet/pods/bfe373a9-f4c2-49e5-a90e-9e50cae59b9d/volumes"
Mar 20 17:04:47 crc kubenswrapper[4730]: I0320 17:04:47.990731    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-86947bcbc8-94hl8_59460a49-c9fe-46c9-b898-d08234ca7cd3/barbican-api/0.log"
Mar 20 17:04:48 crc kubenswrapper[4730]: I0320 17:04:48.115360    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-86947bcbc8-94hl8_59460a49-c9fe-46c9-b898-d08234ca7cd3/barbican-api-log/0.log"
Mar 20 17:04:48 crc kubenswrapper[4730]: I0320 17:04:48.222592    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-86bc9f54b4-6szxq_e4dfee88-47ff-4e8b-9f46-60cc17fb0080/barbican-keystone-listener/0.log"
Mar 20 17:04:48 crc kubenswrapper[4730]: I0320 17:04:48.282038    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-86bc9f54b4-6szxq_e4dfee88-47ff-4e8b-9f46-60cc17fb0080/barbican-keystone-listener-log/0.log"
Mar 20 17:04:48 crc kubenswrapper[4730]: I0320 17:04:48.399360    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-54b9958865-vn9kj_8f8c40f6-c8d3-4c8c-97eb-643d32774174/barbican-worker/0.log"
Mar 20 17:04:48 crc kubenswrapper[4730]: I0320 17:04:48.429312    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-54b9958865-vn9kj_8f8c40f6-c8d3-4c8c-97eb-643d32774174/barbican-worker-log/0.log"
Mar 20 17:04:48 crc kubenswrapper[4730]: I0320 17:04:48.753297    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2333c0f3-d6ce-405f-b8c8-755be42ba74b/ceilometer-central-agent/0.log"
Mar 20 17:04:48 crc kubenswrapper[4730]: I0320 17:04:48.815382    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2333c0f3-d6ce-405f-b8c8-755be42ba74b/ceilometer-notification-agent/0.log"
Mar 20 17:04:48 crc kubenswrapper[4730]: I0320 17:04:48.924444    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2333c0f3-d6ce-405f-b8c8-755be42ba74b/proxy-httpd/0.log"
Mar 20 17:04:48 crc kubenswrapper[4730]: I0320 17:04:48.929290    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-ppqdh_73c1c649-4459-497e-ba5b-245a4eb5ad04/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 17:04:49 crc kubenswrapper[4730]: I0320 17:04:49.039836    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2333c0f3-d6ce-405f-b8c8-755be42ba74b/sg-core/0.log"
Mar 20 17:04:49 crc kubenswrapper[4730]: I0320 17:04:49.175920    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa/cinder-api-log/0.log"
Mar 20 17:04:49 crc kubenswrapper[4730]: I0320 17:04:49.691940    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_0bde1710-3861-42cb-8647-292785ee4392/probe/0.log"
Mar 20 17:04:49 crc kubenswrapper[4730]: I0320 17:04:49.834518    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_4ae2fc19-770e-471e-a3d4-f0f8c08f4eaa/cinder-api/0.log"
Mar 20 17:04:49 crc kubenswrapper[4730]: I0320 17:04:49.952049    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8ff07e31-53ad-49da-941d-607115f965e0/cinder-scheduler/0.log"
Mar 20 17:04:50 crc kubenswrapper[4730]: I0320 17:04:50.027873    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_0bde1710-3861-42cb-8647-292785ee4392/cinder-backup/0.log"
Mar 20 17:04:50 crc kubenswrapper[4730]: I0320 17:04:50.079118    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8ff07e31-53ad-49da-941d-607115f965e0/probe/0.log"
Mar 20 17:04:50 crc kubenswrapper[4730]: I0320 17:04:50.277117    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_436a7a40-7823-4670-a107-ff5ca02da822/probe/0.log"
Mar 20 17:04:50 crc kubenswrapper[4730]: I0320 17:04:50.552064    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_20675030-52b7-4f1d-b087-d7703a59f5e1/probe/0.log"
Mar 20 17:04:50 crc kubenswrapper[4730]: I0320 17:04:50.603538    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_436a7a40-7823-4670-a107-ff5ca02da822/cinder-volume/0.log"
Mar 20 17:04:50 crc kubenswrapper[4730]: I0320 17:04:50.727299    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_20675030-52b7-4f1d-b087-d7703a59f5e1/cinder-volume/0.log"
Mar 20 17:04:51 crc kubenswrapper[4730]: I0320 17:04:51.012112    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-dff2j_ca62ee94-4983-4acc-856a-3faf59cae3e1/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 17:04:51 crc kubenswrapper[4730]: I0320 17:04:51.085371    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-m89bj_a8c27e63-ebf9-45ff-87b2-4782b20e19e3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 17:04:51 crc kubenswrapper[4730]: I0320 17:04:51.523597    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-9449c877-vxfrw_4139a04b-4804-475f-9da3-6c40dad56690/init/0.log"
Mar 20 17:04:51 crc kubenswrapper[4730]: I0320 17:04:51.724580    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-9449c877-vxfrw_4139a04b-4804-475f-9da3-6c40dad56690/init/0.log"
Mar 20 17:04:51 crc kubenswrapper[4730]: I0320 17:04:51.824660    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-9449c877-vxfrw_4139a04b-4804-475f-9da3-6c40dad56690/dnsmasq-dns/0.log"
Mar 20 17:04:52 crc kubenswrapper[4730]: I0320 17:04:52.048476    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-9p7mz_962231f7-41b6-4754-b63c-523277f7cf50/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 17:04:52 crc kubenswrapper[4730]: I0320 17:04:52.063155    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_47ed5bd7-7aa8-4f16-98de-f09e21218ae6/glance-httpd/0.log"
Mar 20 17:04:52 crc kubenswrapper[4730]: I0320 17:04:52.074054    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_47ed5bd7-7aa8-4f16-98de-f09e21218ae6/glance-log/0.log"
Mar 20 17:04:52 crc kubenswrapper[4730]: I0320 17:04:52.222101    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_84366eea-e5f9-43da-ac65-8e79cb659c0a/glance-httpd/0.log"
Mar 20 17:04:52 crc kubenswrapper[4730]: I0320 17:04:52.243915    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_84366eea-e5f9-43da-ac65-8e79cb659c0a/glance-log/0.log"
Mar 20 17:04:52 crc kubenswrapper[4730]: I0320 17:04:52.388466    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-4d78n_423144fa-9b01-4466-993c-6ab7075e1ad5/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 17:04:52 crc kubenswrapper[4730]: I0320 17:04:52.984954    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29567041-zmx9n_d3747d18-1b1e-4c43-ac1a-efeeb453b1ae/keystone-cron/0.log"
Mar 20 17:04:53 crc kubenswrapper[4730]: I0320 17:04:53.050214    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-dkksx_133f1969-bed7-44cd-9dac-b9dfaa376515/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 17:04:53 crc kubenswrapper[4730]: I0320 17:04:53.168840    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29567101-lzlmb_8a113133-c537-41c7-a14e-614fb8bcd24f/keystone-cron/0.log"
Mar 20 17:04:53 crc kubenswrapper[4730]: I0320 17:04:53.202518    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6fb7949f77-2l9t7_e2b9f0c5-80cc-4a4c-bbd8-c70cda9d5d3d/keystone-api/0.log"
Mar 20 17:04:53 crc kubenswrapper[4730]: I0320 17:04:53.285102    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_2455b53b-7716-45b9-ac24-cd0bd892fbb9/kube-state-metrics/0.log"
Mar 20 17:04:53 crc kubenswrapper[4730]: I0320 17:04:53.896209    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5dc7dd859f-wtxnj_62339bcb-2edc-4881-a15e-a9387442db89/neutron-api/0.log"
Mar 20 17:04:53 crc kubenswrapper[4730]: I0320 17:04:53.931607    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5dc7dd859f-wtxnj_62339bcb-2edc-4881-a15e-a9387442db89/neutron-httpd/0.log"
Mar 20 17:04:54 crc kubenswrapper[4730]: I0320 17:04:54.073213    4730 scope.go:117] "RemoveContainer" containerID="7caef3ce3ae643ea42e6304dc53d81d43ac8e7cc2d51fc8c56b8771cdad2f656"
Mar 20 17:04:54 crc kubenswrapper[4730]: I0320 17:04:54.183966    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_df9ca02d-e20f-4f55-ba14-92b91812afb6/setup-container/0.log"
Mar 20 17:04:54 crc kubenswrapper[4730]: I0320 17:04:54.286355    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-n8bcw_74d70014-6de1-4d90-b04a-8f8376d3a9e0/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 17:04:54 crc kubenswrapper[4730]: I0320 17:04:54.375493    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_df9ca02d-e20f-4f55-ba14-92b91812afb6/setup-container/0.log"
Mar 20 17:04:54 crc kubenswrapper[4730]: I0320 17:04:54.487931    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_df9ca02d-e20f-4f55-ba14-92b91812afb6/rabbitmq/0.log"
Mar 20 17:04:54 crc kubenswrapper[4730]: I0320 17:04:54.664804    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-z4vhj_43d453db-c8fb-438d-927e-6eaee8383df1/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 17:04:55 crc kubenswrapper[4730]: I0320 17:04:55.108983    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_0940bcf4-b3ca-4f1d-92df-5fa9f477c800/nova-cell0-conductor-conductor/0.log"
Mar 20 17:04:55 crc kubenswrapper[4730]: I0320 17:04:55.381679    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_e788dfbe-bc18-46f9-b2bf-674940e1c392/nova-cell1-conductor-conductor/0.log"
Mar 20 17:04:55 crc kubenswrapper[4730]: I0320 17:04:55.686558    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_6586493e-e5d0-4504-b516-ebaac5defd79/nova-cell1-novncproxy-novncproxy/0.log"
Mar 20 17:04:55 crc kubenswrapper[4730]: I0320 17:04:55.815212    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2380321d-63e0-40a2-8ca4-5780cba46259/nova-api-log/0.log"
Mar 20 17:04:56 crc kubenswrapper[4730]: I0320 17:04:56.244393    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a58f453e-84d8-47b1-8740-406f92c4ca79/nova-metadata-log/0.log"
Mar 20 17:04:56 crc kubenswrapper[4730]: I0320 17:04:56.433150    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2380321d-63e0-40a2-8ca4-5780cba46259/nova-api-api/0.log"
Mar 20 17:04:56 crc kubenswrapper[4730]: I0320 17:04:56.880495    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_899bd9ae-9354-4e70-ad37-b438a5a33a24/mysql-bootstrap/0.log"
Mar 20 17:04:56 crc kubenswrapper[4730]: I0320 17:04:56.943295    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_4deff063-ecb8-4cf2-8e94-45ab62a613bc/nova-scheduler-scheduler/0.log"
Mar 20 17:04:57 crc kubenswrapper[4730]: I0320 17:04:57.068194    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_899bd9ae-9354-4e70-ad37-b438a5a33a24/mysql-bootstrap/0.log"
Mar 20 17:04:57 crc kubenswrapper[4730]: I0320 17:04:57.072132    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a58f453e-84d8-47b1-8740-406f92c4ca79/nova-metadata-metadata/0.log"
Mar 20 17:04:57 crc kubenswrapper[4730]: I0320 17:04:57.127146    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_899bd9ae-9354-4e70-ad37-b438a5a33a24/galera/0.log"
Mar 20 17:04:57 crc kubenswrapper[4730]: I0320 17:04:57.214147    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-x4t58_6ffb462f-06f9-49df-bfe7-d41c274d4b05/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 17:04:57 crc kubenswrapper[4730]: I0320 17:04:57.242949    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6abf778f-200f-4d48-97b6-08a638b4efa2/mysql-bootstrap/0.log"
Mar 20 17:04:57 crc kubenswrapper[4730]: I0320 17:04:57.498210    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6abf778f-200f-4d48-97b6-08a638b4efa2/mysql-bootstrap/0.log"
Mar 20 17:04:57 crc kubenswrapper[4730]: I0320 17:04:57.505903    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6abf778f-200f-4d48-97b6-08a638b4efa2/galera/0.log"
Mar 20 17:04:57 crc kubenswrapper[4730]: I0320 17:04:57.540938    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_a893eba7-9715-4599-93c2-0365a45134e9/openstackclient/0.log"
Mar 20 17:04:57 crc kubenswrapper[4730]: I0320 17:04:57.703897    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-gtrnp_31651551-edb9-4793-a752-39fa60a85ee3/ovn-controller/0.log"
Mar 20 17:04:57 crc kubenswrapper[4730]: I0320 17:04:57.751061    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-ktnvd_c615e3c6-d705-46e8-a1e7-c1c86df055f5/openstack-network-exporter/0.log"
Mar 20 17:04:57 crc kubenswrapper[4730]: I0320 17:04:57.948615    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cdd7f_35efb2c2-6521-4f6f-a350-a4dc537ecaf8/ovsdb-server-init/0.log"
Mar 20 17:04:58 crc kubenswrapper[4730]: I0320 17:04:58.157347    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cdd7f_35efb2c2-6521-4f6f-a350-a4dc537ecaf8/ovsdb-server-init/0.log"
Mar 20 17:04:58 crc kubenswrapper[4730]: I0320 17:04:58.179037    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cdd7f_35efb2c2-6521-4f6f-a350-a4dc537ecaf8/ovsdb-server/0.log"
Mar 20 17:04:58 crc kubenswrapper[4730]: I0320 17:04:58.422464    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_4cb9ef9a-6d98-43c1-8e74-7f24ba39357d/openstack-network-exporter/0.log"
Mar 20 17:04:58 crc kubenswrapper[4730]: I0320 17:04:58.525840    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cdd7f_35efb2c2-6521-4f6f-a350-a4dc537ecaf8/ovs-vswitchd/0.log"
Mar 20 17:04:58 crc kubenswrapper[4730]: I0320 17:04:58.551522    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_4cb9ef9a-6d98-43c1-8e74-7f24ba39357d/ovn-northd/0.log"
Mar 20 17:04:58 crc kubenswrapper[4730]: I0320 17:04:58.566198    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-fxwgt_efd41cb9-678e-43d9-8643-b5aa95f1ec3e/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 17:04:58 crc kubenswrapper[4730]: I0320 17:04:58.763479    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_caa1db28-afc0-4abc-aa80-84cccb3d8412/ovsdbserver-nb/0.log"
Mar 20 17:04:58 crc kubenswrapper[4730]: I0320 17:04:58.769832    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_caa1db28-afc0-4abc-aa80-84cccb3d8412/openstack-network-exporter/0.log"
Mar 20 17:04:58 crc kubenswrapper[4730]: I0320 17:04:58.925872    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1ba8c36f-1882-4bb3-bcb5-b3518ce35553/openstack-network-exporter/0.log"
Mar 20 17:04:59 crc kubenswrapper[4730]: I0320 17:04:59.011972    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1ba8c36f-1882-4bb3-bcb5-b3518ce35553/ovsdbserver-sb/0.log"
Mar 20 17:04:59 crc kubenswrapper[4730]: I0320 17:04:59.290653    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_457a736d-6c3f-486d-b8d1-fef19df33e26/init-config-reloader/0.log"
Mar 20 17:04:59 crc kubenswrapper[4730]: I0320 17:04:59.369895    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-78b446cdb6-zs6nw_d2885c5d-681f-4e22-bdeb-b716957d83e1/placement-api/0.log"
Mar 20 17:04:59 crc kubenswrapper[4730]: I0320 17:04:59.373511    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-78b446cdb6-zs6nw_d2885c5d-681f-4e22-bdeb-b716957d83e1/placement-log/0.log"
Mar 20 17:04:59 crc kubenswrapper[4730]: I0320 17:04:59.550956    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_457a736d-6c3f-486d-b8d1-fef19df33e26/init-config-reloader/0.log"
Mar 20 17:04:59 crc kubenswrapper[4730]: I0320 17:04:59.607156    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_457a736d-6c3f-486d-b8d1-fef19df33e26/config-reloader/0.log"
Mar 20 17:04:59 crc kubenswrapper[4730]: I0320 17:04:59.650723    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_457a736d-6c3f-486d-b8d1-fef19df33e26/prometheus/0.log"
Mar 20 17:04:59 crc kubenswrapper[4730]: I0320 17:04:59.694537    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_457a736d-6c3f-486d-b8d1-fef19df33e26/thanos-sidecar/0.log"
Mar 20 17:04:59 crc kubenswrapper[4730]: I0320 17:04:59.816524    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b92f799a-be4e-45a1-9e2e-c93c4992c9ce/setup-container/0.log"
Mar 20 17:05:00 crc kubenswrapper[4730]: I0320 17:05:00.066886    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b92f799a-be4e-45a1-9e2e-c93c4992c9ce/rabbitmq/0.log"
Mar 20 17:05:00 crc kubenswrapper[4730]: I0320 17:05:00.079373    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b92f799a-be4e-45a1-9e2e-c93c4992c9ce/setup-container/0.log"
Mar 20 17:05:00 crc kubenswrapper[4730]: I0320 17:05:00.110742    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_707f8f93-76f2-4472-a015-5dccae194c5e/setup-container/0.log"
Mar 20 17:05:00 crc kubenswrapper[4730]: I0320 17:05:00.369499    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_707f8f93-76f2-4472-a015-5dccae194c5e/setup-container/0.log"
Mar 20 17:05:00 crc kubenswrapper[4730]: I0320 17:05:00.457969    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_707f8f93-76f2-4472-a015-5dccae194c5e/rabbitmq/0.log"
Mar 20 17:05:00 crc kubenswrapper[4730]: I0320 17:05:00.483241    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-fcg5s_b49a7544-a685-49c3-81fa-e1bbec4453ba/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 17:05:01 crc kubenswrapper[4730]: I0320 17:05:01.173725    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-q8dm9_129ce6b6-b215-4ca0-9583-78aae3c2371c/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 17:05:01 crc kubenswrapper[4730]: I0320 17:05:01.236065    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-l27lg_16667e9d-1075-4c26-8002-61c737a8f76a/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 17:05:01 crc kubenswrapper[4730]: I0320 17:05:01.447280    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-ckzj9_0cdca1cb-76d0-4486-8ae9-d67a1ed9b79c/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 17:05:01 crc kubenswrapper[4730]: I0320 17:05:01.494021    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-4mmgp_eebb2eb5-4553-41b0-85e6-81e470576d50/ssh-known-hosts-edpm-deployment/0.log"
Mar 20 17:05:01 crc kubenswrapper[4730]: I0320 17:05:01.728980    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7c5c8ffdd9-xpfhf_b9780622-27f3-4339-8107-321feed5e25b/proxy-server/0.log"
Mar 20 17:05:01 crc kubenswrapper[4730]: I0320 17:05:01.839848    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7c5c8ffdd9-xpfhf_b9780622-27f3-4339-8107-321feed5e25b/proxy-httpd/0.log"
Mar 20 17:05:01 crc kubenswrapper[4730]: I0320 17:05:01.901359    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-7d8lv_167282ce-29fc-44db-9b0b-baf2c956f433/swift-ring-rebalance/0.log"
Mar 20 17:05:02 crc kubenswrapper[4730]: I0320 17:05:02.611935    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2c9def6e-27a0-4543-8d3c-07b3e4005b33/account-auditor/0.log"
Mar 20 17:05:02 crc kubenswrapper[4730]: I0320 17:05:02.738421    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2c9def6e-27a0-4543-8d3c-07b3e4005b33/account-reaper/0.log"
Mar 20 17:05:02 crc kubenswrapper[4730]: I0320 17:05:02.745828    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2c9def6e-27a0-4543-8d3c-07b3e4005b33/account-replicator/0.log"
Mar 20 17:05:02 crc kubenswrapper[4730]: I0320 17:05:02.754068    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2c9def6e-27a0-4543-8d3c-07b3e4005b33/account-server/0.log"
Mar 20 17:05:02 crc kubenswrapper[4730]: I0320 17:05:02.820809    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2c9def6e-27a0-4543-8d3c-07b3e4005b33/container-auditor/0.log"
Mar 20 17:05:02 crc kubenswrapper[4730]: I0320 17:05:02.951987    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2c9def6e-27a0-4543-8d3c-07b3e4005b33/container-replicator/0.log"
Mar 20 17:05:02 crc kubenswrapper[4730]: I0320 17:05:02.959768    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2c9def6e-27a0-4543-8d3c-07b3e4005b33/container-updater/0.log"
Mar 20 17:05:02 crc kubenswrapper[4730]: I0320 17:05:02.986969    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2c9def6e-27a0-4543-8d3c-07b3e4005b33/container-server/0.log"
Mar 20 17:05:03 crc kubenswrapper[4730]: I0320 17:05:03.046072    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2c9def6e-27a0-4543-8d3c-07b3e4005b33/object-auditor/0.log"
Mar 20 17:05:03 crc kubenswrapper[4730]: I0320 17:05:03.159041    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2c9def6e-27a0-4543-8d3c-07b3e4005b33/object-expirer/0.log"
Mar 20 17:05:03 crc kubenswrapper[4730]: I0320 17:05:03.163940    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2c9def6e-27a0-4543-8d3c-07b3e4005b33/object-server/0.log"
Mar 20 17:05:03 crc kubenswrapper[4730]: I0320 17:05:03.243502    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2c9def6e-27a0-4543-8d3c-07b3e4005b33/object-replicator/0.log"
Mar 20 17:05:03 crc kubenswrapper[4730]: I0320 17:05:03.278235    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2c9def6e-27a0-4543-8d3c-07b3e4005b33/object-updater/0.log"
Mar 20 17:05:03 crc kubenswrapper[4730]: I0320 17:05:03.359467    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2c9def6e-27a0-4543-8d3c-07b3e4005b33/rsync/0.log"
Mar 20 17:05:03 crc kubenswrapper[4730]: I0320 17:05:03.429695    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2c9def6e-27a0-4543-8d3c-07b3e4005b33/swift-recon-cron/0.log"
Mar 20 17:05:03 crc kubenswrapper[4730]: I0320 17:05:03.799302    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_d79eb29a-c814-4aa0-a268-2069d58b08d2/test-operator-logs-container/0.log"
Mar 20 17:05:04 crc kubenswrapper[4730]: I0320 17:05:04.075693    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-7pnjt_cdec76f1-b3a6-4c3d-a9d6-3553fd6b1122/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 17:05:04 crc kubenswrapper[4730]: I0320 17:05:04.141301    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_c69a80b5-69a7-48c5-8ad4-5063b6cb4676/tempest-tests-tempest-tests-runner/0.log"
Mar 20 17:05:04 crc kubenswrapper[4730]: I0320 17:05:04.307892    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-mb8wq_884c2fa6-babb-44b8-b8e2-3e4fbce27153/telemetry-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 17:05:04 crc kubenswrapper[4730]: I0320 17:05:04.910098    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_0d05b7e1-a651-404e-89e9-8276427610fc/watcher-applier/0.log"
Mar 20 17:05:05 crc kubenswrapper[4730]: I0320 17:05:05.430166    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_ba310e23-3097-4114-8628-4e7ada94eac6/watcher-api-log/0.log"
Mar 20 17:05:07 crc kubenswrapper[4730]: I0320 17:05:07.850381    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_4adb002b-165b-4e7c-9e26-0a98f30dd467/watcher-decision-engine/0.log"
Mar 20 17:05:08 crc kubenswrapper[4730]: I0320 17:05:08.748317    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_ba310e23-3097-4114-8628-4e7ada94eac6/watcher-api/0.log"
Mar 20 17:05:12 crc kubenswrapper[4730]: I0320 17:05:12.883099    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 17:05:12 crc kubenswrapper[4730]: I0320 17:05:12.883659    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 17:05:12 crc kubenswrapper[4730]: I0320 17:05:12.883711    4730 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf"
Mar 20 17:05:12 crc kubenswrapper[4730]: I0320 17:05:12.884569    4730 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c3ac4abf290606f9ec67064e3bf0182c8ecb8c9be4ecf85a1bb60feb02fd27e6"} pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 17:05:12 crc kubenswrapper[4730]: I0320 17:05:12.884620    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" containerID="cri-o://c3ac4abf290606f9ec67064e3bf0182c8ecb8c9be4ecf85a1bb60feb02fd27e6" gracePeriod=600
Mar 20 17:05:13 crc kubenswrapper[4730]: I0320 17:05:13.483909    4730 generic.go:334] "Generic (PLEG): container finished" podID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerID="c3ac4abf290606f9ec67064e3bf0182c8ecb8c9be4ecf85a1bb60feb02fd27e6" exitCode=0
Mar 20 17:05:13 crc kubenswrapper[4730]: I0320 17:05:13.483992    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerDied","Data":"c3ac4abf290606f9ec67064e3bf0182c8ecb8c9be4ecf85a1bb60feb02fd27e6"}
Mar 20 17:05:13 crc kubenswrapper[4730]: I0320 17:05:13.484593    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerStarted","Data":"581aa02672219ffcaf2185db619f24c1c4e9f4dc71ab9a090d63ff524fc5bbca"}
Mar 20 17:05:13 crc kubenswrapper[4730]: I0320 17:05:13.484654    4730 scope.go:117] "RemoveContainer" containerID="96fa1a39363ce580a7d7ba62fa4f551be70b7f98a5cbd49713d8269774ccd8b9"
Mar 20 17:05:20 crc kubenswrapper[4730]: I0320 17:05:20.960975    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_84bbdebb-43de-41d6-82d4-71b0948c25f8/memcached/0.log"
Mar 20 17:05:39 crc kubenswrapper[4730]: I0320 17:05:39.620352    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt_8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a/util/0.log"
Mar 20 17:05:39 crc kubenswrapper[4730]: I0320 17:05:39.812948    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt_8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a/pull/0.log"
Mar 20 17:05:39 crc kubenswrapper[4730]: I0320 17:05:39.822165    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt_8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a/util/0.log"
Mar 20 17:05:39 crc kubenswrapper[4730]: I0320 17:05:39.839669    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt_8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a/pull/0.log"
Mar 20 17:05:40 crc kubenswrapper[4730]: I0320 17:05:40.063284    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt_8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a/util/0.log"
Mar 20 17:05:40 crc kubenswrapper[4730]: I0320 17:05:40.064266    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt_8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a/extract/0.log"
Mar 20 17:05:40 crc kubenswrapper[4730]: I0320 17:05:40.074019    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6e5835fdb519dbfa6963d219a5ec71b0d73708b5df93d36b8aba1618bammpbt_8b2334c1-9644-4fd3-9ea3-984ebcd8dc5a/pull/0.log"
Mar 20 17:05:40 crc kubenswrapper[4730]: I0320 17:05:40.283204    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-dmd8z_4fb51ed6-04e3-40db-ab21-eb0fe66442fe/manager/0.log"
Mar 20 17:05:40 crc kubenswrapper[4730]: I0320 17:05:40.480131    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-nwwzc_e8ad6f56-863f-473b-a4d4-d4f70d9489a4/manager/0.log"
Mar 20 17:05:40 crc kubenswrapper[4730]: I0320 17:05:40.616815    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-llp6b_d658514c-f369-4ce2-ad50-d055fd208694/manager/0.log"
Mar 20 17:05:40 crc kubenswrapper[4730]: I0320 17:05:40.766022    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-v96m5_acffaecc-dd6c-4819-91cf-99c5d0154143/manager/0.log"
Mar 20 17:05:40 crc kubenswrapper[4730]: I0320 17:05:40.951499    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-pf8sw_f733406e-5258-4cfe-870d-4fb86152363e/manager/0.log"
Mar 20 17:05:41 crc kubenswrapper[4730]: I0320 17:05:41.209301    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-9k6lh_24280954-941c-445f-aa52-e360ce544046/manager/0.log"
Mar 20 17:05:41 crc kubenswrapper[4730]: I0320 17:05:41.521649    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-g4kgd_cf3ded14-d81b-4384-93e4-e51cde6a31ec/manager/0.log"
Mar 20 17:05:41 crc kubenswrapper[4730]: I0320 17:05:41.565181    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-4pkr9_d8b68e41-b53d-4fb3-8a86-0c604cda0e46/manager/0.log"
Mar 20 17:05:41 crc kubenswrapper[4730]: I0320 17:05:41.758340    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-bqjxs_87b37583-ab1d-4f9e-98e9-8cb9bdcc5165/manager/0.log"
Mar 20 17:05:41 crc kubenswrapper[4730]: I0320 17:05:41.899488    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-rnx2d_19a5ba3c-9f89-43f6-bd55-6998df2e3533/manager/0.log"
Mar 20 17:05:42 crc kubenswrapper[4730]: I0320 17:05:42.093985    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-wqqnd_c5aaa9e9-aebc-4daa-b7ab-c6064b5a78ef/manager/0.log"
Mar 20 17:05:42 crc kubenswrapper[4730]: I0320 17:05:42.100140    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-xw6kk_61755ffd-de91-4a38-a174-fe1a4c57dfd0/manager/0.log"
Mar 20 17:05:42 crc kubenswrapper[4730]: I0320 17:05:42.271650    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-l7v9q_36dd23cb-43b2-4c25-9e24-3e2f69f93eff/manager/0.log"
Mar 20 17:05:42 crc kubenswrapper[4730]: I0320 17:05:42.299407    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-w8x5z_d7ad408f-56db-4b5b-bea9-ba821eae2b80/manager/0.log"
Mar 20 17:05:42 crc kubenswrapper[4730]: I0320 17:05:42.455875    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-f8l2x_8f74be61-d309-417c-90a3-2962b57071c4/manager/0.log"
Mar 20 17:05:42 crc kubenswrapper[4730]: I0320 17:05:42.598179    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-646f48576b-5p6h9_d85bb2c7-8dba-4091-a6cf-12cf58bf64a9/operator/0.log"
Mar 20 17:05:42 crc kubenswrapper[4730]: I0320 17:05:42.845455    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-f9xcd_d125a115-3173-4a52-8794-2832951fa428/registry-server/0.log"
Mar 20 17:05:43 crc kubenswrapper[4730]: I0320 17:05:43.071779    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-t7kkm_6944c865-92a4-441c-907b-27424898cb99/manager/0.log"
Mar 20 17:05:43 crc kubenswrapper[4730]: I0320 17:05:43.114003    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-lt49w_9944d85d-4f1c-4312-ac57-49ee75a8fd16/manager/0.log"
Mar 20 17:05:43 crc kubenswrapper[4730]: I0320 17:05:43.411853    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-mdzv5_82cae974-2029-42c3-81bf-e9bee167e991/operator/0.log"
Mar 20 17:05:43 crc kubenswrapper[4730]: I0320 17:05:43.899691    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6f58c59cbb-76ssq_c66e7fcc-f4ab-4d70-ad2b-b9186a4a2008/manager/0.log"
Mar 20 17:05:43 crc kubenswrapper[4730]: I0320 17:05:43.984782    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-6f2w8_db4a9305-eefd-4804-ac7a-4d811bd928f5/manager/0.log"
Mar 20 17:05:44 crc kubenswrapper[4730]: I0320 17:05:44.142876    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-bm7hr_92c29eff-b9ab-4420-86c6-6b388cfc87af/manager/0.log"
Mar 20 17:05:44 crc kubenswrapper[4730]: I0320 17:05:44.162966    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-lrpjm_cdbd62c8-9960-4257-87d9-d4923c7ef8dd/manager/0.log"
Mar 20 17:05:44 crc kubenswrapper[4730]: I0320 17:05:44.333910    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c5858c67b-cfmtk_f00b4813-358d-49c4-bf9d-486e35f5a94f/manager/0.log"
Mar 20 17:06:00 crc kubenswrapper[4730]: I0320 17:06:00.154361    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567106-pg9p8"]
Mar 20 17:06:00 crc kubenswrapper[4730]: E0320 17:06:00.155776    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfe373a9-f4c2-49e5-a90e-9e50cae59b9d" containerName="registry-server"
Mar 20 17:06:00 crc kubenswrapper[4730]: I0320 17:06:00.155806    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfe373a9-f4c2-49e5-a90e-9e50cae59b9d" containerName="registry-server"
Mar 20 17:06:00 crc kubenswrapper[4730]: E0320 17:06:00.155827    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfe373a9-f4c2-49e5-a90e-9e50cae59b9d" containerName="extract-utilities"
Mar 20 17:06:00 crc kubenswrapper[4730]: I0320 17:06:00.155840    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfe373a9-f4c2-49e5-a90e-9e50cae59b9d" containerName="extract-utilities"
Mar 20 17:06:00 crc kubenswrapper[4730]: E0320 17:06:00.155893    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfe373a9-f4c2-49e5-a90e-9e50cae59b9d" containerName="extract-content"
Mar 20 17:06:00 crc kubenswrapper[4730]: I0320 17:06:00.155907    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfe373a9-f4c2-49e5-a90e-9e50cae59b9d" containerName="extract-content"
Mar 20 17:06:00 crc kubenswrapper[4730]: I0320 17:06:00.156282    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfe373a9-f4c2-49e5-a90e-9e50cae59b9d" containerName="registry-server"
Mar 20 17:06:00 crc kubenswrapper[4730]: I0320 17:06:00.157496    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567106-pg9p8"
Mar 20 17:06:00 crc kubenswrapper[4730]: I0320 17:06:00.159452    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 17:06:00 crc kubenswrapper[4730]: I0320 17:06:00.159654    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 17:06:00 crc kubenswrapper[4730]: I0320 17:06:00.159948    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 17:06:00 crc kubenswrapper[4730]: I0320 17:06:00.163266    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567106-pg9p8"]
Mar 20 17:06:00 crc kubenswrapper[4730]: I0320 17:06:00.220420    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jdhf\" (UniqueName: \"kubernetes.io/projected/9695c6e8-6be3-4465-95a1-887c6a568fb7-kube-api-access-4jdhf\") pod \"auto-csr-approver-29567106-pg9p8\" (UID: \"9695c6e8-6be3-4465-95a1-887c6a568fb7\") " pod="openshift-infra/auto-csr-approver-29567106-pg9p8"
Mar 20 17:06:00 crc kubenswrapper[4730]: I0320 17:06:00.323311    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jdhf\" (UniqueName: \"kubernetes.io/projected/9695c6e8-6be3-4465-95a1-887c6a568fb7-kube-api-access-4jdhf\") pod \"auto-csr-approver-29567106-pg9p8\" (UID: \"9695c6e8-6be3-4465-95a1-887c6a568fb7\") " pod="openshift-infra/auto-csr-approver-29567106-pg9p8"
Mar 20 17:06:00 crc kubenswrapper[4730]: I0320 17:06:00.345963    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jdhf\" (UniqueName: \"kubernetes.io/projected/9695c6e8-6be3-4465-95a1-887c6a568fb7-kube-api-access-4jdhf\") pod \"auto-csr-approver-29567106-pg9p8\" (UID: \"9695c6e8-6be3-4465-95a1-887c6a568fb7\") " pod="openshift-infra/auto-csr-approver-29567106-pg9p8"
Mar 20 17:06:00 crc kubenswrapper[4730]: I0320 17:06:00.488923    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567106-pg9p8"
Mar 20 17:06:00 crc kubenswrapper[4730]: I0320 17:06:00.991931    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567106-pg9p8"]
Mar 20 17:06:01 crc kubenswrapper[4730]: I0320 17:06:01.035064    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567106-pg9p8" event={"ID":"9695c6e8-6be3-4465-95a1-887c6a568fb7","Type":"ContainerStarted","Data":"d0e8ff4e8f7264bc67ea642d0b97ca002626697a4ef1bbaa4564b31ddd2b7e19"}
Mar 20 17:06:03 crc kubenswrapper[4730]: I0320 17:06:03.059215    4730 generic.go:334] "Generic (PLEG): container finished" podID="9695c6e8-6be3-4465-95a1-887c6a568fb7" containerID="e24f74530672b4126d2aef0eaec17c584c50b3452f9280f1c1dd7481992b500e" exitCode=0
Mar 20 17:06:03 crc kubenswrapper[4730]: I0320 17:06:03.059283    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567106-pg9p8" event={"ID":"9695c6e8-6be3-4465-95a1-887c6a568fb7","Type":"ContainerDied","Data":"e24f74530672b4126d2aef0eaec17c584c50b3452f9280f1c1dd7481992b500e"}
Mar 20 17:06:04 crc kubenswrapper[4730]: I0320 17:06:04.481795    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567106-pg9p8"
Mar 20 17:06:04 crc kubenswrapper[4730]: I0320 17:06:04.523503    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jdhf\" (UniqueName: \"kubernetes.io/projected/9695c6e8-6be3-4465-95a1-887c6a568fb7-kube-api-access-4jdhf\") pod \"9695c6e8-6be3-4465-95a1-887c6a568fb7\" (UID: \"9695c6e8-6be3-4465-95a1-887c6a568fb7\") "
Mar 20 17:06:04 crc kubenswrapper[4730]: I0320 17:06:04.533477    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9695c6e8-6be3-4465-95a1-887c6a568fb7-kube-api-access-4jdhf" (OuterVolumeSpecName: "kube-api-access-4jdhf") pod "9695c6e8-6be3-4465-95a1-887c6a568fb7" (UID: "9695c6e8-6be3-4465-95a1-887c6a568fb7"). InnerVolumeSpecName "kube-api-access-4jdhf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:06:04 crc kubenswrapper[4730]: I0320 17:06:04.627592    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jdhf\" (UniqueName: \"kubernetes.io/projected/9695c6e8-6be3-4465-95a1-887c6a568fb7-kube-api-access-4jdhf\") on node \"crc\" DevicePath \"\""
Mar 20 17:06:05 crc kubenswrapper[4730]: I0320 17:06:05.080269    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567106-pg9p8" event={"ID":"9695c6e8-6be3-4465-95a1-887c6a568fb7","Type":"ContainerDied","Data":"d0e8ff4e8f7264bc67ea642d0b97ca002626697a4ef1bbaa4564b31ddd2b7e19"}
Mar 20 17:06:05 crc kubenswrapper[4730]: I0320 17:06:05.080558    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0e8ff4e8f7264bc67ea642d0b97ca002626697a4ef1bbaa4564b31ddd2b7e19"
Mar 20 17:06:05 crc kubenswrapper[4730]: I0320 17:06:05.080391    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567106-pg9p8"
Mar 20 17:06:05 crc kubenswrapper[4730]: I0320 17:06:05.555777    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567100-vgj2v"]
Mar 20 17:06:05 crc kubenswrapper[4730]: I0320 17:06:05.565450    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567100-vgj2v"]
Mar 20 17:06:07 crc kubenswrapper[4730]: I0320 17:06:07.546231    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1293fe12-0f59-44fb-b726-9d72c790dabd" path="/var/lib/kubelet/pods/1293fe12-0f59-44fb-b726-9d72c790dabd/volumes"
Mar 20 17:06:08 crc kubenswrapper[4730]: I0320 17:06:08.591009    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-jkk9s_c9f80b42-cff3-48a7-9e09-02ff65e9d9f8/control-plane-machine-set-operator/0.log"
Mar 20 17:06:08 crc kubenswrapper[4730]: I0320 17:06:08.806474    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-k6z2l_5c0e41b3-aa2d-4083-acb2-f0f68a29fcce/kube-rbac-proxy/0.log"
Mar 20 17:06:08 crc kubenswrapper[4730]: I0320 17:06:08.884128    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-k6z2l_5c0e41b3-aa2d-4083-acb2-f0f68a29fcce/machine-api-operator/0.log"
Mar 20 17:06:23 crc kubenswrapper[4730]: I0320 17:06:23.518410    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-dwg9x_096957e4-5a35-42f7-adf0-cac7672589a4/cert-manager-controller/0.log"
Mar 20 17:06:23 crc kubenswrapper[4730]: I0320 17:06:23.698593    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-89r9d_b59581d5-071c-4764-9ef6-50ea4724e0a6/cert-manager-cainjector/0.log"
Mar 20 17:06:23 crc kubenswrapper[4730]: I0320 17:06:23.725815    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-qcz52_e7c6b209-7bad-4eb0-b8d0-61a602be9b89/cert-manager-webhook/0.log"
Mar 20 17:06:29 crc kubenswrapper[4730]: I0320 17:06:29.344497    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7drps"]
Mar 20 17:06:29 crc kubenswrapper[4730]: E0320 17:06:29.346021    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9695c6e8-6be3-4465-95a1-887c6a568fb7" containerName="oc"
Mar 20 17:06:29 crc kubenswrapper[4730]: I0320 17:06:29.346046    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="9695c6e8-6be3-4465-95a1-887c6a568fb7" containerName="oc"
Mar 20 17:06:29 crc kubenswrapper[4730]: I0320 17:06:29.346563    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="9695c6e8-6be3-4465-95a1-887c6a568fb7" containerName="oc"
Mar 20 17:06:29 crc kubenswrapper[4730]: I0320 17:06:29.364892    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7drps"]
Mar 20 17:06:29 crc kubenswrapper[4730]: I0320 17:06:29.365060    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7drps"
Mar 20 17:06:29 crc kubenswrapper[4730]: I0320 17:06:29.488222    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81f396fd-c6d6-4332-8cad-cb7aec1d11cf-catalog-content\") pod \"redhat-marketplace-7drps\" (UID: \"81f396fd-c6d6-4332-8cad-cb7aec1d11cf\") " pod="openshift-marketplace/redhat-marketplace-7drps"
Mar 20 17:06:29 crc kubenswrapper[4730]: I0320 17:06:29.488415    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5tx5\" (UniqueName: \"kubernetes.io/projected/81f396fd-c6d6-4332-8cad-cb7aec1d11cf-kube-api-access-s5tx5\") pod \"redhat-marketplace-7drps\" (UID: \"81f396fd-c6d6-4332-8cad-cb7aec1d11cf\") " pod="openshift-marketplace/redhat-marketplace-7drps"
Mar 20 17:06:29 crc kubenswrapper[4730]: I0320 17:06:29.488485    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81f396fd-c6d6-4332-8cad-cb7aec1d11cf-utilities\") pod \"redhat-marketplace-7drps\" (UID: \"81f396fd-c6d6-4332-8cad-cb7aec1d11cf\") " pod="openshift-marketplace/redhat-marketplace-7drps"
Mar 20 17:06:29 crc kubenswrapper[4730]: I0320 17:06:29.591491    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5tx5\" (UniqueName: \"kubernetes.io/projected/81f396fd-c6d6-4332-8cad-cb7aec1d11cf-kube-api-access-s5tx5\") pod \"redhat-marketplace-7drps\" (UID: \"81f396fd-c6d6-4332-8cad-cb7aec1d11cf\") " pod="openshift-marketplace/redhat-marketplace-7drps"
Mar 20 17:06:29 crc kubenswrapper[4730]: I0320 17:06:29.591917    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81f396fd-c6d6-4332-8cad-cb7aec1d11cf-utilities\") pod \"redhat-marketplace-7drps\" (UID: \"81f396fd-c6d6-4332-8cad-cb7aec1d11cf\") " pod="openshift-marketplace/redhat-marketplace-7drps"
Mar 20 17:06:29 crc kubenswrapper[4730]: I0320 17:06:29.592060    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81f396fd-c6d6-4332-8cad-cb7aec1d11cf-catalog-content\") pod \"redhat-marketplace-7drps\" (UID: \"81f396fd-c6d6-4332-8cad-cb7aec1d11cf\") " pod="openshift-marketplace/redhat-marketplace-7drps"
Mar 20 17:06:29 crc kubenswrapper[4730]: I0320 17:06:29.592764    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81f396fd-c6d6-4332-8cad-cb7aec1d11cf-catalog-content\") pod \"redhat-marketplace-7drps\" (UID: \"81f396fd-c6d6-4332-8cad-cb7aec1d11cf\") " pod="openshift-marketplace/redhat-marketplace-7drps"
Mar 20 17:06:29 crc kubenswrapper[4730]: I0320 17:06:29.593095    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81f396fd-c6d6-4332-8cad-cb7aec1d11cf-utilities\") pod \"redhat-marketplace-7drps\" (UID: \"81f396fd-c6d6-4332-8cad-cb7aec1d11cf\") " pod="openshift-marketplace/redhat-marketplace-7drps"
Mar 20 17:06:29 crc kubenswrapper[4730]: I0320 17:06:29.615151    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5tx5\" (UniqueName: \"kubernetes.io/projected/81f396fd-c6d6-4332-8cad-cb7aec1d11cf-kube-api-access-s5tx5\") pod \"redhat-marketplace-7drps\" (UID: \"81f396fd-c6d6-4332-8cad-cb7aec1d11cf\") " pod="openshift-marketplace/redhat-marketplace-7drps"
Mar 20 17:06:29 crc kubenswrapper[4730]: I0320 17:06:29.692507    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7drps"
Mar 20 17:06:30 crc kubenswrapper[4730]: I0320 17:06:30.187626    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7drps"]
Mar 20 17:06:30 crc kubenswrapper[4730]: I0320 17:06:30.329163    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7drps" event={"ID":"81f396fd-c6d6-4332-8cad-cb7aec1d11cf","Type":"ContainerStarted","Data":"36264ade0ad9d6fee845713f37ec9f2078ade3ea2ad5877b3188271083aef68d"}
Mar 20 17:06:31 crc kubenswrapper[4730]: I0320 17:06:31.345006    4730 generic.go:334] "Generic (PLEG): container finished" podID="81f396fd-c6d6-4332-8cad-cb7aec1d11cf" containerID="287ca69ae95c400004ca862c69e6d11e4b4732f5968d9b7179d8e6e6e6ad1719" exitCode=0
Mar 20 17:06:31 crc kubenswrapper[4730]: I0320 17:06:31.345119    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7drps" event={"ID":"81f396fd-c6d6-4332-8cad-cb7aec1d11cf","Type":"ContainerDied","Data":"287ca69ae95c400004ca862c69e6d11e4b4732f5968d9b7179d8e6e6e6ad1719"}
Mar 20 17:06:33 crc kubenswrapper[4730]: I0320 17:06:33.375692    4730 generic.go:334] "Generic (PLEG): container finished" podID="81f396fd-c6d6-4332-8cad-cb7aec1d11cf" containerID="9b086c5376f1f623201bf1fea4679d805d3951607b8b5d78337530e561fc7ef0" exitCode=0
Mar 20 17:06:33 crc kubenswrapper[4730]: I0320 17:06:33.375867    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7drps" event={"ID":"81f396fd-c6d6-4332-8cad-cb7aec1d11cf","Type":"ContainerDied","Data":"9b086c5376f1f623201bf1fea4679d805d3951607b8b5d78337530e561fc7ef0"}
Mar 20 17:06:34 crc kubenswrapper[4730]: I0320 17:06:34.388822    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7drps" event={"ID":"81f396fd-c6d6-4332-8cad-cb7aec1d11cf","Type":"ContainerStarted","Data":"c209088d7f8758ddccc4f443d60e6fcb50cc66f336529613105424b90c81a25f"}
Mar 20 17:06:34 crc kubenswrapper[4730]: I0320 17:06:34.409937    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7drps" podStartSLOduration=2.873022937 podStartE2EDuration="5.409916342s" podCreationTimestamp="2026-03-20 17:06:29 +0000 UTC" firstStartedPulling="2026-03-20 17:06:31.347855093 +0000 UTC m=+5250.561226462" lastFinishedPulling="2026-03-20 17:06:33.884748488 +0000 UTC m=+5253.098119867" observedRunningTime="2026-03-20 17:06:34.404948731 +0000 UTC m=+5253.618320110" watchObservedRunningTime="2026-03-20 17:06:34.409916342 +0000 UTC m=+5253.623287721"
Mar 20 17:06:39 crc kubenswrapper[4730]: I0320 17:06:39.693908    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7drps"
Mar 20 17:06:39 crc kubenswrapper[4730]: I0320 17:06:39.695224    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7drps"
Mar 20 17:06:39 crc kubenswrapper[4730]: I0320 17:06:39.742898    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7drps"
Mar 20 17:06:40 crc kubenswrapper[4730]: I0320 17:06:40.308593    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-nnrp6_663e9228-322c-4d6a-8988-0033d5dd587a/nmstate-console-plugin/0.log"
Mar 20 17:06:40 crc kubenswrapper[4730]: I0320 17:06:40.498144    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7drps"
Mar 20 17:06:40 crc kubenswrapper[4730]: I0320 17:06:40.552425    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7drps"]
Mar 20 17:06:40 crc kubenswrapper[4730]: I0320 17:06:40.596789    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-nfr9k_3f50a695-6f8b-42e6-aa4f-3dfd888b6afa/kube-rbac-proxy/0.log"
Mar 20 17:06:40 crc kubenswrapper[4730]: I0320 17:06:40.611904    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-6tdt2_0a4f6fcf-7c76-49cf-8f3c-d83879a650f1/nmstate-handler/0.log"
Mar 20 17:06:40 crc kubenswrapper[4730]: I0320 17:06:40.704530    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-nfr9k_3f50a695-6f8b-42e6-aa4f-3dfd888b6afa/nmstate-metrics/0.log"
Mar 20 17:06:40 crc kubenswrapper[4730]: I0320 17:06:40.821816    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-2qcpg_e3bdfb07-3f68-4262-8116-44b5ea591644/nmstate-operator/0.log"
Mar 20 17:06:40 crc kubenswrapper[4730]: I0320 17:06:40.938440    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-nq6dd_0f827638-33ac-4f99-920b-6e9b72db7955/nmstate-webhook/0.log"
Mar 20 17:06:42 crc kubenswrapper[4730]: I0320 17:06:42.457219    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7drps" podUID="81f396fd-c6d6-4332-8cad-cb7aec1d11cf" containerName="registry-server" containerID="cri-o://c209088d7f8758ddccc4f443d60e6fcb50cc66f336529613105424b90c81a25f" gracePeriod=2
Mar 20 17:06:43 crc kubenswrapper[4730]: I0320 17:06:43.007779    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7drps"
Mar 20 17:06:43 crc kubenswrapper[4730]: I0320 17:06:43.166068    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5tx5\" (UniqueName: \"kubernetes.io/projected/81f396fd-c6d6-4332-8cad-cb7aec1d11cf-kube-api-access-s5tx5\") pod \"81f396fd-c6d6-4332-8cad-cb7aec1d11cf\" (UID: \"81f396fd-c6d6-4332-8cad-cb7aec1d11cf\") "
Mar 20 17:06:43 crc kubenswrapper[4730]: I0320 17:06:43.166206    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81f396fd-c6d6-4332-8cad-cb7aec1d11cf-catalog-content\") pod \"81f396fd-c6d6-4332-8cad-cb7aec1d11cf\" (UID: \"81f396fd-c6d6-4332-8cad-cb7aec1d11cf\") "
Mar 20 17:06:43 crc kubenswrapper[4730]: I0320 17:06:43.166240    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81f396fd-c6d6-4332-8cad-cb7aec1d11cf-utilities\") pod \"81f396fd-c6d6-4332-8cad-cb7aec1d11cf\" (UID: \"81f396fd-c6d6-4332-8cad-cb7aec1d11cf\") "
Mar 20 17:06:43 crc kubenswrapper[4730]: I0320 17:06:43.167479    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81f396fd-c6d6-4332-8cad-cb7aec1d11cf-utilities" (OuterVolumeSpecName: "utilities") pod "81f396fd-c6d6-4332-8cad-cb7aec1d11cf" (UID: "81f396fd-c6d6-4332-8cad-cb7aec1d11cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:06:43 crc kubenswrapper[4730]: I0320 17:06:43.186518    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81f396fd-c6d6-4332-8cad-cb7aec1d11cf-kube-api-access-s5tx5" (OuterVolumeSpecName: "kube-api-access-s5tx5") pod "81f396fd-c6d6-4332-8cad-cb7aec1d11cf" (UID: "81f396fd-c6d6-4332-8cad-cb7aec1d11cf"). InnerVolumeSpecName "kube-api-access-s5tx5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:06:43 crc kubenswrapper[4730]: I0320 17:06:43.201878    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81f396fd-c6d6-4332-8cad-cb7aec1d11cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81f396fd-c6d6-4332-8cad-cb7aec1d11cf" (UID: "81f396fd-c6d6-4332-8cad-cb7aec1d11cf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:06:43 crc kubenswrapper[4730]: I0320 17:06:43.268439    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81f396fd-c6d6-4332-8cad-cb7aec1d11cf-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 17:06:43 crc kubenswrapper[4730]: I0320 17:06:43.268465    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5tx5\" (UniqueName: \"kubernetes.io/projected/81f396fd-c6d6-4332-8cad-cb7aec1d11cf-kube-api-access-s5tx5\") on node \"crc\" DevicePath \"\""
Mar 20 17:06:43 crc kubenswrapper[4730]: I0320 17:06:43.268476    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81f396fd-c6d6-4332-8cad-cb7aec1d11cf-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 17:06:43 crc kubenswrapper[4730]: I0320 17:06:43.468176    4730 generic.go:334] "Generic (PLEG): container finished" podID="81f396fd-c6d6-4332-8cad-cb7aec1d11cf" containerID="c209088d7f8758ddccc4f443d60e6fcb50cc66f336529613105424b90c81a25f" exitCode=0
Mar 20 17:06:43 crc kubenswrapper[4730]: I0320 17:06:43.468216    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7drps" event={"ID":"81f396fd-c6d6-4332-8cad-cb7aec1d11cf","Type":"ContainerDied","Data":"c209088d7f8758ddccc4f443d60e6fcb50cc66f336529613105424b90c81a25f"}
Mar 20 17:06:43 crc kubenswrapper[4730]: I0320 17:06:43.468240    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7drps" event={"ID":"81f396fd-c6d6-4332-8cad-cb7aec1d11cf","Type":"ContainerDied","Data":"36264ade0ad9d6fee845713f37ec9f2078ade3ea2ad5877b3188271083aef68d"}
Mar 20 17:06:43 crc kubenswrapper[4730]: I0320 17:06:43.468271    4730 scope.go:117] "RemoveContainer" containerID="c209088d7f8758ddccc4f443d60e6fcb50cc66f336529613105424b90c81a25f"
Mar 20 17:06:43 crc kubenswrapper[4730]: I0320 17:06:43.468365    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7drps"
Mar 20 17:06:43 crc kubenswrapper[4730]: I0320 17:06:43.488666    4730 scope.go:117] "RemoveContainer" containerID="9b086c5376f1f623201bf1fea4679d805d3951607b8b5d78337530e561fc7ef0"
Mar 20 17:06:43 crc kubenswrapper[4730]: I0320 17:06:43.543477    4730 scope.go:117] "RemoveContainer" containerID="287ca69ae95c400004ca862c69e6d11e4b4732f5968d9b7179d8e6e6e6ad1719"
Mar 20 17:06:43 crc kubenswrapper[4730]: I0320 17:06:43.570024    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7drps"]
Mar 20 17:06:43 crc kubenswrapper[4730]: I0320 17:06:43.570065    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7drps"]
Mar 20 17:06:43 crc kubenswrapper[4730]: I0320 17:06:43.587698    4730 scope.go:117] "RemoveContainer" containerID="c209088d7f8758ddccc4f443d60e6fcb50cc66f336529613105424b90c81a25f"
Mar 20 17:06:43 crc kubenswrapper[4730]: E0320 17:06:43.588110    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c209088d7f8758ddccc4f443d60e6fcb50cc66f336529613105424b90c81a25f\": container with ID starting with c209088d7f8758ddccc4f443d60e6fcb50cc66f336529613105424b90c81a25f not found: ID does not exist" containerID="c209088d7f8758ddccc4f443d60e6fcb50cc66f336529613105424b90c81a25f"
Mar 20 17:06:43 crc kubenswrapper[4730]: I0320 17:06:43.588146    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c209088d7f8758ddccc4f443d60e6fcb50cc66f336529613105424b90c81a25f"} err="failed to get container status \"c209088d7f8758ddccc4f443d60e6fcb50cc66f336529613105424b90c81a25f\": rpc error: code = NotFound desc = could not find container \"c209088d7f8758ddccc4f443d60e6fcb50cc66f336529613105424b90c81a25f\": container with ID starting with c209088d7f8758ddccc4f443d60e6fcb50cc66f336529613105424b90c81a25f not found: ID does not exist"
Mar 20 17:06:43 crc kubenswrapper[4730]: I0320 17:06:43.588171    4730 scope.go:117] "RemoveContainer" containerID="9b086c5376f1f623201bf1fea4679d805d3951607b8b5d78337530e561fc7ef0"
Mar 20 17:06:43 crc kubenswrapper[4730]: E0320 17:06:43.588434    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b086c5376f1f623201bf1fea4679d805d3951607b8b5d78337530e561fc7ef0\": container with ID starting with 9b086c5376f1f623201bf1fea4679d805d3951607b8b5d78337530e561fc7ef0 not found: ID does not exist" containerID="9b086c5376f1f623201bf1fea4679d805d3951607b8b5d78337530e561fc7ef0"
Mar 20 17:06:43 crc kubenswrapper[4730]: I0320 17:06:43.588456    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b086c5376f1f623201bf1fea4679d805d3951607b8b5d78337530e561fc7ef0"} err="failed to get container status \"9b086c5376f1f623201bf1fea4679d805d3951607b8b5d78337530e561fc7ef0\": rpc error: code = NotFound desc = could not find container \"9b086c5376f1f623201bf1fea4679d805d3951607b8b5d78337530e561fc7ef0\": container with ID starting with 9b086c5376f1f623201bf1fea4679d805d3951607b8b5d78337530e561fc7ef0 not found: ID does not exist"
Mar 20 17:06:43 crc kubenswrapper[4730]: I0320 17:06:43.588473    4730 scope.go:117] "RemoveContainer" containerID="287ca69ae95c400004ca862c69e6d11e4b4732f5968d9b7179d8e6e6e6ad1719"
Mar 20 17:06:43 crc kubenswrapper[4730]: E0320 17:06:43.589160    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"287ca69ae95c400004ca862c69e6d11e4b4732f5968d9b7179d8e6e6e6ad1719\": container with ID starting with 287ca69ae95c400004ca862c69e6d11e4b4732f5968d9b7179d8e6e6e6ad1719 not found: ID does not exist" containerID="287ca69ae95c400004ca862c69e6d11e4b4732f5968d9b7179d8e6e6e6ad1719"
Mar 20 17:06:43 crc kubenswrapper[4730]: I0320 17:06:43.589188    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"287ca69ae95c400004ca862c69e6d11e4b4732f5968d9b7179d8e6e6e6ad1719"} err="failed to get container status \"287ca69ae95c400004ca862c69e6d11e4b4732f5968d9b7179d8e6e6e6ad1719\": rpc error: code = NotFound desc = could not find container \"287ca69ae95c400004ca862c69e6d11e4b4732f5968d9b7179d8e6e6e6ad1719\": container with ID starting with 287ca69ae95c400004ca862c69e6d11e4b4732f5968d9b7179d8e6e6e6ad1719 not found: ID does not exist"
Mar 20 17:06:45 crc kubenswrapper[4730]: I0320 17:06:45.543694    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81f396fd-c6d6-4332-8cad-cb7aec1d11cf" path="/var/lib/kubelet/pods/81f396fd-c6d6-4332-8cad-cb7aec1d11cf/volumes"
Mar 20 17:06:54 crc kubenswrapper[4730]: I0320 17:06:54.252596    4730 scope.go:117] "RemoveContainer" containerID="f0d02d12d3b8583d27ef06f8d4e4230e6d9bdedae9fb10c5b6dcf9c218e3e2d5"
Mar 20 17:06:58 crc kubenswrapper[4730]: I0320 17:06:58.516316    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-w67vt_5db89423-34f0-46c3-9dcf-2179c6c6f42a/prometheus-operator/0.log"
Mar 20 17:06:58 crc kubenswrapper[4730]: I0320 17:06:58.605569    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-85c477fd8f-7ttsl_7520ba92-5020-48d1-8d1c-fa20f0f407be/prometheus-operator-admission-webhook/0.log"
Mar 20 17:06:58 crc kubenswrapper[4730]: I0320 17:06:58.659597    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-85c477fd8f-d52bp_c12d2a2b-f7db-41be-89e1-97869c8119c2/prometheus-operator-admission-webhook/0.log"
Mar 20 17:06:58 crc kubenswrapper[4730]: I0320 17:06:58.820957    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-nh5dg_28a4594d-a811-4533-8d77-40267a80c581/operator/0.log"
Mar 20 17:06:58 crc kubenswrapper[4730]: I0320 17:06:58.861601    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-6b8b4f7dbd-pmzhq_50aad4a2-a828-49d9-9bb3-115336081293/perses-operator/0.log"
Mar 20 17:07:17 crc kubenswrapper[4730]: I0320 17:07:17.129622    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-jdxzq_a42a5cd0-d730-4d48-8082-2491494e90ff/kube-rbac-proxy/0.log"
Mar 20 17:07:17 crc kubenswrapper[4730]: I0320 17:07:17.147071    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-jdxzq_a42a5cd0-d730-4d48-8082-2491494e90ff/controller/0.log"
Mar 20 17:07:17 crc kubenswrapper[4730]: I0320 17:07:17.218318    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pbr4w_5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f/cp-frr-files/0.log"
Mar 20 17:07:17 crc kubenswrapper[4730]: I0320 17:07:17.445832    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pbr4w_5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f/cp-frr-files/0.log"
Mar 20 17:07:17 crc kubenswrapper[4730]: I0320 17:07:17.447187    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pbr4w_5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f/cp-reloader/0.log"
Mar 20 17:07:17 crc kubenswrapper[4730]: I0320 17:07:17.453582    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pbr4w_5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f/cp-metrics/0.log"
Mar 20 17:07:17 crc kubenswrapper[4730]: I0320 17:07:17.453919    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pbr4w_5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f/cp-reloader/0.log"
Mar 20 17:07:17 crc kubenswrapper[4730]: I0320 17:07:17.747144    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pbr4w_5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f/cp-frr-files/0.log"
Mar 20 17:07:17 crc kubenswrapper[4730]: I0320 17:07:17.791807    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pbr4w_5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f/cp-metrics/0.log"
Mar 20 17:07:17 crc kubenswrapper[4730]: I0320 17:07:17.792036    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pbr4w_5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f/cp-reloader/0.log"
Mar 20 17:07:17 crc kubenswrapper[4730]: I0320 17:07:17.792140    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pbr4w_5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f/cp-metrics/0.log"
Mar 20 17:07:17 crc kubenswrapper[4730]: I0320 17:07:17.994601    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pbr4w_5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f/cp-reloader/0.log"
Mar 20 17:07:18 crc kubenswrapper[4730]: I0320 17:07:18.031738    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pbr4w_5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f/cp-frr-files/0.log"
Mar 20 17:07:18 crc kubenswrapper[4730]: I0320 17:07:18.050176    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pbr4w_5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f/controller/0.log"
Mar 20 17:07:18 crc kubenswrapper[4730]: I0320 17:07:18.060411    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pbr4w_5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f/cp-metrics/0.log"
Mar 20 17:07:18 crc kubenswrapper[4730]: I0320 17:07:18.786530    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pbr4w_5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f/frr-metrics/0.log"
Mar 20 17:07:18 crc kubenswrapper[4730]: I0320 17:07:18.807551    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pbr4w_5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f/kube-rbac-proxy-frr/0.log"
Mar 20 17:07:18 crc kubenswrapper[4730]: I0320 17:07:18.850412    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pbr4w_5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f/kube-rbac-proxy/0.log"
Mar 20 17:07:19 crc kubenswrapper[4730]: I0320 17:07:19.049184    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pbr4w_5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f/reloader/0.log"
Mar 20 17:07:19 crc kubenswrapper[4730]: I0320 17:07:19.137107    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-vmgrx_70093cb9-bc43-427d-a8e4-5750058e2580/frr-k8s-webhook-server/0.log"
Mar 20 17:07:19 crc kubenswrapper[4730]: I0320 17:07:19.393798    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-85db46595-g556k_b41d974a-1e37-48ae-afdc-48c682c73637/manager/0.log"
Mar 20 17:07:19 crc kubenswrapper[4730]: I0320 17:07:19.507105    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5f9794bdc6-ccfwn_9dfba7eb-850f-4e34-a875-8ef219c8c783/webhook-server/0.log"
Mar 20 17:07:19 crc kubenswrapper[4730]: I0320 17:07:19.612818    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tbvnw_02f5e1af-23a0-43ef-89ad-9c5af9e98cfd/kube-rbac-proxy/0.log"
Mar 20 17:07:20 crc kubenswrapper[4730]: I0320 17:07:20.307958    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tbvnw_02f5e1af-23a0-43ef-89ad-9c5af9e98cfd/speaker/0.log"
Mar 20 17:07:20 crc kubenswrapper[4730]: I0320 17:07:20.916208    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pbr4w_5a3acee0-3c35-4a9f-8b2d-0e5c64ca0d1f/frr/0.log"
Mar 20 17:07:35 crc kubenswrapper[4730]: I0320 17:07:35.747798    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9_6caa320c-cdca-4f52-aac0-b5c3325396db/util/0.log"
Mar 20 17:07:35 crc kubenswrapper[4730]: I0320 17:07:35.920749    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9_6caa320c-cdca-4f52-aac0-b5c3325396db/util/0.log"
Mar 20 17:07:35 crc kubenswrapper[4730]: I0320 17:07:35.930889    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9_6caa320c-cdca-4f52-aac0-b5c3325396db/pull/0.log"
Mar 20 17:07:35 crc kubenswrapper[4730]: I0320 17:07:35.962896    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9_6caa320c-cdca-4f52-aac0-b5c3325396db/pull/0.log"
Mar 20 17:07:36 crc kubenswrapper[4730]: I0320 17:07:36.105828    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9_6caa320c-cdca-4f52-aac0-b5c3325396db/util/0.log"
Mar 20 17:07:36 crc kubenswrapper[4730]: I0320 17:07:36.106549    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9_6caa320c-cdca-4f52-aac0-b5c3325396db/pull/0.log"
Mar 20 17:07:36 crc kubenswrapper[4730]: I0320 17:07:36.118567    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ft8z9_6caa320c-cdca-4f52-aac0-b5c3325396db/extract/0.log"
Mar 20 17:07:36 crc kubenswrapper[4730]: I0320 17:07:36.302449    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck_aac6cea5-e666-44b1-9507-f57de2361c40/util/0.log"
Mar 20 17:07:36 crc kubenswrapper[4730]: I0320 17:07:36.455822    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck_aac6cea5-e666-44b1-9507-f57de2361c40/util/0.log"
Mar 20 17:07:36 crc kubenswrapper[4730]: I0320 17:07:36.474874    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck_aac6cea5-e666-44b1-9507-f57de2361c40/pull/0.log"
Mar 20 17:07:36 crc kubenswrapper[4730]: I0320 17:07:36.475052    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck_aac6cea5-e666-44b1-9507-f57de2361c40/pull/0.log"
Mar 20 17:07:36 crc kubenswrapper[4730]: I0320 17:07:36.642126    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck_aac6cea5-e666-44b1-9507-f57de2361c40/util/0.log"
Mar 20 17:07:36 crc kubenswrapper[4730]: I0320 17:07:36.656500    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck_aac6cea5-e666-44b1-9507-f57de2361c40/pull/0.log"
Mar 20 17:07:36 crc kubenswrapper[4730]: I0320 17:07:36.693715    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k77ck_aac6cea5-e666-44b1-9507-f57de2361c40/extract/0.log"
Mar 20 17:07:36 crc kubenswrapper[4730]: I0320 17:07:36.803501    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv_25d50abe-8eeb-4761-83b7-d9e7fbb78a76/util/0.log"
Mar 20 17:07:37 crc kubenswrapper[4730]: I0320 17:07:37.001416    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv_25d50abe-8eeb-4761-83b7-d9e7fbb78a76/util/0.log"
Mar 20 17:07:37 crc kubenswrapper[4730]: I0320 17:07:37.006389    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv_25d50abe-8eeb-4761-83b7-d9e7fbb78a76/pull/0.log"
Mar 20 17:07:37 crc kubenswrapper[4730]: I0320 17:07:37.031935    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv_25d50abe-8eeb-4761-83b7-d9e7fbb78a76/pull/0.log"
Mar 20 17:07:37 crc kubenswrapper[4730]: I0320 17:07:37.162381    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv_25d50abe-8eeb-4761-83b7-d9e7fbb78a76/util/0.log"
Mar 20 17:07:37 crc kubenswrapper[4730]: I0320 17:07:37.184719    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv_25d50abe-8eeb-4761-83b7-d9e7fbb78a76/pull/0.log"
Mar 20 17:07:37 crc kubenswrapper[4730]: I0320 17:07:37.201672    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf923972669rzv_25d50abe-8eeb-4761-83b7-d9e7fbb78a76/extract/0.log"
Mar 20 17:07:37 crc kubenswrapper[4730]: I0320 17:07:37.341742    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rkhd6_d6e8fab3-7ebb-4b3f-af2c-fcc299e01381/extract-utilities/0.log"
Mar 20 17:07:37 crc kubenswrapper[4730]: I0320 17:07:37.508670    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rkhd6_d6e8fab3-7ebb-4b3f-af2c-fcc299e01381/extract-content/0.log"
Mar 20 17:07:37 crc kubenswrapper[4730]: I0320 17:07:37.510374    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rkhd6_d6e8fab3-7ebb-4b3f-af2c-fcc299e01381/extract-content/0.log"
Mar 20 17:07:37 crc kubenswrapper[4730]: I0320 17:07:37.547940    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rkhd6_d6e8fab3-7ebb-4b3f-af2c-fcc299e01381/extract-utilities/0.log"
Mar 20 17:07:37 crc kubenswrapper[4730]: I0320 17:07:37.759922    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rkhd6_d6e8fab3-7ebb-4b3f-af2c-fcc299e01381/extract-content/0.log"
Mar 20 17:07:37 crc kubenswrapper[4730]: I0320 17:07:37.763085    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rkhd6_d6e8fab3-7ebb-4b3f-af2c-fcc299e01381/extract-utilities/0.log"
Mar 20 17:07:37 crc kubenswrapper[4730]: I0320 17:07:37.962538    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bbtzz_d8a22a9f-2975-485c-99f7-05e6b934e0a1/extract-utilities/0.log"
Mar 20 17:07:38 crc kubenswrapper[4730]: I0320 17:07:38.142753    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bbtzz_d8a22a9f-2975-485c-99f7-05e6b934e0a1/extract-utilities/0.log"
Mar 20 17:07:38 crc kubenswrapper[4730]: I0320 17:07:38.173616    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rkhd6_d6e8fab3-7ebb-4b3f-af2c-fcc299e01381/registry-server/0.log"
Mar 20 17:07:38 crc kubenswrapper[4730]: I0320 17:07:38.359154    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bbtzz_d8a22a9f-2975-485c-99f7-05e6b934e0a1/extract-content/0.log"
Mar 20 17:07:38 crc kubenswrapper[4730]: I0320 17:07:38.363898    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bbtzz_d8a22a9f-2975-485c-99f7-05e6b934e0a1/extract-content/0.log"
Mar 20 17:07:38 crc kubenswrapper[4730]: I0320 17:07:38.511582    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bbtzz_d8a22a9f-2975-485c-99f7-05e6b934e0a1/extract-utilities/0.log"
Mar 20 17:07:38 crc kubenswrapper[4730]: I0320 17:07:38.528535    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bbtzz_d8a22a9f-2975-485c-99f7-05e6b934e0a1/extract-content/0.log"
Mar 20 17:07:38 crc kubenswrapper[4730]: I0320 17:07:38.745621    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-b842f_b3eaa81f-92a9-49fa-aca0-1e8e35920f20/marketplace-operator/0.log"
Mar 20 17:07:38 crc kubenswrapper[4730]: I0320 17:07:38.824230    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7vhhm_cae6da2c-50d0-460f-b29c-5b3e3df439c5/extract-utilities/0.log"
Mar 20 17:07:39 crc kubenswrapper[4730]: I0320 17:07:39.071015    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7vhhm_cae6da2c-50d0-460f-b29c-5b3e3df439c5/extract-utilities/0.log"
Mar 20 17:07:39 crc kubenswrapper[4730]: I0320 17:07:39.094125    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7vhhm_cae6da2c-50d0-460f-b29c-5b3e3df439c5/extract-content/0.log"
Mar 20 17:07:39 crc kubenswrapper[4730]: I0320 17:07:39.137777    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7vhhm_cae6da2c-50d0-460f-b29c-5b3e3df439c5/extract-content/0.log"
Mar 20 17:07:39 crc kubenswrapper[4730]: I0320 17:07:39.360699    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7vhhm_cae6da2c-50d0-460f-b29c-5b3e3df439c5/extract-utilities/0.log"
Mar 20 17:07:39 crc kubenswrapper[4730]: I0320 17:07:39.364998    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bbtzz_d8a22a9f-2975-485c-99f7-05e6b934e0a1/registry-server/0.log"
Mar 20 17:07:39 crc kubenswrapper[4730]: I0320 17:07:39.387077    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7vhhm_cae6da2c-50d0-460f-b29c-5b3e3df439c5/extract-content/0.log"
Mar 20 17:07:39 crc kubenswrapper[4730]: I0320 17:07:39.545021    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7vhhm_cae6da2c-50d0-460f-b29c-5b3e3df439c5/registry-server/0.log"
Mar 20 17:07:39 crc kubenswrapper[4730]: I0320 17:07:39.963444    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vk6rc_70d03566-9776-4dcc-84b5-17281f8ae66e/extract-utilities/0.log"
Mar 20 17:07:40 crc kubenswrapper[4730]: I0320 17:07:40.147019    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vk6rc_70d03566-9776-4dcc-84b5-17281f8ae66e/extract-utilities/0.log"
Mar 20 17:07:40 crc kubenswrapper[4730]: I0320 17:07:40.166332    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vk6rc_70d03566-9776-4dcc-84b5-17281f8ae66e/extract-content/0.log"
Mar 20 17:07:40 crc kubenswrapper[4730]: I0320 17:07:40.215074    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vk6rc_70d03566-9776-4dcc-84b5-17281f8ae66e/extract-content/0.log"
Mar 20 17:07:40 crc kubenswrapper[4730]: I0320 17:07:40.375197    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vk6rc_70d03566-9776-4dcc-84b5-17281f8ae66e/extract-utilities/0.log"
Mar 20 17:07:40 crc kubenswrapper[4730]: I0320 17:07:40.385298    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vk6rc_70d03566-9776-4dcc-84b5-17281f8ae66e/extract-content/0.log"
Mar 20 17:07:41 crc kubenswrapper[4730]: I0320 17:07:41.175986    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vk6rc_70d03566-9776-4dcc-84b5-17281f8ae66e/registry-server/0.log"
Mar 20 17:07:42 crc kubenswrapper[4730]: I0320 17:07:42.880698    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 17:07:42 crc kubenswrapper[4730]: I0320 17:07:42.881190    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 17:07:56 crc kubenswrapper[4730]: I0320 17:07:56.735633    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-85c477fd8f-d52bp_c12d2a2b-f7db-41be-89e1-97869c8119c2/prometheus-operator-admission-webhook/0.log"
Mar 20 17:07:56 crc kubenswrapper[4730]: I0320 17:07:56.740579    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-w67vt_5db89423-34f0-46c3-9dcf-2179c6c6f42a/prometheus-operator/0.log"
Mar 20 17:07:56 crc kubenswrapper[4730]: I0320 17:07:56.812774    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-85c477fd8f-7ttsl_7520ba92-5020-48d1-8d1c-fa20f0f407be/prometheus-operator-admission-webhook/0.log"
Mar 20 17:07:56 crc kubenswrapper[4730]: I0320 17:07:56.916727    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-6b8b4f7dbd-pmzhq_50aad4a2-a828-49d9-9bb3-115336081293/perses-operator/0.log"
Mar 20 17:07:56 crc kubenswrapper[4730]: I0320 17:07:56.917664    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-nh5dg_28a4594d-a811-4533-8d77-40267a80c581/operator/0.log"
Mar 20 17:08:00 crc kubenswrapper[4730]: I0320 17:08:00.170015    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567108-zftgj"]
Mar 20 17:08:00 crc kubenswrapper[4730]: E0320 17:08:00.170949    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81f396fd-c6d6-4332-8cad-cb7aec1d11cf" containerName="extract-utilities"
Mar 20 17:08:00 crc kubenswrapper[4730]: I0320 17:08:00.170973    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="81f396fd-c6d6-4332-8cad-cb7aec1d11cf" containerName="extract-utilities"
Mar 20 17:08:00 crc kubenswrapper[4730]: E0320 17:08:00.171024    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81f396fd-c6d6-4332-8cad-cb7aec1d11cf" containerName="extract-content"
Mar 20 17:08:00 crc kubenswrapper[4730]: I0320 17:08:00.171036    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="81f396fd-c6d6-4332-8cad-cb7aec1d11cf" containerName="extract-content"
Mar 20 17:08:00 crc kubenswrapper[4730]: E0320 17:08:00.171081    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81f396fd-c6d6-4332-8cad-cb7aec1d11cf" containerName="registry-server"
Mar 20 17:08:00 crc kubenswrapper[4730]: I0320 17:08:00.171091    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="81f396fd-c6d6-4332-8cad-cb7aec1d11cf" containerName="registry-server"
Mar 20 17:08:00 crc kubenswrapper[4730]: I0320 17:08:00.171436    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="81f396fd-c6d6-4332-8cad-cb7aec1d11cf" containerName="registry-server"
Mar 20 17:08:00 crc kubenswrapper[4730]: I0320 17:08:00.172565    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567108-zftgj"
Mar 20 17:08:00 crc kubenswrapper[4730]: I0320 17:08:00.177192    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 17:08:00 crc kubenswrapper[4730]: I0320 17:08:00.180735    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 17:08:00 crc kubenswrapper[4730]: I0320 17:08:00.180793    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 17:08:00 crc kubenswrapper[4730]: I0320 17:08:00.191018    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567108-zftgj"]
Mar 20 17:08:00 crc kubenswrapper[4730]: I0320 17:08:00.290395    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnfjb\" (UniqueName: \"kubernetes.io/projected/ee25ae29-2b59-43fa-bee7-ff759f2b962d-kube-api-access-dnfjb\") pod \"auto-csr-approver-29567108-zftgj\" (UID: \"ee25ae29-2b59-43fa-bee7-ff759f2b962d\") " pod="openshift-infra/auto-csr-approver-29567108-zftgj"
Mar 20 17:08:00 crc kubenswrapper[4730]: I0320 17:08:00.392655    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnfjb\" (UniqueName: \"kubernetes.io/projected/ee25ae29-2b59-43fa-bee7-ff759f2b962d-kube-api-access-dnfjb\") pod \"auto-csr-approver-29567108-zftgj\" (UID: \"ee25ae29-2b59-43fa-bee7-ff759f2b962d\") " pod="openshift-infra/auto-csr-approver-29567108-zftgj"
Mar 20 17:08:00 crc kubenswrapper[4730]: I0320 17:08:00.416159    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnfjb\" (UniqueName: \"kubernetes.io/projected/ee25ae29-2b59-43fa-bee7-ff759f2b962d-kube-api-access-dnfjb\") pod \"auto-csr-approver-29567108-zftgj\" (UID: \"ee25ae29-2b59-43fa-bee7-ff759f2b962d\") " pod="openshift-infra/auto-csr-approver-29567108-zftgj"
Mar 20 17:08:00 crc kubenswrapper[4730]: I0320 17:08:00.503205    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567108-zftgj"
Mar 20 17:08:01 crc kubenswrapper[4730]: I0320 17:08:01.003881    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567108-zftgj"]
Mar 20 17:08:01 crc kubenswrapper[4730]: I0320 17:08:01.316573    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567108-zftgj" event={"ID":"ee25ae29-2b59-43fa-bee7-ff759f2b962d","Type":"ContainerStarted","Data":"30cc381bc8fa85c62d2867894ba73e17d2109616119bf0ef41530f071962c2e3"}
Mar 20 17:08:03 crc kubenswrapper[4730]: I0320 17:08:03.335707    4730 generic.go:334] "Generic (PLEG): container finished" podID="ee25ae29-2b59-43fa-bee7-ff759f2b962d" containerID="b89905d33f72768b38a357a1f6d9426d8d5d00caccec7854337ced1d8a3cac16" exitCode=0
Mar 20 17:08:03 crc kubenswrapper[4730]: I0320 17:08:03.335875    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567108-zftgj" event={"ID":"ee25ae29-2b59-43fa-bee7-ff759f2b962d","Type":"ContainerDied","Data":"b89905d33f72768b38a357a1f6d9426d8d5d00caccec7854337ced1d8a3cac16"}
Mar 20 17:08:04 crc kubenswrapper[4730]: I0320 17:08:04.754046    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567108-zftgj"
Mar 20 17:08:04 crc kubenswrapper[4730]: I0320 17:08:04.942584    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnfjb\" (UniqueName: \"kubernetes.io/projected/ee25ae29-2b59-43fa-bee7-ff759f2b962d-kube-api-access-dnfjb\") pod \"ee25ae29-2b59-43fa-bee7-ff759f2b962d\" (UID: \"ee25ae29-2b59-43fa-bee7-ff759f2b962d\") "
Mar 20 17:08:04 crc kubenswrapper[4730]: I0320 17:08:04.954664    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee25ae29-2b59-43fa-bee7-ff759f2b962d-kube-api-access-dnfjb" (OuterVolumeSpecName: "kube-api-access-dnfjb") pod "ee25ae29-2b59-43fa-bee7-ff759f2b962d" (UID: "ee25ae29-2b59-43fa-bee7-ff759f2b962d"). InnerVolumeSpecName "kube-api-access-dnfjb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:08:05 crc kubenswrapper[4730]: I0320 17:08:05.045006    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnfjb\" (UniqueName: \"kubernetes.io/projected/ee25ae29-2b59-43fa-bee7-ff759f2b962d-kube-api-access-dnfjb\") on node \"crc\" DevicePath \"\""
Mar 20 17:08:05 crc kubenswrapper[4730]: I0320 17:08:05.355315    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567108-zftgj" event={"ID":"ee25ae29-2b59-43fa-bee7-ff759f2b962d","Type":"ContainerDied","Data":"30cc381bc8fa85c62d2867894ba73e17d2109616119bf0ef41530f071962c2e3"}
Mar 20 17:08:05 crc kubenswrapper[4730]: I0320 17:08:05.355529    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30cc381bc8fa85c62d2867894ba73e17d2109616119bf0ef41530f071962c2e3"
Mar 20 17:08:05 crc kubenswrapper[4730]: I0320 17:08:05.355637    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567108-zftgj"
Mar 20 17:08:05 crc kubenswrapper[4730]: I0320 17:08:05.842996    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567102-k48jl"]
Mar 20 17:08:05 crc kubenswrapper[4730]: I0320 17:08:05.854504    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567102-k48jl"]
Mar 20 17:08:07 crc kubenswrapper[4730]: I0320 17:08:07.543916    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d317b26b-912f-4276-a234-084782092ff3" path="/var/lib/kubelet/pods/d317b26b-912f-4276-a234-084782092ff3/volumes"
Mar 20 17:08:12 crc kubenswrapper[4730]: I0320 17:08:12.882749    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 17:08:12 crc kubenswrapper[4730]: I0320 17:08:12.883348    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 17:08:42 crc kubenswrapper[4730]: I0320 17:08:42.879963    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 17:08:42 crc kubenswrapper[4730]: I0320 17:08:42.880642    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 17:08:42 crc kubenswrapper[4730]: I0320 17:08:42.880699    4730 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf"
Mar 20 17:08:42 crc kubenswrapper[4730]: I0320 17:08:42.881496    4730 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"581aa02672219ffcaf2185db619f24c1c4e9f4dc71ab9a090d63ff524fc5bbca"} pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 17:08:42 crc kubenswrapper[4730]: I0320 17:08:42.881567    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" containerID="cri-o://581aa02672219ffcaf2185db619f24c1c4e9f4dc71ab9a090d63ff524fc5bbca" gracePeriod=600
Mar 20 17:08:43 crc kubenswrapper[4730]: E0320 17:08:43.019981    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 17:08:43 crc kubenswrapper[4730]: I0320 17:08:43.805746    4730 generic.go:334] "Generic (PLEG): container finished" podID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerID="581aa02672219ffcaf2185db619f24c1c4e9f4dc71ab9a090d63ff524fc5bbca" exitCode=0
Mar 20 17:08:43 crc kubenswrapper[4730]: I0320 17:08:43.805808    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerDied","Data":"581aa02672219ffcaf2185db619f24c1c4e9f4dc71ab9a090d63ff524fc5bbca"}
Mar 20 17:08:43 crc kubenswrapper[4730]: I0320 17:08:43.805893    4730 scope.go:117] "RemoveContainer" containerID="c3ac4abf290606f9ec67064e3bf0182c8ecb8c9be4ecf85a1bb60feb02fd27e6"
Mar 20 17:08:43 crc kubenswrapper[4730]: I0320 17:08:43.806847    4730 scope.go:117] "RemoveContainer" containerID="581aa02672219ffcaf2185db619f24c1c4e9f4dc71ab9a090d63ff524fc5bbca"
Mar 20 17:08:43 crc kubenswrapper[4730]: E0320 17:08:43.807781    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 17:08:51 crc kubenswrapper[4730]: I0320 17:08:51.650265    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tzjmg"]
Mar 20 17:08:51 crc kubenswrapper[4730]: E0320 17:08:51.651202    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee25ae29-2b59-43fa-bee7-ff759f2b962d" containerName="oc"
Mar 20 17:08:51 crc kubenswrapper[4730]: I0320 17:08:51.651216    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee25ae29-2b59-43fa-bee7-ff759f2b962d" containerName="oc"
Mar 20 17:08:51 crc kubenswrapper[4730]: I0320 17:08:51.651554    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee25ae29-2b59-43fa-bee7-ff759f2b962d" containerName="oc"
Mar 20 17:08:51 crc kubenswrapper[4730]: I0320 17:08:51.653428    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tzjmg"
Mar 20 17:08:51 crc kubenswrapper[4730]: I0320 17:08:51.659134    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tzjmg"]
Mar 20 17:08:51 crc kubenswrapper[4730]: I0320 17:08:51.740234    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a7390a1-d2ba-45fd-86f5-173f814c93a9-utilities\") pod \"redhat-operators-tzjmg\" (UID: \"4a7390a1-d2ba-45fd-86f5-173f814c93a9\") " pod="openshift-marketplace/redhat-operators-tzjmg"
Mar 20 17:08:51 crc kubenswrapper[4730]: I0320 17:08:51.740479    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chjph\" (UniqueName: \"kubernetes.io/projected/4a7390a1-d2ba-45fd-86f5-173f814c93a9-kube-api-access-chjph\") pod \"redhat-operators-tzjmg\" (UID: \"4a7390a1-d2ba-45fd-86f5-173f814c93a9\") " pod="openshift-marketplace/redhat-operators-tzjmg"
Mar 20 17:08:51 crc kubenswrapper[4730]: I0320 17:08:51.740650    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a7390a1-d2ba-45fd-86f5-173f814c93a9-catalog-content\") pod \"redhat-operators-tzjmg\" (UID: \"4a7390a1-d2ba-45fd-86f5-173f814c93a9\") " pod="openshift-marketplace/redhat-operators-tzjmg"
Mar 20 17:08:51 crc kubenswrapper[4730]: I0320 17:08:51.842841    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a7390a1-d2ba-45fd-86f5-173f814c93a9-catalog-content\") pod \"redhat-operators-tzjmg\" (UID: \"4a7390a1-d2ba-45fd-86f5-173f814c93a9\") " pod="openshift-marketplace/redhat-operators-tzjmg"
Mar 20 17:08:51 crc kubenswrapper[4730]: I0320 17:08:51.843028    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a7390a1-d2ba-45fd-86f5-173f814c93a9-utilities\") pod \"redhat-operators-tzjmg\" (UID: \"4a7390a1-d2ba-45fd-86f5-173f814c93a9\") " pod="openshift-marketplace/redhat-operators-tzjmg"
Mar 20 17:08:51 crc kubenswrapper[4730]: I0320 17:08:51.843086    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chjph\" (UniqueName: \"kubernetes.io/projected/4a7390a1-d2ba-45fd-86f5-173f814c93a9-kube-api-access-chjph\") pod \"redhat-operators-tzjmg\" (UID: \"4a7390a1-d2ba-45fd-86f5-173f814c93a9\") " pod="openshift-marketplace/redhat-operators-tzjmg"
Mar 20 17:08:51 crc kubenswrapper[4730]: I0320 17:08:51.843949    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a7390a1-d2ba-45fd-86f5-173f814c93a9-utilities\") pod \"redhat-operators-tzjmg\" (UID: \"4a7390a1-d2ba-45fd-86f5-173f814c93a9\") " pod="openshift-marketplace/redhat-operators-tzjmg"
Mar 20 17:08:51 crc kubenswrapper[4730]: I0320 17:08:51.843965    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a7390a1-d2ba-45fd-86f5-173f814c93a9-catalog-content\") pod \"redhat-operators-tzjmg\" (UID: \"4a7390a1-d2ba-45fd-86f5-173f814c93a9\") " pod="openshift-marketplace/redhat-operators-tzjmg"
Mar 20 17:08:51 crc kubenswrapper[4730]: I0320 17:08:51.863707    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chjph\" (UniqueName: \"kubernetes.io/projected/4a7390a1-d2ba-45fd-86f5-173f814c93a9-kube-api-access-chjph\") pod \"redhat-operators-tzjmg\" (UID: \"4a7390a1-d2ba-45fd-86f5-173f814c93a9\") " pod="openshift-marketplace/redhat-operators-tzjmg"
Mar 20 17:08:51 crc kubenswrapper[4730]: I0320 17:08:51.989725    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tzjmg"
Mar 20 17:08:52 crc kubenswrapper[4730]: I0320 17:08:52.475207    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tzjmg"]
Mar 20 17:08:52 crc kubenswrapper[4730]: I0320 17:08:52.924700    4730 generic.go:334] "Generic (PLEG): container finished" podID="4a7390a1-d2ba-45fd-86f5-173f814c93a9" containerID="85b46985717f5402312da3ba16d7344bfe930fc2131a84e29d94e9ed5b0900e9" exitCode=0
Mar 20 17:08:52 crc kubenswrapper[4730]: I0320 17:08:52.924771    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tzjmg" event={"ID":"4a7390a1-d2ba-45fd-86f5-173f814c93a9","Type":"ContainerDied","Data":"85b46985717f5402312da3ba16d7344bfe930fc2131a84e29d94e9ed5b0900e9"}
Mar 20 17:08:52 crc kubenswrapper[4730]: I0320 17:08:52.925134    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tzjmg" event={"ID":"4a7390a1-d2ba-45fd-86f5-173f814c93a9","Type":"ContainerStarted","Data":"546e0511e70de04145fcedfd822bf13d8f1a11303e7f2ed0ec7c3de19cbb37aa"}
Mar 20 17:08:53 crc kubenswrapper[4730]: I0320 17:08:53.939631    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tzjmg" event={"ID":"4a7390a1-d2ba-45fd-86f5-173f814c93a9","Type":"ContainerStarted","Data":"a5a3ccc441c93068e49eab5a94102c5fc521cd85b318db8bfbb39924f2f902d1"}
Mar 20 17:08:54 crc kubenswrapper[4730]: I0320 17:08:54.879856    4730 scope.go:117] "RemoveContainer" containerID="bf551572383a97d7725c248e69428cf0db8c3b25722e05b3bf7b441d82bc6b56"
Mar 20 17:08:58 crc kubenswrapper[4730]: I0320 17:08:58.536985    4730 scope.go:117] "RemoveContainer" containerID="581aa02672219ffcaf2185db619f24c1c4e9f4dc71ab9a090d63ff524fc5bbca"
Mar 20 17:08:58 crc kubenswrapper[4730]: E0320 17:08:58.538032    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 17:09:00 crc kubenswrapper[4730]: I0320 17:09:00.030904    4730 generic.go:334] "Generic (PLEG): container finished" podID="4a7390a1-d2ba-45fd-86f5-173f814c93a9" containerID="a5a3ccc441c93068e49eab5a94102c5fc521cd85b318db8bfbb39924f2f902d1" exitCode=0
Mar 20 17:09:00 crc kubenswrapper[4730]: I0320 17:09:00.030958    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tzjmg" event={"ID":"4a7390a1-d2ba-45fd-86f5-173f814c93a9","Type":"ContainerDied","Data":"a5a3ccc441c93068e49eab5a94102c5fc521cd85b318db8bfbb39924f2f902d1"}
Mar 20 17:09:01 crc kubenswrapper[4730]: I0320 17:09:01.044487    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tzjmg" event={"ID":"4a7390a1-d2ba-45fd-86f5-173f814c93a9","Type":"ContainerStarted","Data":"6629ef5d1c4e362c0e58ac52b41f12978595079c6c024c9726e99d103ab25d4c"}
Mar 20 17:09:01 crc kubenswrapper[4730]: I0320 17:09:01.085901    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tzjmg" podStartSLOduration=2.569202216 podStartE2EDuration="10.085867959s" podCreationTimestamp="2026-03-20 17:08:51 +0000 UTC" firstStartedPulling="2026-03-20 17:08:52.926894722 +0000 UTC m=+5392.140266091" lastFinishedPulling="2026-03-20 17:09:00.443560435 +0000 UTC m=+5399.656931834" observedRunningTime="2026-03-20 17:09:01.081236867 +0000 UTC m=+5400.294608246" watchObservedRunningTime="2026-03-20 17:09:01.085867959 +0000 UTC m=+5400.299239368"
Mar 20 17:09:01 crc kubenswrapper[4730]: I0320 17:09:01.990616    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tzjmg"
Mar 20 17:09:01 crc kubenswrapper[4730]: I0320 17:09:01.991937    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tzjmg"
Mar 20 17:09:03 crc kubenswrapper[4730]: I0320 17:09:03.031801    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tzjmg" podUID="4a7390a1-d2ba-45fd-86f5-173f814c93a9" containerName="registry-server" probeResult="failure" output=<
Mar 20 17:09:03 crc kubenswrapper[4730]:         timeout: failed to connect service ":50051" within 1s
Mar 20 17:09:03 crc kubenswrapper[4730]:  >
Mar 20 17:09:13 crc kubenswrapper[4730]: I0320 17:09:13.081579    4730 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tzjmg" podUID="4a7390a1-d2ba-45fd-86f5-173f814c93a9" containerName="registry-server" probeResult="failure" output=<
Mar 20 17:09:13 crc kubenswrapper[4730]:         timeout: failed to connect service ":50051" within 1s
Mar 20 17:09:13 crc kubenswrapper[4730]:  >
Mar 20 17:09:13 crc kubenswrapper[4730]: I0320 17:09:13.533571    4730 scope.go:117] "RemoveContainer" containerID="581aa02672219ffcaf2185db619f24c1c4e9f4dc71ab9a090d63ff524fc5bbca"
Mar 20 17:09:13 crc kubenswrapper[4730]: E0320 17:09:13.533891    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 17:09:22 crc kubenswrapper[4730]: I0320 17:09:22.085239    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tzjmg"
Mar 20 17:09:22 crc kubenswrapper[4730]: I0320 17:09:22.175841    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tzjmg"
Mar 20 17:09:22 crc kubenswrapper[4730]: I0320 17:09:22.798592    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tzjmg"]
Mar 20 17:09:23 crc kubenswrapper[4730]: I0320 17:09:23.325675    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tzjmg" podUID="4a7390a1-d2ba-45fd-86f5-173f814c93a9" containerName="registry-server" containerID="cri-o://6629ef5d1c4e362c0e58ac52b41f12978595079c6c024c9726e99d103ab25d4c" gracePeriod=2
Mar 20 17:09:24 crc kubenswrapper[4730]: I0320 17:09:24.289108    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tzjmg"
Mar 20 17:09:24 crc kubenswrapper[4730]: I0320 17:09:24.337547    4730 generic.go:334] "Generic (PLEG): container finished" podID="4a7390a1-d2ba-45fd-86f5-173f814c93a9" containerID="6629ef5d1c4e362c0e58ac52b41f12978595079c6c024c9726e99d103ab25d4c" exitCode=0
Mar 20 17:09:24 crc kubenswrapper[4730]: I0320 17:09:24.337583    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tzjmg" event={"ID":"4a7390a1-d2ba-45fd-86f5-173f814c93a9","Type":"ContainerDied","Data":"6629ef5d1c4e362c0e58ac52b41f12978595079c6c024c9726e99d103ab25d4c"}
Mar 20 17:09:24 crc kubenswrapper[4730]: I0320 17:09:24.337607    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tzjmg" event={"ID":"4a7390a1-d2ba-45fd-86f5-173f814c93a9","Type":"ContainerDied","Data":"546e0511e70de04145fcedfd822bf13d8f1a11303e7f2ed0ec7c3de19cbb37aa"}
Mar 20 17:09:24 crc kubenswrapper[4730]: I0320 17:09:24.337623    4730 scope.go:117] "RemoveContainer" containerID="6629ef5d1c4e362c0e58ac52b41f12978595079c6c024c9726e99d103ab25d4c"
Mar 20 17:09:24 crc kubenswrapper[4730]: I0320 17:09:24.337739    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tzjmg"
Mar 20 17:09:24 crc kubenswrapper[4730]: I0320 17:09:24.373588    4730 scope.go:117] "RemoveContainer" containerID="a5a3ccc441c93068e49eab5a94102c5fc521cd85b318db8bfbb39924f2f902d1"
Mar 20 17:09:24 crc kubenswrapper[4730]: I0320 17:09:24.411534    4730 scope.go:117] "RemoveContainer" containerID="85b46985717f5402312da3ba16d7344bfe930fc2131a84e29d94e9ed5b0900e9"
Mar 20 17:09:24 crc kubenswrapper[4730]: I0320 17:09:24.445203    4730 scope.go:117] "RemoveContainer" containerID="6629ef5d1c4e362c0e58ac52b41f12978595079c6c024c9726e99d103ab25d4c"
Mar 20 17:09:24 crc kubenswrapper[4730]: E0320 17:09:24.445648    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6629ef5d1c4e362c0e58ac52b41f12978595079c6c024c9726e99d103ab25d4c\": container with ID starting with 6629ef5d1c4e362c0e58ac52b41f12978595079c6c024c9726e99d103ab25d4c not found: ID does not exist" containerID="6629ef5d1c4e362c0e58ac52b41f12978595079c6c024c9726e99d103ab25d4c"
Mar 20 17:09:24 crc kubenswrapper[4730]: I0320 17:09:24.445681    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6629ef5d1c4e362c0e58ac52b41f12978595079c6c024c9726e99d103ab25d4c"} err="failed to get container status \"6629ef5d1c4e362c0e58ac52b41f12978595079c6c024c9726e99d103ab25d4c\": rpc error: code = NotFound desc = could not find container \"6629ef5d1c4e362c0e58ac52b41f12978595079c6c024c9726e99d103ab25d4c\": container with ID starting with 6629ef5d1c4e362c0e58ac52b41f12978595079c6c024c9726e99d103ab25d4c not found: ID does not exist"
Mar 20 17:09:24 crc kubenswrapper[4730]: I0320 17:09:24.445703    4730 scope.go:117] "RemoveContainer" containerID="a5a3ccc441c93068e49eab5a94102c5fc521cd85b318db8bfbb39924f2f902d1"
Mar 20 17:09:24 crc kubenswrapper[4730]: E0320 17:09:24.446123    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5a3ccc441c93068e49eab5a94102c5fc521cd85b318db8bfbb39924f2f902d1\": container with ID starting with a5a3ccc441c93068e49eab5a94102c5fc521cd85b318db8bfbb39924f2f902d1 not found: ID does not exist" containerID="a5a3ccc441c93068e49eab5a94102c5fc521cd85b318db8bfbb39924f2f902d1"
Mar 20 17:09:24 crc kubenswrapper[4730]: I0320 17:09:24.446160    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5a3ccc441c93068e49eab5a94102c5fc521cd85b318db8bfbb39924f2f902d1"} err="failed to get container status \"a5a3ccc441c93068e49eab5a94102c5fc521cd85b318db8bfbb39924f2f902d1\": rpc error: code = NotFound desc = could not find container \"a5a3ccc441c93068e49eab5a94102c5fc521cd85b318db8bfbb39924f2f902d1\": container with ID starting with a5a3ccc441c93068e49eab5a94102c5fc521cd85b318db8bfbb39924f2f902d1 not found: ID does not exist"
Mar 20 17:09:24 crc kubenswrapper[4730]: I0320 17:09:24.446186    4730 scope.go:117] "RemoveContainer" containerID="85b46985717f5402312da3ba16d7344bfe930fc2131a84e29d94e9ed5b0900e9"
Mar 20 17:09:24 crc kubenswrapper[4730]: E0320 17:09:24.447315    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85b46985717f5402312da3ba16d7344bfe930fc2131a84e29d94e9ed5b0900e9\": container with ID starting with 85b46985717f5402312da3ba16d7344bfe930fc2131a84e29d94e9ed5b0900e9 not found: ID does not exist" containerID="85b46985717f5402312da3ba16d7344bfe930fc2131a84e29d94e9ed5b0900e9"
Mar 20 17:09:24 crc kubenswrapper[4730]: I0320 17:09:24.447373    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85b46985717f5402312da3ba16d7344bfe930fc2131a84e29d94e9ed5b0900e9"} err="failed to get container status \"85b46985717f5402312da3ba16d7344bfe930fc2131a84e29d94e9ed5b0900e9\": rpc error: code = NotFound desc = could not find container \"85b46985717f5402312da3ba16d7344bfe930fc2131a84e29d94e9ed5b0900e9\": container with ID starting with 85b46985717f5402312da3ba16d7344bfe930fc2131a84e29d94e9ed5b0900e9 not found: ID does not exist"
Mar 20 17:09:24 crc kubenswrapper[4730]: I0320 17:09:24.462577    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a7390a1-d2ba-45fd-86f5-173f814c93a9-utilities\") pod \"4a7390a1-d2ba-45fd-86f5-173f814c93a9\" (UID: \"4a7390a1-d2ba-45fd-86f5-173f814c93a9\") "
Mar 20 17:09:24 crc kubenswrapper[4730]: I0320 17:09:24.462722    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a7390a1-d2ba-45fd-86f5-173f814c93a9-catalog-content\") pod \"4a7390a1-d2ba-45fd-86f5-173f814c93a9\" (UID: \"4a7390a1-d2ba-45fd-86f5-173f814c93a9\") "
Mar 20 17:09:24 crc kubenswrapper[4730]: I0320 17:09:24.462768    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chjph\" (UniqueName: \"kubernetes.io/projected/4a7390a1-d2ba-45fd-86f5-173f814c93a9-kube-api-access-chjph\") pod \"4a7390a1-d2ba-45fd-86f5-173f814c93a9\" (UID: \"4a7390a1-d2ba-45fd-86f5-173f814c93a9\") "
Mar 20 17:09:24 crc kubenswrapper[4730]: I0320 17:09:24.463694    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a7390a1-d2ba-45fd-86f5-173f814c93a9-utilities" (OuterVolumeSpecName: "utilities") pod "4a7390a1-d2ba-45fd-86f5-173f814c93a9" (UID: "4a7390a1-d2ba-45fd-86f5-173f814c93a9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:09:24 crc kubenswrapper[4730]: I0320 17:09:24.469442    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a7390a1-d2ba-45fd-86f5-173f814c93a9-kube-api-access-chjph" (OuterVolumeSpecName: "kube-api-access-chjph") pod "4a7390a1-d2ba-45fd-86f5-173f814c93a9" (UID: "4a7390a1-d2ba-45fd-86f5-173f814c93a9"). InnerVolumeSpecName "kube-api-access-chjph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:09:24 crc kubenswrapper[4730]: I0320 17:09:24.565089    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chjph\" (UniqueName: \"kubernetes.io/projected/4a7390a1-d2ba-45fd-86f5-173f814c93a9-kube-api-access-chjph\") on node \"crc\" DevicePath \"\""
Mar 20 17:09:24 crc kubenswrapper[4730]: I0320 17:09:24.565443    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a7390a1-d2ba-45fd-86f5-173f814c93a9-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 17:09:24 crc kubenswrapper[4730]: I0320 17:09:24.583993    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a7390a1-d2ba-45fd-86f5-173f814c93a9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a7390a1-d2ba-45fd-86f5-173f814c93a9" (UID: "4a7390a1-d2ba-45fd-86f5-173f814c93a9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:09:24 crc kubenswrapper[4730]: I0320 17:09:24.673475    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a7390a1-d2ba-45fd-86f5-173f814c93a9-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 17:09:24 crc kubenswrapper[4730]: I0320 17:09:24.682494    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tzjmg"]
Mar 20 17:09:24 crc kubenswrapper[4730]: I0320 17:09:24.691820    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tzjmg"]
Mar 20 17:09:25 crc kubenswrapper[4730]: I0320 17:09:25.578368    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a7390a1-d2ba-45fd-86f5-173f814c93a9" path="/var/lib/kubelet/pods/4a7390a1-d2ba-45fd-86f5-173f814c93a9/volumes"
Mar 20 17:09:26 crc kubenswrapper[4730]: I0320 17:09:26.534104    4730 scope.go:117] "RemoveContainer" containerID="581aa02672219ffcaf2185db619f24c1c4e9f4dc71ab9a090d63ff524fc5bbca"
Mar 20 17:09:26 crc kubenswrapper[4730]: E0320 17:09:26.535190    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 17:09:40 crc kubenswrapper[4730]: I0320 17:09:40.532960    4730 scope.go:117] "RemoveContainer" containerID="581aa02672219ffcaf2185db619f24c1c4e9f4dc71ab9a090d63ff524fc5bbca"
Mar 20 17:09:40 crc kubenswrapper[4730]: E0320 17:09:40.534020    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 17:09:53 crc kubenswrapper[4730]: I0320 17:09:53.533675    4730 scope.go:117] "RemoveContainer" containerID="581aa02672219ffcaf2185db619f24c1c4e9f4dc71ab9a090d63ff524fc5bbca"
Mar 20 17:09:53 crc kubenswrapper[4730]: E0320 17:09:53.535077    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 17:09:55 crc kubenswrapper[4730]: I0320 17:09:55.004324    4730 scope.go:117] "RemoveContainer" containerID="7cfa3c4c40647b5e6e0d0655d8ae502be7742a62b0dcf48099fd5c14403c3160"
Mar 20 17:09:55 crc kubenswrapper[4730]: I0320 17:09:55.733557    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dg9kw"]
Mar 20 17:09:55 crc kubenswrapper[4730]: E0320 17:09:55.734522    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a7390a1-d2ba-45fd-86f5-173f814c93a9" containerName="extract-utilities"
Mar 20 17:09:55 crc kubenswrapper[4730]: I0320 17:09:55.734546    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a7390a1-d2ba-45fd-86f5-173f814c93a9" containerName="extract-utilities"
Mar 20 17:09:55 crc kubenswrapper[4730]: E0320 17:09:55.734581    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a7390a1-d2ba-45fd-86f5-173f814c93a9" containerName="extract-content"
Mar 20 17:09:55 crc kubenswrapper[4730]: I0320 17:09:55.734594    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a7390a1-d2ba-45fd-86f5-173f814c93a9" containerName="extract-content"
Mar 20 17:09:55 crc kubenswrapper[4730]: E0320 17:09:55.734618    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a7390a1-d2ba-45fd-86f5-173f814c93a9" containerName="registry-server"
Mar 20 17:09:55 crc kubenswrapper[4730]: I0320 17:09:55.734630    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a7390a1-d2ba-45fd-86f5-173f814c93a9" containerName="registry-server"
Mar 20 17:09:55 crc kubenswrapper[4730]: I0320 17:09:55.734895    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a7390a1-d2ba-45fd-86f5-173f814c93a9" containerName="registry-server"
Mar 20 17:09:55 crc kubenswrapper[4730]: I0320 17:09:55.736934    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dg9kw"
Mar 20 17:09:55 crc kubenswrapper[4730]: I0320 17:09:55.747861    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dg9kw"]
Mar 20 17:09:55 crc kubenswrapper[4730]: I0320 17:09:55.786682    4730 generic.go:334] "Generic (PLEG): container finished" podID="4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d" containerID="5d6c90f5aec42032ef5d1044f85f5311f53e8138aa56ffcadeca5419bfb41e2e" exitCode=0
Mar 20 17:09:55 crc kubenswrapper[4730]: I0320 17:09:55.786722    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zs57x/must-gather-nxsd6" event={"ID":"4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d","Type":"ContainerDied","Data":"5d6c90f5aec42032ef5d1044f85f5311f53e8138aa56ffcadeca5419bfb41e2e"}
Mar 20 17:09:55 crc kubenswrapper[4730]: I0320 17:09:55.787617    4730 scope.go:117] "RemoveContainer" containerID="5d6c90f5aec42032ef5d1044f85f5311f53e8138aa56ffcadeca5419bfb41e2e"
Mar 20 17:09:55 crc kubenswrapper[4730]: I0320 17:09:55.888453    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/241b7a69-ba82-4fbd-afd9-edc9cab27f9d-utilities\") pod \"certified-operators-dg9kw\" (UID: \"241b7a69-ba82-4fbd-afd9-edc9cab27f9d\") " pod="openshift-marketplace/certified-operators-dg9kw"
Mar 20 17:09:55 crc kubenswrapper[4730]: I0320 17:09:55.888713    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r58b8\" (UniqueName: \"kubernetes.io/projected/241b7a69-ba82-4fbd-afd9-edc9cab27f9d-kube-api-access-r58b8\") pod \"certified-operators-dg9kw\" (UID: \"241b7a69-ba82-4fbd-afd9-edc9cab27f9d\") " pod="openshift-marketplace/certified-operators-dg9kw"
Mar 20 17:09:55 crc kubenswrapper[4730]: I0320 17:09:55.888889    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/241b7a69-ba82-4fbd-afd9-edc9cab27f9d-catalog-content\") pod \"certified-operators-dg9kw\" (UID: \"241b7a69-ba82-4fbd-afd9-edc9cab27f9d\") " pod="openshift-marketplace/certified-operators-dg9kw"
Mar 20 17:09:55 crc kubenswrapper[4730]: I0320 17:09:55.991193    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r58b8\" (UniqueName: \"kubernetes.io/projected/241b7a69-ba82-4fbd-afd9-edc9cab27f9d-kube-api-access-r58b8\") pod \"certified-operators-dg9kw\" (UID: \"241b7a69-ba82-4fbd-afd9-edc9cab27f9d\") " pod="openshift-marketplace/certified-operators-dg9kw"
Mar 20 17:09:55 crc kubenswrapper[4730]: I0320 17:09:55.991357    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/241b7a69-ba82-4fbd-afd9-edc9cab27f9d-catalog-content\") pod \"certified-operators-dg9kw\" (UID: \"241b7a69-ba82-4fbd-afd9-edc9cab27f9d\") " pod="openshift-marketplace/certified-operators-dg9kw"
Mar 20 17:09:55 crc kubenswrapper[4730]: I0320 17:09:55.991448    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/241b7a69-ba82-4fbd-afd9-edc9cab27f9d-utilities\") pod \"certified-operators-dg9kw\" (UID: \"241b7a69-ba82-4fbd-afd9-edc9cab27f9d\") " pod="openshift-marketplace/certified-operators-dg9kw"
Mar 20 17:09:55 crc kubenswrapper[4730]: I0320 17:09:55.991820    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/241b7a69-ba82-4fbd-afd9-edc9cab27f9d-catalog-content\") pod \"certified-operators-dg9kw\" (UID: \"241b7a69-ba82-4fbd-afd9-edc9cab27f9d\") " pod="openshift-marketplace/certified-operators-dg9kw"
Mar 20 17:09:55 crc kubenswrapper[4730]: I0320 17:09:55.991864    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/241b7a69-ba82-4fbd-afd9-edc9cab27f9d-utilities\") pod \"certified-operators-dg9kw\" (UID: \"241b7a69-ba82-4fbd-afd9-edc9cab27f9d\") " pod="openshift-marketplace/certified-operators-dg9kw"
Mar 20 17:09:56 crc kubenswrapper[4730]: I0320 17:09:56.016012    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r58b8\" (UniqueName: \"kubernetes.io/projected/241b7a69-ba82-4fbd-afd9-edc9cab27f9d-kube-api-access-r58b8\") pod \"certified-operators-dg9kw\" (UID: \"241b7a69-ba82-4fbd-afd9-edc9cab27f9d\") " pod="openshift-marketplace/certified-operators-dg9kw"
Mar 20 17:09:56 crc kubenswrapper[4730]: I0320 17:09:56.063494    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dg9kw"
Mar 20 17:09:56 crc kubenswrapper[4730]: I0320 17:09:56.714002    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dg9kw"]
Mar 20 17:09:56 crc kubenswrapper[4730]: I0320 17:09:56.798865    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dg9kw" event={"ID":"241b7a69-ba82-4fbd-afd9-edc9cab27f9d","Type":"ContainerStarted","Data":"f937314b5ce57ec36a9a4339d1db04b7a002d434849520857f0973e48a5279bd"}
Mar 20 17:09:56 crc kubenswrapper[4730]: I0320 17:09:56.809239    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zs57x_must-gather-nxsd6_4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d/gather/0.log"
Mar 20 17:09:57 crc kubenswrapper[4730]: I0320 17:09:57.813558    4730 generic.go:334] "Generic (PLEG): container finished" podID="241b7a69-ba82-4fbd-afd9-edc9cab27f9d" containerID="0dc78a3fe7ef0500ac5343944f1eb8adf85cffd267d7b92eb1e3793b7c2af604" exitCode=0
Mar 20 17:09:57 crc kubenswrapper[4730]: I0320 17:09:57.813993    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dg9kw" event={"ID":"241b7a69-ba82-4fbd-afd9-edc9cab27f9d","Type":"ContainerDied","Data":"0dc78a3fe7ef0500ac5343944f1eb8adf85cffd267d7b92eb1e3793b7c2af604"}
Mar 20 17:09:57 crc kubenswrapper[4730]: I0320 17:09:57.816776    4730 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 17:09:58 crc kubenswrapper[4730]: I0320 17:09:58.842595    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dg9kw" event={"ID":"241b7a69-ba82-4fbd-afd9-edc9cab27f9d","Type":"ContainerStarted","Data":"ad765e8c581dc65afc67f5111a021207a44884a645e6cb7483821431a43439f9"}
Mar 20 17:10:00 crc kubenswrapper[4730]: I0320 17:10:00.159827    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567110-n8lgt"]
Mar 20 17:10:00 crc kubenswrapper[4730]: I0320 17:10:00.162953    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567110-n8lgt"
Mar 20 17:10:00 crc kubenswrapper[4730]: I0320 17:10:00.168612    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567110-n8lgt"]
Mar 20 17:10:00 crc kubenswrapper[4730]: I0320 17:10:00.226683    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 17:10:00 crc kubenswrapper[4730]: I0320 17:10:00.226716    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 17:10:00 crc kubenswrapper[4730]: I0320 17:10:00.229461    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 17:10:00 crc kubenswrapper[4730]: I0320 17:10:00.331605    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpckb\" (UniqueName: \"kubernetes.io/projected/fe10df03-e254-4075-9487-78370bdbdf87-kube-api-access-tpckb\") pod \"auto-csr-approver-29567110-n8lgt\" (UID: \"fe10df03-e254-4075-9487-78370bdbdf87\") " pod="openshift-infra/auto-csr-approver-29567110-n8lgt"
Mar 20 17:10:00 crc kubenswrapper[4730]: I0320 17:10:00.434048    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpckb\" (UniqueName: \"kubernetes.io/projected/fe10df03-e254-4075-9487-78370bdbdf87-kube-api-access-tpckb\") pod \"auto-csr-approver-29567110-n8lgt\" (UID: \"fe10df03-e254-4075-9487-78370bdbdf87\") " pod="openshift-infra/auto-csr-approver-29567110-n8lgt"
Mar 20 17:10:00 crc kubenswrapper[4730]: I0320 17:10:00.462156    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpckb\" (UniqueName: \"kubernetes.io/projected/fe10df03-e254-4075-9487-78370bdbdf87-kube-api-access-tpckb\") pod \"auto-csr-approver-29567110-n8lgt\" (UID: \"fe10df03-e254-4075-9487-78370bdbdf87\") " pod="openshift-infra/auto-csr-approver-29567110-n8lgt"
Mar 20 17:10:00 crc kubenswrapper[4730]: I0320 17:10:00.560070    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567110-n8lgt"
Mar 20 17:10:00 crc kubenswrapper[4730]: I0320 17:10:00.864723    4730 generic.go:334] "Generic (PLEG): container finished" podID="241b7a69-ba82-4fbd-afd9-edc9cab27f9d" containerID="ad765e8c581dc65afc67f5111a021207a44884a645e6cb7483821431a43439f9" exitCode=0
Mar 20 17:10:00 crc kubenswrapper[4730]: I0320 17:10:00.864776    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dg9kw" event={"ID":"241b7a69-ba82-4fbd-afd9-edc9cab27f9d","Type":"ContainerDied","Data":"ad765e8c581dc65afc67f5111a021207a44884a645e6cb7483821431a43439f9"}
Mar 20 17:10:01 crc kubenswrapper[4730]: I0320 17:10:01.093799    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567110-n8lgt"]
Mar 20 17:10:01 crc kubenswrapper[4730]: W0320 17:10:01.104281    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe10df03_e254_4075_9487_78370bdbdf87.slice/crio-72ad172115d00ed0027c14d609644a41e503a9fc6c051d5d3569a5ad175ed0ad WatchSource:0}: Error finding container 72ad172115d00ed0027c14d609644a41e503a9fc6c051d5d3569a5ad175ed0ad: Status 404 returned error can't find the container with id 72ad172115d00ed0027c14d609644a41e503a9fc6c051d5d3569a5ad175ed0ad
Mar 20 17:10:01 crc kubenswrapper[4730]: I0320 17:10:01.882560    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567110-n8lgt" event={"ID":"fe10df03-e254-4075-9487-78370bdbdf87","Type":"ContainerStarted","Data":"72ad172115d00ed0027c14d609644a41e503a9fc6c051d5d3569a5ad175ed0ad"}
Mar 20 17:10:02 crc kubenswrapper[4730]: I0320 17:10:02.892237    4730 generic.go:334] "Generic (PLEG): container finished" podID="fe10df03-e254-4075-9487-78370bdbdf87" containerID="2423acdd77361bd88556503f198911c6f5c67615d5fb3a870a16f2bcad32e4e8" exitCode=0
Mar 20 17:10:02 crc kubenswrapper[4730]: I0320 17:10:02.892349    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567110-n8lgt" event={"ID":"fe10df03-e254-4075-9487-78370bdbdf87","Type":"ContainerDied","Data":"2423acdd77361bd88556503f198911c6f5c67615d5fb3a870a16f2bcad32e4e8"}
Mar 20 17:10:02 crc kubenswrapper[4730]: I0320 17:10:02.895085    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dg9kw" event={"ID":"241b7a69-ba82-4fbd-afd9-edc9cab27f9d","Type":"ContainerStarted","Data":"9d7e5bf4b6a562f1ed448cda6789c8a058112f6f09b47cec2a321e00fe2d25e9"}
Mar 20 17:10:02 crc kubenswrapper[4730]: I0320 17:10:02.936937    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dg9kw" podStartSLOduration=4.44623531 podStartE2EDuration="7.936920314s" podCreationTimestamp="2026-03-20 17:09:55 +0000 UTC" firstStartedPulling="2026-03-20 17:09:57.816448456 +0000 UTC m=+5457.029819825" lastFinishedPulling="2026-03-20 17:10:01.30713346 +0000 UTC m=+5460.520504829" observedRunningTime="2026-03-20 17:10:02.928155814 +0000 UTC m=+5462.141527183" watchObservedRunningTime="2026-03-20 17:10:02.936920314 +0000 UTC m=+5462.150291683"
Mar 20 17:10:04 crc kubenswrapper[4730]: I0320 17:10:04.532488    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567110-n8lgt"
Mar 20 17:10:04 crc kubenswrapper[4730]: I0320 17:10:04.730823    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpckb\" (UniqueName: \"kubernetes.io/projected/fe10df03-e254-4075-9487-78370bdbdf87-kube-api-access-tpckb\") pod \"fe10df03-e254-4075-9487-78370bdbdf87\" (UID: \"fe10df03-e254-4075-9487-78370bdbdf87\") "
Mar 20 17:10:04 crc kubenswrapper[4730]: I0320 17:10:04.736615    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe10df03-e254-4075-9487-78370bdbdf87-kube-api-access-tpckb" (OuterVolumeSpecName: "kube-api-access-tpckb") pod "fe10df03-e254-4075-9487-78370bdbdf87" (UID: "fe10df03-e254-4075-9487-78370bdbdf87"). InnerVolumeSpecName "kube-api-access-tpckb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:10:04 crc kubenswrapper[4730]: I0320 17:10:04.834478    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpckb\" (UniqueName: \"kubernetes.io/projected/fe10df03-e254-4075-9487-78370bdbdf87-kube-api-access-tpckb\") on node \"crc\" DevicePath \"\""
Mar 20 17:10:04 crc kubenswrapper[4730]: I0320 17:10:04.925777    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567110-n8lgt" event={"ID":"fe10df03-e254-4075-9487-78370bdbdf87","Type":"ContainerDied","Data":"72ad172115d00ed0027c14d609644a41e503a9fc6c051d5d3569a5ad175ed0ad"}
Mar 20 17:10:04 crc kubenswrapper[4730]: I0320 17:10:04.925865    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72ad172115d00ed0027c14d609644a41e503a9fc6c051d5d3569a5ad175ed0ad"
Mar 20 17:10:04 crc kubenswrapper[4730]: I0320 17:10:04.925814    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567110-n8lgt"
Mar 20 17:10:05 crc kubenswrapper[4730]: I0320 17:10:05.609880    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567104-xfdqf"]
Mar 20 17:10:05 crc kubenswrapper[4730]: I0320 17:10:05.626647    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567104-xfdqf"]
Mar 20 17:10:06 crc kubenswrapper[4730]: I0320 17:10:06.033106    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zs57x/must-gather-nxsd6"]
Mar 20 17:10:06 crc kubenswrapper[4730]: I0320 17:10:06.033380    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-zs57x/must-gather-nxsd6" podUID="4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d" containerName="copy" containerID="cri-o://5f9bd62b83471aba96f37df030e3e1c5b574f8d8437b38bdf436e67eecd3f71d" gracePeriod=2
Mar 20 17:10:06 crc kubenswrapper[4730]: I0320 17:10:06.045581    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zs57x/must-gather-nxsd6"]
Mar 20 17:10:06 crc kubenswrapper[4730]: I0320 17:10:06.063959    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dg9kw"
Mar 20 17:10:06 crc kubenswrapper[4730]: I0320 17:10:06.064016    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dg9kw"
Mar 20 17:10:06 crc kubenswrapper[4730]: I0320 17:10:06.110733    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dg9kw"
Mar 20 17:10:06 crc kubenswrapper[4730]: I0320 17:10:06.533351    4730 scope.go:117] "RemoveContainer" containerID="581aa02672219ffcaf2185db619f24c1c4e9f4dc71ab9a090d63ff524fc5bbca"
Mar 20 17:10:06 crc kubenswrapper[4730]: E0320 17:10:06.534367    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 17:10:06 crc kubenswrapper[4730]: I0320 17:10:06.599502    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zs57x_must-gather-nxsd6_4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d/copy/0.log"
Mar 20 17:10:06 crc kubenswrapper[4730]: I0320 17:10:06.599899    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zs57x/must-gather-nxsd6"
Mar 20 17:10:06 crc kubenswrapper[4730]: I0320 17:10:06.769141    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d-must-gather-output\") pod \"4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d\" (UID: \"4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d\") "
Mar 20 17:10:06 crc kubenswrapper[4730]: I0320 17:10:06.769207    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrzpq\" (UniqueName: \"kubernetes.io/projected/4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d-kube-api-access-rrzpq\") pod \"4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d\" (UID: \"4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d\") "
Mar 20 17:10:06 crc kubenswrapper[4730]: I0320 17:10:06.776654    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d-kube-api-access-rrzpq" (OuterVolumeSpecName: "kube-api-access-rrzpq") pod "4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d" (UID: "4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d"). InnerVolumeSpecName "kube-api-access-rrzpq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:10:06 crc kubenswrapper[4730]: I0320 17:10:06.871859    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrzpq\" (UniqueName: \"kubernetes.io/projected/4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d-kube-api-access-rrzpq\") on node \"crc\" DevicePath \"\""
Mar 20 17:10:06 crc kubenswrapper[4730]: I0320 17:10:06.946900    4730 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zs57x_must-gather-nxsd6_4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d/copy/0.log"
Mar 20 17:10:06 crc kubenswrapper[4730]: I0320 17:10:06.947270    4730 generic.go:334] "Generic (PLEG): container finished" podID="4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d" containerID="5f9bd62b83471aba96f37df030e3e1c5b574f8d8437b38bdf436e67eecd3f71d" exitCode=143
Mar 20 17:10:06 crc kubenswrapper[4730]: I0320 17:10:06.947400    4730 scope.go:117] "RemoveContainer" containerID="5f9bd62b83471aba96f37df030e3e1c5b574f8d8437b38bdf436e67eecd3f71d"
Mar 20 17:10:06 crc kubenswrapper[4730]: I0320 17:10:06.947354    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zs57x/must-gather-nxsd6"
Mar 20 17:10:06 crc kubenswrapper[4730]: I0320 17:10:06.959877    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d" (UID: "4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:10:06 crc kubenswrapper[4730]: I0320 17:10:06.968439    4730 scope.go:117] "RemoveContainer" containerID="5d6c90f5aec42032ef5d1044f85f5311f53e8138aa56ffcadeca5419bfb41e2e"
Mar 20 17:10:06 crc kubenswrapper[4730]: I0320 17:10:06.973317    4730 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d-must-gather-output\") on node \"crc\" DevicePath \"\""
Mar 20 17:10:07 crc kubenswrapper[4730]: I0320 17:10:07.049291    4730 scope.go:117] "RemoveContainer" containerID="5f9bd62b83471aba96f37df030e3e1c5b574f8d8437b38bdf436e67eecd3f71d"
Mar 20 17:10:07 crc kubenswrapper[4730]: E0320 17:10:07.049820    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f9bd62b83471aba96f37df030e3e1c5b574f8d8437b38bdf436e67eecd3f71d\": container with ID starting with 5f9bd62b83471aba96f37df030e3e1c5b574f8d8437b38bdf436e67eecd3f71d not found: ID does not exist" containerID="5f9bd62b83471aba96f37df030e3e1c5b574f8d8437b38bdf436e67eecd3f71d"
Mar 20 17:10:07 crc kubenswrapper[4730]: I0320 17:10:07.049866    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f9bd62b83471aba96f37df030e3e1c5b574f8d8437b38bdf436e67eecd3f71d"} err="failed to get container status \"5f9bd62b83471aba96f37df030e3e1c5b574f8d8437b38bdf436e67eecd3f71d\": rpc error: code = NotFound desc = could not find container \"5f9bd62b83471aba96f37df030e3e1c5b574f8d8437b38bdf436e67eecd3f71d\": container with ID starting with 5f9bd62b83471aba96f37df030e3e1c5b574f8d8437b38bdf436e67eecd3f71d not found: ID does not exist"
Mar 20 17:10:07 crc kubenswrapper[4730]: I0320 17:10:07.049898    4730 scope.go:117] "RemoveContainer" containerID="5d6c90f5aec42032ef5d1044f85f5311f53e8138aa56ffcadeca5419bfb41e2e"
Mar 20 17:10:07 crc kubenswrapper[4730]: E0320 17:10:07.050351    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d6c90f5aec42032ef5d1044f85f5311f53e8138aa56ffcadeca5419bfb41e2e\": container with ID starting with 5d6c90f5aec42032ef5d1044f85f5311f53e8138aa56ffcadeca5419bfb41e2e not found: ID does not exist" containerID="5d6c90f5aec42032ef5d1044f85f5311f53e8138aa56ffcadeca5419bfb41e2e"
Mar 20 17:10:07 crc kubenswrapper[4730]: I0320 17:10:07.050387    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d6c90f5aec42032ef5d1044f85f5311f53e8138aa56ffcadeca5419bfb41e2e"} err="failed to get container status \"5d6c90f5aec42032ef5d1044f85f5311f53e8138aa56ffcadeca5419bfb41e2e\": rpc error: code = NotFound desc = could not find container \"5d6c90f5aec42032ef5d1044f85f5311f53e8138aa56ffcadeca5419bfb41e2e\": container with ID starting with 5d6c90f5aec42032ef5d1044f85f5311f53e8138aa56ffcadeca5419bfb41e2e not found: ID does not exist"
Mar 20 17:10:07 crc kubenswrapper[4730]: I0320 17:10:07.547697    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d" path="/var/lib/kubelet/pods/4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d/volumes"
Mar 20 17:10:07 crc kubenswrapper[4730]: I0320 17:10:07.548783    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb4bb26e-8780-490e-b7d4-5068d41079d5" path="/var/lib/kubelet/pods/cb4bb26e-8780-490e-b7d4-5068d41079d5/volumes"
Mar 20 17:10:16 crc kubenswrapper[4730]: I0320 17:10:16.383585    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dg9kw"
Mar 20 17:10:16 crc kubenswrapper[4730]: I0320 17:10:16.449982    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dg9kw"]
Mar 20 17:10:17 crc kubenswrapper[4730]: I0320 17:10:17.068776    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dg9kw" podUID="241b7a69-ba82-4fbd-afd9-edc9cab27f9d" containerName="registry-server" containerID="cri-o://9d7e5bf4b6a562f1ed448cda6789c8a058112f6f09b47cec2a321e00fe2d25e9" gracePeriod=2
Mar 20 17:10:17 crc kubenswrapper[4730]: I0320 17:10:17.607960    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dg9kw"
Mar 20 17:10:17 crc kubenswrapper[4730]: I0320 17:10:17.629115    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/241b7a69-ba82-4fbd-afd9-edc9cab27f9d-catalog-content\") pod \"241b7a69-ba82-4fbd-afd9-edc9cab27f9d\" (UID: \"241b7a69-ba82-4fbd-afd9-edc9cab27f9d\") "
Mar 20 17:10:17 crc kubenswrapper[4730]: I0320 17:10:17.629272    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r58b8\" (UniqueName: \"kubernetes.io/projected/241b7a69-ba82-4fbd-afd9-edc9cab27f9d-kube-api-access-r58b8\") pod \"241b7a69-ba82-4fbd-afd9-edc9cab27f9d\" (UID: \"241b7a69-ba82-4fbd-afd9-edc9cab27f9d\") "
Mar 20 17:10:17 crc kubenswrapper[4730]: I0320 17:10:17.629306    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/241b7a69-ba82-4fbd-afd9-edc9cab27f9d-utilities\") pod \"241b7a69-ba82-4fbd-afd9-edc9cab27f9d\" (UID: \"241b7a69-ba82-4fbd-afd9-edc9cab27f9d\") "
Mar 20 17:10:17 crc kubenswrapper[4730]: I0320 17:10:17.630659    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/241b7a69-ba82-4fbd-afd9-edc9cab27f9d-utilities" (OuterVolumeSpecName: "utilities") pod "241b7a69-ba82-4fbd-afd9-edc9cab27f9d" (UID: "241b7a69-ba82-4fbd-afd9-edc9cab27f9d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:10:17 crc kubenswrapper[4730]: I0320 17:10:17.637177    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/241b7a69-ba82-4fbd-afd9-edc9cab27f9d-kube-api-access-r58b8" (OuterVolumeSpecName: "kube-api-access-r58b8") pod "241b7a69-ba82-4fbd-afd9-edc9cab27f9d" (UID: "241b7a69-ba82-4fbd-afd9-edc9cab27f9d"). InnerVolumeSpecName "kube-api-access-r58b8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:10:17 crc kubenswrapper[4730]: I0320 17:10:17.695781    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/241b7a69-ba82-4fbd-afd9-edc9cab27f9d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "241b7a69-ba82-4fbd-afd9-edc9cab27f9d" (UID: "241b7a69-ba82-4fbd-afd9-edc9cab27f9d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:10:17 crc kubenswrapper[4730]: I0320 17:10:17.731954    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/241b7a69-ba82-4fbd-afd9-edc9cab27f9d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 17:10:17 crc kubenswrapper[4730]: I0320 17:10:17.732060    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r58b8\" (UniqueName: \"kubernetes.io/projected/241b7a69-ba82-4fbd-afd9-edc9cab27f9d-kube-api-access-r58b8\") on node \"crc\" DevicePath \"\""
Mar 20 17:10:17 crc kubenswrapper[4730]: I0320 17:10:17.732120    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/241b7a69-ba82-4fbd-afd9-edc9cab27f9d-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 17:10:18 crc kubenswrapper[4730]: I0320 17:10:18.081603    4730 generic.go:334] "Generic (PLEG): container finished" podID="241b7a69-ba82-4fbd-afd9-edc9cab27f9d" containerID="9d7e5bf4b6a562f1ed448cda6789c8a058112f6f09b47cec2a321e00fe2d25e9" exitCode=0
Mar 20 17:10:18 crc kubenswrapper[4730]: I0320 17:10:18.081672    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dg9kw" event={"ID":"241b7a69-ba82-4fbd-afd9-edc9cab27f9d","Type":"ContainerDied","Data":"9d7e5bf4b6a562f1ed448cda6789c8a058112f6f09b47cec2a321e00fe2d25e9"}
Mar 20 17:10:18 crc kubenswrapper[4730]: I0320 17:10:18.082012    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dg9kw" event={"ID":"241b7a69-ba82-4fbd-afd9-edc9cab27f9d","Type":"ContainerDied","Data":"f937314b5ce57ec36a9a4339d1db04b7a002d434849520857f0973e48a5279bd"}
Mar 20 17:10:18 crc kubenswrapper[4730]: I0320 17:10:18.081713    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dg9kw"
Mar 20 17:10:18 crc kubenswrapper[4730]: I0320 17:10:18.082082    4730 scope.go:117] "RemoveContainer" containerID="9d7e5bf4b6a562f1ed448cda6789c8a058112f6f09b47cec2a321e00fe2d25e9"
Mar 20 17:10:18 crc kubenswrapper[4730]: I0320 17:10:18.120530    4730 scope.go:117] "RemoveContainer" containerID="ad765e8c581dc65afc67f5111a021207a44884a645e6cb7483821431a43439f9"
Mar 20 17:10:18 crc kubenswrapper[4730]: I0320 17:10:18.133205    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dg9kw"]
Mar 20 17:10:18 crc kubenswrapper[4730]: I0320 17:10:18.144356    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dg9kw"]
Mar 20 17:10:18 crc kubenswrapper[4730]: I0320 17:10:18.147812    4730 scope.go:117] "RemoveContainer" containerID="0dc78a3fe7ef0500ac5343944f1eb8adf85cffd267d7b92eb1e3793b7c2af604"
Mar 20 17:10:18 crc kubenswrapper[4730]: I0320 17:10:18.216096    4730 scope.go:117] "RemoveContainer" containerID="9d7e5bf4b6a562f1ed448cda6789c8a058112f6f09b47cec2a321e00fe2d25e9"
Mar 20 17:10:18 crc kubenswrapper[4730]: E0320 17:10:18.216756    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d7e5bf4b6a562f1ed448cda6789c8a058112f6f09b47cec2a321e00fe2d25e9\": container with ID starting with 9d7e5bf4b6a562f1ed448cda6789c8a058112f6f09b47cec2a321e00fe2d25e9 not found: ID does not exist" containerID="9d7e5bf4b6a562f1ed448cda6789c8a058112f6f09b47cec2a321e00fe2d25e9"
Mar 20 17:10:18 crc kubenswrapper[4730]: I0320 17:10:18.216805    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d7e5bf4b6a562f1ed448cda6789c8a058112f6f09b47cec2a321e00fe2d25e9"} err="failed to get container status \"9d7e5bf4b6a562f1ed448cda6789c8a058112f6f09b47cec2a321e00fe2d25e9\": rpc error: code = NotFound desc = could not find container \"9d7e5bf4b6a562f1ed448cda6789c8a058112f6f09b47cec2a321e00fe2d25e9\": container with ID starting with 9d7e5bf4b6a562f1ed448cda6789c8a058112f6f09b47cec2a321e00fe2d25e9 not found: ID does not exist"
Mar 20 17:10:18 crc kubenswrapper[4730]: I0320 17:10:18.216837    4730 scope.go:117] "RemoveContainer" containerID="ad765e8c581dc65afc67f5111a021207a44884a645e6cb7483821431a43439f9"
Mar 20 17:10:18 crc kubenswrapper[4730]: E0320 17:10:18.217208    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad765e8c581dc65afc67f5111a021207a44884a645e6cb7483821431a43439f9\": container with ID starting with ad765e8c581dc65afc67f5111a021207a44884a645e6cb7483821431a43439f9 not found: ID does not exist" containerID="ad765e8c581dc65afc67f5111a021207a44884a645e6cb7483821431a43439f9"
Mar 20 17:10:18 crc kubenswrapper[4730]: I0320 17:10:18.217239    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad765e8c581dc65afc67f5111a021207a44884a645e6cb7483821431a43439f9"} err="failed to get container status \"ad765e8c581dc65afc67f5111a021207a44884a645e6cb7483821431a43439f9\": rpc error: code = NotFound desc = could not find container \"ad765e8c581dc65afc67f5111a021207a44884a645e6cb7483821431a43439f9\": container with ID starting with ad765e8c581dc65afc67f5111a021207a44884a645e6cb7483821431a43439f9 not found: ID does not exist"
Mar 20 17:10:18 crc kubenswrapper[4730]: I0320 17:10:18.217272    4730 scope.go:117] "RemoveContainer" containerID="0dc78a3fe7ef0500ac5343944f1eb8adf85cffd267d7b92eb1e3793b7c2af604"
Mar 20 17:10:18 crc kubenswrapper[4730]: E0320 17:10:18.217551    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dc78a3fe7ef0500ac5343944f1eb8adf85cffd267d7b92eb1e3793b7c2af604\": container with ID starting with 0dc78a3fe7ef0500ac5343944f1eb8adf85cffd267d7b92eb1e3793b7c2af604 not found: ID does not exist" containerID="0dc78a3fe7ef0500ac5343944f1eb8adf85cffd267d7b92eb1e3793b7c2af604"
Mar 20 17:10:18 crc kubenswrapper[4730]: I0320 17:10:18.217590    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dc78a3fe7ef0500ac5343944f1eb8adf85cffd267d7b92eb1e3793b7c2af604"} err="failed to get container status \"0dc78a3fe7ef0500ac5343944f1eb8adf85cffd267d7b92eb1e3793b7c2af604\": rpc error: code = NotFound desc = could not find container \"0dc78a3fe7ef0500ac5343944f1eb8adf85cffd267d7b92eb1e3793b7c2af604\": container with ID starting with 0dc78a3fe7ef0500ac5343944f1eb8adf85cffd267d7b92eb1e3793b7c2af604 not found: ID does not exist"
Mar 20 17:10:19 crc kubenswrapper[4730]: I0320 17:10:19.554853    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="241b7a69-ba82-4fbd-afd9-edc9cab27f9d" path="/var/lib/kubelet/pods/241b7a69-ba82-4fbd-afd9-edc9cab27f9d/volumes"
Mar 20 17:10:21 crc kubenswrapper[4730]: I0320 17:10:21.548105    4730 scope.go:117] "RemoveContainer" containerID="581aa02672219ffcaf2185db619f24c1c4e9f4dc71ab9a090d63ff524fc5bbca"
Mar 20 17:10:21 crc kubenswrapper[4730]: E0320 17:10:21.548634    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 17:10:34 crc kubenswrapper[4730]: I0320 17:10:34.534497    4730 scope.go:117] "RemoveContainer" containerID="581aa02672219ffcaf2185db619f24c1c4e9f4dc71ab9a090d63ff524fc5bbca"
Mar 20 17:10:34 crc kubenswrapper[4730]: E0320 17:10:34.535713    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 17:10:45 crc kubenswrapper[4730]: I0320 17:10:45.533660    4730 scope.go:117] "RemoveContainer" containerID="581aa02672219ffcaf2185db619f24c1c4e9f4dc71ab9a090d63ff524fc5bbca"
Mar 20 17:10:45 crc kubenswrapper[4730]: E0320 17:10:45.534407    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 17:10:55 crc kubenswrapper[4730]: I0320 17:10:55.108882    4730 scope.go:117] "RemoveContainer" containerID="570fedc27114561def00c9fd45b45e729130f1ccdb2c50a188548ae1020d9f83"
Mar 20 17:10:55 crc kubenswrapper[4730]: I0320 17:10:55.166875    4730 scope.go:117] "RemoveContainer" containerID="f2a5b11498b3565ac584ae0844e2ddcd32e123d95c3d6fe4c6c0d6653ce556a9"
Mar 20 17:11:00 crc kubenswrapper[4730]: I0320 17:11:00.533370    4730 scope.go:117] "RemoveContainer" containerID="581aa02672219ffcaf2185db619f24c1c4e9f4dc71ab9a090d63ff524fc5bbca"
Mar 20 17:11:00 crc kubenswrapper[4730]: E0320 17:11:00.534683    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 17:11:15 crc kubenswrapper[4730]: I0320 17:11:15.534395    4730 scope.go:117] "RemoveContainer" containerID="581aa02672219ffcaf2185db619f24c1c4e9f4dc71ab9a090d63ff524fc5bbca"
Mar 20 17:11:15 crc kubenswrapper[4730]: E0320 17:11:15.535340    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 17:11:26 crc kubenswrapper[4730]: I0320 17:11:26.533876    4730 scope.go:117] "RemoveContainer" containerID="581aa02672219ffcaf2185db619f24c1c4e9f4dc71ab9a090d63ff524fc5bbca"
Mar 20 17:11:26 crc kubenswrapper[4730]: E0320 17:11:26.534871    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 17:11:40 crc kubenswrapper[4730]: I0320 17:11:40.534495    4730 scope.go:117] "RemoveContainer" containerID="581aa02672219ffcaf2185db619f24c1c4e9f4dc71ab9a090d63ff524fc5bbca"
Mar 20 17:11:40 crc kubenswrapper[4730]: E0320 17:11:40.538501    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 17:11:54 crc kubenswrapper[4730]: I0320 17:11:54.532864    4730 scope.go:117] "RemoveContainer" containerID="581aa02672219ffcaf2185db619f24c1c4e9f4dc71ab9a090d63ff524fc5bbca"
Mar 20 17:11:54 crc kubenswrapper[4730]: E0320 17:11:54.533699    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 17:12:00 crc kubenswrapper[4730]: I0320 17:12:00.158492    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567112-ft78g"]
Mar 20 17:12:00 crc kubenswrapper[4730]: E0320 17:12:00.161952    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d" containerName="gather"
Mar 20 17:12:00 crc kubenswrapper[4730]: I0320 17:12:00.162087    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d" containerName="gather"
Mar 20 17:12:00 crc kubenswrapper[4730]: E0320 17:12:00.162204    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d" containerName="copy"
Mar 20 17:12:00 crc kubenswrapper[4730]: I0320 17:12:00.162342    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d" containerName="copy"
Mar 20 17:12:00 crc kubenswrapper[4730]: E0320 17:12:00.162463    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="241b7a69-ba82-4fbd-afd9-edc9cab27f9d" containerName="extract-content"
Mar 20 17:12:00 crc kubenswrapper[4730]: I0320 17:12:00.162566    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="241b7a69-ba82-4fbd-afd9-edc9cab27f9d" containerName="extract-content"
Mar 20 17:12:00 crc kubenswrapper[4730]: E0320 17:12:00.162698    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="241b7a69-ba82-4fbd-afd9-edc9cab27f9d" containerName="extract-utilities"
Mar 20 17:12:00 crc kubenswrapper[4730]: I0320 17:12:00.162794    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="241b7a69-ba82-4fbd-afd9-edc9cab27f9d" containerName="extract-utilities"
Mar 20 17:12:00 crc kubenswrapper[4730]: E0320 17:12:00.162919    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe10df03-e254-4075-9487-78370bdbdf87" containerName="oc"
Mar 20 17:12:00 crc kubenswrapper[4730]: I0320 17:12:00.163014    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe10df03-e254-4075-9487-78370bdbdf87" containerName="oc"
Mar 20 17:12:00 crc kubenswrapper[4730]: E0320 17:12:00.163123    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="241b7a69-ba82-4fbd-afd9-edc9cab27f9d" containerName="registry-server"
Mar 20 17:12:00 crc kubenswrapper[4730]: I0320 17:12:00.163220    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="241b7a69-ba82-4fbd-afd9-edc9cab27f9d" containerName="registry-server"
Mar 20 17:12:00 crc kubenswrapper[4730]: I0320 17:12:00.163697    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d" containerName="copy"
Mar 20 17:12:00 crc kubenswrapper[4730]: I0320 17:12:00.163836    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e1ddcc0-c7cc-4670-9679-bd5ccf32f30d" containerName="gather"
Mar 20 17:12:00 crc kubenswrapper[4730]: I0320 17:12:00.163964    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="241b7a69-ba82-4fbd-afd9-edc9cab27f9d" containerName="registry-server"
Mar 20 17:12:00 crc kubenswrapper[4730]: I0320 17:12:00.164076    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe10df03-e254-4075-9487-78370bdbdf87" containerName="oc"
Mar 20 17:12:00 crc kubenswrapper[4730]: I0320 17:12:00.165530    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567112-ft78g"
Mar 20 17:12:00 crc kubenswrapper[4730]: I0320 17:12:00.169414    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 17:12:00 crc kubenswrapper[4730]: I0320 17:12:00.169501    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 17:12:00 crc kubenswrapper[4730]: I0320 17:12:00.170202    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 17:12:00 crc kubenswrapper[4730]: I0320 17:12:00.173564    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567112-ft78g"]
Mar 20 17:12:00 crc kubenswrapper[4730]: I0320 17:12:00.264327    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvpv6\" (UniqueName: \"kubernetes.io/projected/2b24c66b-fc08-4f63-8ac8-15d11de2b672-kube-api-access-bvpv6\") pod \"auto-csr-approver-29567112-ft78g\" (UID: \"2b24c66b-fc08-4f63-8ac8-15d11de2b672\") " pod="openshift-infra/auto-csr-approver-29567112-ft78g"
Mar 20 17:12:00 crc kubenswrapper[4730]: I0320 17:12:00.366907    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvpv6\" (UniqueName: \"kubernetes.io/projected/2b24c66b-fc08-4f63-8ac8-15d11de2b672-kube-api-access-bvpv6\") pod \"auto-csr-approver-29567112-ft78g\" (UID: \"2b24c66b-fc08-4f63-8ac8-15d11de2b672\") " pod="openshift-infra/auto-csr-approver-29567112-ft78g"
Mar 20 17:12:00 crc kubenswrapper[4730]: I0320 17:12:00.392551    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvpv6\" (UniqueName: \"kubernetes.io/projected/2b24c66b-fc08-4f63-8ac8-15d11de2b672-kube-api-access-bvpv6\") pod \"auto-csr-approver-29567112-ft78g\" (UID: \"2b24c66b-fc08-4f63-8ac8-15d11de2b672\") " pod="openshift-infra/auto-csr-approver-29567112-ft78g"
Mar 20 17:12:00 crc kubenswrapper[4730]: I0320 17:12:00.494469    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567112-ft78g"
Mar 20 17:12:00 crc kubenswrapper[4730]: I0320 17:12:00.967970    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567112-ft78g"]
Mar 20 17:12:01 crc kubenswrapper[4730]: I0320 17:12:01.339582    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567112-ft78g" event={"ID":"2b24c66b-fc08-4f63-8ac8-15d11de2b672","Type":"ContainerStarted","Data":"80ccaaf3b45a0cc48425f3e70c0a6f860a18b1e69a52e75f4d16543faafe4710"}
Mar 20 17:12:03 crc kubenswrapper[4730]: I0320 17:12:03.364414    4730 generic.go:334] "Generic (PLEG): container finished" podID="2b24c66b-fc08-4f63-8ac8-15d11de2b672" containerID="443bf939ac2c50184710f05da0153e5a7905637d2ddf41d7e108e98cb3d443fe" exitCode=0
Mar 20 17:12:03 crc kubenswrapper[4730]: I0320 17:12:03.364471    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567112-ft78g" event={"ID":"2b24c66b-fc08-4f63-8ac8-15d11de2b672","Type":"ContainerDied","Data":"443bf939ac2c50184710f05da0153e5a7905637d2ddf41d7e108e98cb3d443fe"}
Mar 20 17:12:04 crc kubenswrapper[4730]: I0320 17:12:04.679171    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567112-ft78g"
Mar 20 17:12:04 crc kubenswrapper[4730]: I0320 17:12:04.764108    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvpv6\" (UniqueName: \"kubernetes.io/projected/2b24c66b-fc08-4f63-8ac8-15d11de2b672-kube-api-access-bvpv6\") pod \"2b24c66b-fc08-4f63-8ac8-15d11de2b672\" (UID: \"2b24c66b-fc08-4f63-8ac8-15d11de2b672\") "
Mar 20 17:12:04 crc kubenswrapper[4730]: I0320 17:12:04.772325    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b24c66b-fc08-4f63-8ac8-15d11de2b672-kube-api-access-bvpv6" (OuterVolumeSpecName: "kube-api-access-bvpv6") pod "2b24c66b-fc08-4f63-8ac8-15d11de2b672" (UID: "2b24c66b-fc08-4f63-8ac8-15d11de2b672"). InnerVolumeSpecName "kube-api-access-bvpv6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:12:04 crc kubenswrapper[4730]: I0320 17:12:04.866577    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvpv6\" (UniqueName: \"kubernetes.io/projected/2b24c66b-fc08-4f63-8ac8-15d11de2b672-kube-api-access-bvpv6\") on node \"crc\" DevicePath \"\""
Mar 20 17:12:05 crc kubenswrapper[4730]: I0320 17:12:05.383626    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567112-ft78g" event={"ID":"2b24c66b-fc08-4f63-8ac8-15d11de2b672","Type":"ContainerDied","Data":"80ccaaf3b45a0cc48425f3e70c0a6f860a18b1e69a52e75f4d16543faafe4710"}
Mar 20 17:12:05 crc kubenswrapper[4730]: I0320 17:12:05.383666    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80ccaaf3b45a0cc48425f3e70c0a6f860a18b1e69a52e75f4d16543faafe4710"
Mar 20 17:12:05 crc kubenswrapper[4730]: I0320 17:12:05.383699    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567112-ft78g"
Mar 20 17:12:05 crc kubenswrapper[4730]: I0320 17:12:05.751516    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567106-pg9p8"]
Mar 20 17:12:05 crc kubenswrapper[4730]: I0320 17:12:05.763070    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567106-pg9p8"]
Mar 20 17:12:06 crc kubenswrapper[4730]: I0320 17:12:06.532863    4730 scope.go:117] "RemoveContainer" containerID="581aa02672219ffcaf2185db619f24c1c4e9f4dc71ab9a090d63ff524fc5bbca"
Mar 20 17:12:06 crc kubenswrapper[4730]: E0320 17:12:06.533283    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 17:12:07 crc kubenswrapper[4730]: I0320 17:12:07.543908    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9695c6e8-6be3-4465-95a1-887c6a568fb7" path="/var/lib/kubelet/pods/9695c6e8-6be3-4465-95a1-887c6a568fb7/volumes"
Mar 20 17:12:21 crc kubenswrapper[4730]: I0320 17:12:21.551754    4730 scope.go:117] "RemoveContainer" containerID="581aa02672219ffcaf2185db619f24c1c4e9f4dc71ab9a090d63ff524fc5bbca"
Mar 20 17:12:21 crc kubenswrapper[4730]: E0320 17:12:21.553196    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 17:12:32 crc kubenswrapper[4730]: I0320 17:12:32.533873    4730 scope.go:117] "RemoveContainer" containerID="581aa02672219ffcaf2185db619f24c1c4e9f4dc71ab9a090d63ff524fc5bbca"
Mar 20 17:12:32 crc kubenswrapper[4730]: E0320 17:12:32.534915    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 17:12:47 crc kubenswrapper[4730]: I0320 17:12:47.533704    4730 scope.go:117] "RemoveContainer" containerID="581aa02672219ffcaf2185db619f24c1c4e9f4dc71ab9a090d63ff524fc5bbca"
Mar 20 17:12:47 crc kubenswrapper[4730]: E0320 17:12:47.534621    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 17:12:55 crc kubenswrapper[4730]: I0320 17:12:55.360522    4730 scope.go:117] "RemoveContainer" containerID="e24f74530672b4126d2aef0eaec17c584c50b3452f9280f1c1dd7481992b500e"
Mar 20 17:12:59 crc kubenswrapper[4730]: I0320 17:12:59.533514    4730 scope.go:117] "RemoveContainer" containerID="581aa02672219ffcaf2185db619f24c1c4e9f4dc71ab9a090d63ff524fc5bbca"
Mar 20 17:12:59 crc kubenswrapper[4730]: E0320 17:12:59.534352    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 17:13:13 crc kubenswrapper[4730]: I0320 17:13:13.535043    4730 scope.go:117] "RemoveContainer" containerID="581aa02672219ffcaf2185db619f24c1c4e9f4dc71ab9a090d63ff524fc5bbca"
Mar 20 17:13:13 crc kubenswrapper[4730]: E0320 17:13:13.536496    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 17:13:24 crc kubenswrapper[4730]: I0320 17:13:24.533680    4730 scope.go:117] "RemoveContainer" containerID="581aa02672219ffcaf2185db619f24c1c4e9f4dc71ab9a090d63ff524fc5bbca"
Mar 20 17:13:24 crc kubenswrapper[4730]: E0320 17:13:24.534782    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 17:13:36 crc kubenswrapper[4730]: I0320 17:13:36.533767    4730 scope.go:117] "RemoveContainer" containerID="581aa02672219ffcaf2185db619f24c1c4e9f4dc71ab9a090d63ff524fc5bbca"
Mar 20 17:13:36 crc kubenswrapper[4730]: E0320 17:13:36.534752    4730 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p5qvf_openshift-machine-config-operator(7fcd3db3-55f1-4c23-8fa9-78844495cea3)\"" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3"
Mar 20 17:13:51 crc kubenswrapper[4730]: I0320 17:13:51.540112    4730 scope.go:117] "RemoveContainer" containerID="581aa02672219ffcaf2185db619f24c1c4e9f4dc71ab9a090d63ff524fc5bbca"
Mar 20 17:13:51 crc kubenswrapper[4730]: I0320 17:13:51.844960    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" event={"ID":"7fcd3db3-55f1-4c23-8fa9-78844495cea3","Type":"ContainerStarted","Data":"5e057797e6cbf54310c6ad4bc172547f84470f218b7fb0b50a4e6f0a9d3a806d"}
Mar 20 17:14:00 crc kubenswrapper[4730]: I0320 17:14:00.151867    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567114-j8jvd"]
Mar 20 17:14:00 crc kubenswrapper[4730]: E0320 17:14:00.152791    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b24c66b-fc08-4f63-8ac8-15d11de2b672" containerName="oc"
Mar 20 17:14:00 crc kubenswrapper[4730]: I0320 17:14:00.152802    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b24c66b-fc08-4f63-8ac8-15d11de2b672" containerName="oc"
Mar 20 17:14:00 crc kubenswrapper[4730]: I0320 17:14:00.153004    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b24c66b-fc08-4f63-8ac8-15d11de2b672" containerName="oc"
Mar 20 17:14:00 crc kubenswrapper[4730]: I0320 17:14:00.153679    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567114-j8jvd"
Mar 20 17:14:00 crc kubenswrapper[4730]: I0320 17:14:00.156789    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 17:14:00 crc kubenswrapper[4730]: I0320 17:14:00.157336    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 17:14:00 crc kubenswrapper[4730]: I0320 17:14:00.157511    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 17:14:00 crc kubenswrapper[4730]: I0320 17:14:00.168034    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567114-j8jvd"]
Mar 20 17:14:00 crc kubenswrapper[4730]: I0320 17:14:00.302366    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc7d7\" (UniqueName: \"kubernetes.io/projected/e7911488-99e9-4ca9-a033-cc7507544155-kube-api-access-xc7d7\") pod \"auto-csr-approver-29567114-j8jvd\" (UID: \"e7911488-99e9-4ca9-a033-cc7507544155\") " pod="openshift-infra/auto-csr-approver-29567114-j8jvd"
Mar 20 17:14:00 crc kubenswrapper[4730]: I0320 17:14:00.404810    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc7d7\" (UniqueName: \"kubernetes.io/projected/e7911488-99e9-4ca9-a033-cc7507544155-kube-api-access-xc7d7\") pod \"auto-csr-approver-29567114-j8jvd\" (UID: \"e7911488-99e9-4ca9-a033-cc7507544155\") " pod="openshift-infra/auto-csr-approver-29567114-j8jvd"
Mar 20 17:14:00 crc kubenswrapper[4730]: I0320 17:14:00.427946    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc7d7\" (UniqueName: \"kubernetes.io/projected/e7911488-99e9-4ca9-a033-cc7507544155-kube-api-access-xc7d7\") pod \"auto-csr-approver-29567114-j8jvd\" (UID: \"e7911488-99e9-4ca9-a033-cc7507544155\") " pod="openshift-infra/auto-csr-approver-29567114-j8jvd"
Mar 20 17:14:00 crc kubenswrapper[4730]: I0320 17:14:00.472907    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567114-j8jvd"
Mar 20 17:14:01 crc kubenswrapper[4730]: I0320 17:14:01.017486    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567114-j8jvd"]
Mar 20 17:14:01 crc kubenswrapper[4730]: I0320 17:14:01.944407    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567114-j8jvd" event={"ID":"e7911488-99e9-4ca9-a033-cc7507544155","Type":"ContainerStarted","Data":"621449cee3581063df8522013ff4f30f2763ddb83fcdb9ac2fa11df0ff7d2a6f"}
Mar 20 17:14:02 crc kubenswrapper[4730]: I0320 17:14:02.955796    4730 generic.go:334] "Generic (PLEG): container finished" podID="e7911488-99e9-4ca9-a033-cc7507544155" containerID="b12b0d9a794aa0785dabcb98dfaf9c1c155b995ea23715f6c0f0aaa1f9e1a841" exitCode=0
Mar 20 17:14:02 crc kubenswrapper[4730]: I0320 17:14:02.955901    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567114-j8jvd" event={"ID":"e7911488-99e9-4ca9-a033-cc7507544155","Type":"ContainerDied","Data":"b12b0d9a794aa0785dabcb98dfaf9c1c155b995ea23715f6c0f0aaa1f9e1a841"}
Mar 20 17:14:04 crc kubenswrapper[4730]: I0320 17:14:04.341515    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567114-j8jvd"
Mar 20 17:14:04 crc kubenswrapper[4730]: I0320 17:14:04.524439    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc7d7\" (UniqueName: \"kubernetes.io/projected/e7911488-99e9-4ca9-a033-cc7507544155-kube-api-access-xc7d7\") pod \"e7911488-99e9-4ca9-a033-cc7507544155\" (UID: \"e7911488-99e9-4ca9-a033-cc7507544155\") "
Mar 20 17:14:04 crc kubenswrapper[4730]: I0320 17:14:04.539790    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7911488-99e9-4ca9-a033-cc7507544155-kube-api-access-xc7d7" (OuterVolumeSpecName: "kube-api-access-xc7d7") pod "e7911488-99e9-4ca9-a033-cc7507544155" (UID: "e7911488-99e9-4ca9-a033-cc7507544155"). InnerVolumeSpecName "kube-api-access-xc7d7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:14:04 crc kubenswrapper[4730]: I0320 17:14:04.627530    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc7d7\" (UniqueName: \"kubernetes.io/projected/e7911488-99e9-4ca9-a033-cc7507544155-kube-api-access-xc7d7\") on node \"crc\" DevicePath \"\""
Mar 20 17:14:04 crc kubenswrapper[4730]: I0320 17:14:04.975410    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567114-j8jvd" event={"ID":"e7911488-99e9-4ca9-a033-cc7507544155","Type":"ContainerDied","Data":"621449cee3581063df8522013ff4f30f2763ddb83fcdb9ac2fa11df0ff7d2a6f"}
Mar 20 17:14:04 crc kubenswrapper[4730]: I0320 17:14:04.975456    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="621449cee3581063df8522013ff4f30f2763ddb83fcdb9ac2fa11df0ff7d2a6f"
Mar 20 17:14:04 crc kubenswrapper[4730]: I0320 17:14:04.975472    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567114-j8jvd"
Mar 20 17:14:05 crc kubenswrapper[4730]: I0320 17:14:05.420330    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567108-zftgj"]
Mar 20 17:14:05 crc kubenswrapper[4730]: I0320 17:14:05.430786    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567108-zftgj"]
Mar 20 17:14:05 crc kubenswrapper[4730]: I0320 17:14:05.552374    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee25ae29-2b59-43fa-bee7-ff759f2b962d" path="/var/lib/kubelet/pods/ee25ae29-2b59-43fa-bee7-ff759f2b962d/volumes"
Mar 20 17:14:55 crc kubenswrapper[4730]: I0320 17:14:55.787962    4730 scope.go:117] "RemoveContainer" containerID="b89905d33f72768b38a357a1f6d9426d8d5d00caccec7854337ced1d8a3cac16"
Mar 20 17:15:00 crc kubenswrapper[4730]: I0320 17:15:00.171129    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567115-h44j7"]
Mar 20 17:15:00 crc kubenswrapper[4730]: E0320 17:15:00.172400    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7911488-99e9-4ca9-a033-cc7507544155" containerName="oc"
Mar 20 17:15:00 crc kubenswrapper[4730]: I0320 17:15:00.172421    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7911488-99e9-4ca9-a033-cc7507544155" containerName="oc"
Mar 20 17:15:00 crc kubenswrapper[4730]: I0320 17:15:00.172777    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7911488-99e9-4ca9-a033-cc7507544155" containerName="oc"
Mar 20 17:15:00 crc kubenswrapper[4730]: I0320 17:15:00.174009    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-h44j7"
Mar 20 17:15:00 crc kubenswrapper[4730]: I0320 17:15:00.177136    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 20 17:15:00 crc kubenswrapper[4730]: I0320 17:15:00.186663    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567115-h44j7"]
Mar 20 17:15:00 crc kubenswrapper[4730]: I0320 17:15:00.199399    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 20 17:15:00 crc kubenswrapper[4730]: I0320 17:15:00.337358    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c39e4d8b-b591-43ff-b748-b42cde14f3b4-config-volume\") pod \"collect-profiles-29567115-h44j7\" (UID: \"c39e4d8b-b591-43ff-b748-b42cde14f3b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-h44j7"
Mar 20 17:15:00 crc kubenswrapper[4730]: I0320 17:15:00.337722    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s555p\" (UniqueName: \"kubernetes.io/projected/c39e4d8b-b591-43ff-b748-b42cde14f3b4-kube-api-access-s555p\") pod \"collect-profiles-29567115-h44j7\" (UID: \"c39e4d8b-b591-43ff-b748-b42cde14f3b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-h44j7"
Mar 20 17:15:00 crc kubenswrapper[4730]: I0320 17:15:00.337822    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c39e4d8b-b591-43ff-b748-b42cde14f3b4-secret-volume\") pod \"collect-profiles-29567115-h44j7\" (UID: \"c39e4d8b-b591-43ff-b748-b42cde14f3b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-h44j7"
Mar 20 17:15:00 crc kubenswrapper[4730]: I0320 17:15:00.439958    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s555p\" (UniqueName: \"kubernetes.io/projected/c39e4d8b-b591-43ff-b748-b42cde14f3b4-kube-api-access-s555p\") pod \"collect-profiles-29567115-h44j7\" (UID: \"c39e4d8b-b591-43ff-b748-b42cde14f3b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-h44j7"
Mar 20 17:15:00 crc kubenswrapper[4730]: I0320 17:15:00.440391    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c39e4d8b-b591-43ff-b748-b42cde14f3b4-secret-volume\") pod \"collect-profiles-29567115-h44j7\" (UID: \"c39e4d8b-b591-43ff-b748-b42cde14f3b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-h44j7"
Mar 20 17:15:00 crc kubenswrapper[4730]: I0320 17:15:00.440573    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c39e4d8b-b591-43ff-b748-b42cde14f3b4-config-volume\") pod \"collect-profiles-29567115-h44j7\" (UID: \"c39e4d8b-b591-43ff-b748-b42cde14f3b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-h44j7"
Mar 20 17:15:00 crc kubenswrapper[4730]: I0320 17:15:00.441817    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c39e4d8b-b591-43ff-b748-b42cde14f3b4-config-volume\") pod \"collect-profiles-29567115-h44j7\" (UID: \"c39e4d8b-b591-43ff-b748-b42cde14f3b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-h44j7"
Mar 20 17:15:00 crc kubenswrapper[4730]: I0320 17:15:00.451653    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c39e4d8b-b591-43ff-b748-b42cde14f3b4-secret-volume\") pod \"collect-profiles-29567115-h44j7\" (UID: \"c39e4d8b-b591-43ff-b748-b42cde14f3b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-h44j7"
Mar 20 17:15:00 crc kubenswrapper[4730]: I0320 17:15:00.479672    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s555p\" (UniqueName: \"kubernetes.io/projected/c39e4d8b-b591-43ff-b748-b42cde14f3b4-kube-api-access-s555p\") pod \"collect-profiles-29567115-h44j7\" (UID: \"c39e4d8b-b591-43ff-b748-b42cde14f3b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-h44j7"
Mar 20 17:15:00 crc kubenswrapper[4730]: I0320 17:15:00.506793    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-h44j7"
Mar 20 17:15:01 crc kubenswrapper[4730]: I0320 17:15:01.020507    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567115-h44j7"]
Mar 20 17:15:01 crc kubenswrapper[4730]: I0320 17:15:01.621860    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-h44j7" event={"ID":"c39e4d8b-b591-43ff-b748-b42cde14f3b4","Type":"ContainerStarted","Data":"d876fdd4508e5916e191354e6969b5440f41eff8f94aa2d13d1bc608ceaf31ba"}
Mar 20 17:15:01 crc kubenswrapper[4730]: I0320 17:15:01.622471    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-h44j7" event={"ID":"c39e4d8b-b591-43ff-b748-b42cde14f3b4","Type":"ContainerStarted","Data":"2c4a8eb757e88191b0c1c62006784189fc622053dff0c7d3f1a8b5fd8f3c2e29"}
Mar 20 17:15:01 crc kubenswrapper[4730]: I0320 17:15:01.640622    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-h44j7" podStartSLOduration=1.640605579 podStartE2EDuration="1.640605579s" podCreationTimestamp="2026-03-20 17:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:15:01.637791479 +0000 UTC m=+5760.851162888" watchObservedRunningTime="2026-03-20 17:15:01.640605579 +0000 UTC m=+5760.853976948"
Mar 20 17:15:02 crc kubenswrapper[4730]: I0320 17:15:02.639212    4730 generic.go:334] "Generic (PLEG): container finished" podID="c39e4d8b-b591-43ff-b748-b42cde14f3b4" containerID="d876fdd4508e5916e191354e6969b5440f41eff8f94aa2d13d1bc608ceaf31ba" exitCode=0
Mar 20 17:15:02 crc kubenswrapper[4730]: I0320 17:15:02.639283    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-h44j7" event={"ID":"c39e4d8b-b591-43ff-b748-b42cde14f3b4","Type":"ContainerDied","Data":"d876fdd4508e5916e191354e6969b5440f41eff8f94aa2d13d1bc608ceaf31ba"}
Mar 20 17:15:04 crc kubenswrapper[4730]: I0320 17:15:04.072934    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-h44j7"
Mar 20 17:15:04 crc kubenswrapper[4730]: I0320 17:15:04.134984    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s555p\" (UniqueName: \"kubernetes.io/projected/c39e4d8b-b591-43ff-b748-b42cde14f3b4-kube-api-access-s555p\") pod \"c39e4d8b-b591-43ff-b748-b42cde14f3b4\" (UID: \"c39e4d8b-b591-43ff-b748-b42cde14f3b4\") "
Mar 20 17:15:04 crc kubenswrapper[4730]: I0320 17:15:04.135045    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c39e4d8b-b591-43ff-b748-b42cde14f3b4-secret-volume\") pod \"c39e4d8b-b591-43ff-b748-b42cde14f3b4\" (UID: \"c39e4d8b-b591-43ff-b748-b42cde14f3b4\") "
Mar 20 17:15:04 crc kubenswrapper[4730]: I0320 17:15:04.135234    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c39e4d8b-b591-43ff-b748-b42cde14f3b4-config-volume\") pod \"c39e4d8b-b591-43ff-b748-b42cde14f3b4\" (UID: \"c39e4d8b-b591-43ff-b748-b42cde14f3b4\") "
Mar 20 17:15:04 crc kubenswrapper[4730]: I0320 17:15:04.136595    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c39e4d8b-b591-43ff-b748-b42cde14f3b4-config-volume" (OuterVolumeSpecName: "config-volume") pod "c39e4d8b-b591-43ff-b748-b42cde14f3b4" (UID: "c39e4d8b-b591-43ff-b748-b42cde14f3b4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:15:04 crc kubenswrapper[4730]: I0320 17:15:04.142135    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c39e4d8b-b591-43ff-b748-b42cde14f3b4-kube-api-access-s555p" (OuterVolumeSpecName: "kube-api-access-s555p") pod "c39e4d8b-b591-43ff-b748-b42cde14f3b4" (UID: "c39e4d8b-b591-43ff-b748-b42cde14f3b4"). InnerVolumeSpecName "kube-api-access-s555p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:15:04 crc kubenswrapper[4730]: I0320 17:15:04.150863    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c39e4d8b-b591-43ff-b748-b42cde14f3b4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c39e4d8b-b591-43ff-b748-b42cde14f3b4" (UID: "c39e4d8b-b591-43ff-b748-b42cde14f3b4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:15:04 crc kubenswrapper[4730]: I0320 17:15:04.236616    4730 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c39e4d8b-b591-43ff-b748-b42cde14f3b4-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 17:15:04 crc kubenswrapper[4730]: I0320 17:15:04.236648    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s555p\" (UniqueName: \"kubernetes.io/projected/c39e4d8b-b591-43ff-b748-b42cde14f3b4-kube-api-access-s555p\") on node \"crc\" DevicePath \"\""
Mar 20 17:15:04 crc kubenswrapper[4730]: I0320 17:15:04.236660    4730 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c39e4d8b-b591-43ff-b748-b42cde14f3b4-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 20 17:15:04 crc kubenswrapper[4730]: I0320 17:15:04.686239    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567070-5g4gp"]
Mar 20 17:15:04 crc kubenswrapper[4730]: I0320 17:15:04.696646    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-h44j7" event={"ID":"c39e4d8b-b591-43ff-b748-b42cde14f3b4","Type":"ContainerDied","Data":"2c4a8eb757e88191b0c1c62006784189fc622053dff0c7d3f1a8b5fd8f3c2e29"}
Mar 20 17:15:04 crc kubenswrapper[4730]: I0320 17:15:04.696690    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c4a8eb757e88191b0c1c62006784189fc622053dff0c7d3f1a8b5fd8f3c2e29"
Mar 20 17:15:04 crc kubenswrapper[4730]: I0320 17:15:04.696777    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-h44j7"
Mar 20 17:15:04 crc kubenswrapper[4730]: I0320 17:15:04.701265    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567070-5g4gp"]
Mar 20 17:15:05 crc kubenswrapper[4730]: I0320 17:15:05.550278    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c86d92cc-d42e-496f-b31c-d6c56fb441c7" path="/var/lib/kubelet/pods/c86d92cc-d42e-496f-b31c-d6c56fb441c7/volumes"
Mar 20 17:15:35 crc kubenswrapper[4730]: I0320 17:15:35.153805    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bxzjx"]
Mar 20 17:15:35 crc kubenswrapper[4730]: E0320 17:15:35.155725    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c39e4d8b-b591-43ff-b748-b42cde14f3b4" containerName="collect-profiles"
Mar 20 17:15:35 crc kubenswrapper[4730]: I0320 17:15:35.155751    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="c39e4d8b-b591-43ff-b748-b42cde14f3b4" containerName="collect-profiles"
Mar 20 17:15:35 crc kubenswrapper[4730]: I0320 17:15:35.156357    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="c39e4d8b-b591-43ff-b748-b42cde14f3b4" containerName="collect-profiles"
Mar 20 17:15:35 crc kubenswrapper[4730]: I0320 17:15:35.159148    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bxzjx"
Mar 20 17:15:35 crc kubenswrapper[4730]: I0320 17:15:35.167811    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bxzjx"]
Mar 20 17:15:35 crc kubenswrapper[4730]: I0320 17:15:35.255060    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/791ba4a6-f2b3-4689-9e71-0de25f4604d0-catalog-content\") pod \"community-operators-bxzjx\" (UID: \"791ba4a6-f2b3-4689-9e71-0de25f4604d0\") " pod="openshift-marketplace/community-operators-bxzjx"
Mar 20 17:15:35 crc kubenswrapper[4730]: I0320 17:15:35.255198    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62hrp\" (UniqueName: \"kubernetes.io/projected/791ba4a6-f2b3-4689-9e71-0de25f4604d0-kube-api-access-62hrp\") pod \"community-operators-bxzjx\" (UID: \"791ba4a6-f2b3-4689-9e71-0de25f4604d0\") " pod="openshift-marketplace/community-operators-bxzjx"
Mar 20 17:15:35 crc kubenswrapper[4730]: I0320 17:15:35.255322    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/791ba4a6-f2b3-4689-9e71-0de25f4604d0-utilities\") pod \"community-operators-bxzjx\" (UID: \"791ba4a6-f2b3-4689-9e71-0de25f4604d0\") " pod="openshift-marketplace/community-operators-bxzjx"
Mar 20 17:15:35 crc kubenswrapper[4730]: I0320 17:15:35.357709    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/791ba4a6-f2b3-4689-9e71-0de25f4604d0-catalog-content\") pod \"community-operators-bxzjx\" (UID: \"791ba4a6-f2b3-4689-9e71-0de25f4604d0\") " pod="openshift-marketplace/community-operators-bxzjx"
Mar 20 17:15:35 crc kubenswrapper[4730]: I0320 17:15:35.357805    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62hrp\" (UniqueName: \"kubernetes.io/projected/791ba4a6-f2b3-4689-9e71-0de25f4604d0-kube-api-access-62hrp\") pod \"community-operators-bxzjx\" (UID: \"791ba4a6-f2b3-4689-9e71-0de25f4604d0\") " pod="openshift-marketplace/community-operators-bxzjx"
Mar 20 17:15:35 crc kubenswrapper[4730]: I0320 17:15:35.357884    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/791ba4a6-f2b3-4689-9e71-0de25f4604d0-utilities\") pod \"community-operators-bxzjx\" (UID: \"791ba4a6-f2b3-4689-9e71-0de25f4604d0\") " pod="openshift-marketplace/community-operators-bxzjx"
Mar 20 17:15:35 crc kubenswrapper[4730]: I0320 17:15:35.358496    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/791ba4a6-f2b3-4689-9e71-0de25f4604d0-utilities\") pod \"community-operators-bxzjx\" (UID: \"791ba4a6-f2b3-4689-9e71-0de25f4604d0\") " pod="openshift-marketplace/community-operators-bxzjx"
Mar 20 17:15:35 crc kubenswrapper[4730]: I0320 17:15:35.358791    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/791ba4a6-f2b3-4689-9e71-0de25f4604d0-catalog-content\") pod \"community-operators-bxzjx\" (UID: \"791ba4a6-f2b3-4689-9e71-0de25f4604d0\") " pod="openshift-marketplace/community-operators-bxzjx"
Mar 20 17:15:35 crc kubenswrapper[4730]: I0320 17:15:35.387895    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62hrp\" (UniqueName: \"kubernetes.io/projected/791ba4a6-f2b3-4689-9e71-0de25f4604d0-kube-api-access-62hrp\") pod \"community-operators-bxzjx\" (UID: \"791ba4a6-f2b3-4689-9e71-0de25f4604d0\") " pod="openshift-marketplace/community-operators-bxzjx"
Mar 20 17:15:35 crc kubenswrapper[4730]: I0320 17:15:35.491891    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bxzjx"
Mar 20 17:15:36 crc kubenswrapper[4730]: I0320 17:15:36.051688    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bxzjx"]
Mar 20 17:15:36 crc kubenswrapper[4730]: I0320 17:15:36.060808    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxzjx" event={"ID":"791ba4a6-f2b3-4689-9e71-0de25f4604d0","Type":"ContainerStarted","Data":"b13f5df7dabac5d1b59a2b83dfc3ee8119163948122309b07d794df7c6658400"}
Mar 20 17:15:37 crc kubenswrapper[4730]: I0320 17:15:37.077314    4730 generic.go:334] "Generic (PLEG): container finished" podID="791ba4a6-f2b3-4689-9e71-0de25f4604d0" containerID="04e2ee9ec620bf90cddc552326a5dd3c11ecccc6463da10f015d8dca03fddc41" exitCode=0
Mar 20 17:15:37 crc kubenswrapper[4730]: I0320 17:15:37.077406    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxzjx" event={"ID":"791ba4a6-f2b3-4689-9e71-0de25f4604d0","Type":"ContainerDied","Data":"04e2ee9ec620bf90cddc552326a5dd3c11ecccc6463da10f015d8dca03fddc41"}
Mar 20 17:15:37 crc kubenswrapper[4730]: I0320 17:15:37.080227    4730 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 17:15:39 crc kubenswrapper[4730]: I0320 17:15:39.104770    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxzjx" event={"ID":"791ba4a6-f2b3-4689-9e71-0de25f4604d0","Type":"ContainerStarted","Data":"20aaa39386bf4bfad8806bfbe95e228235dfe7287dd2492b93a9e3dfc6dd2c17"}
Mar 20 17:15:40 crc kubenswrapper[4730]: I0320 17:15:40.116796    4730 generic.go:334] "Generic (PLEG): container finished" podID="791ba4a6-f2b3-4689-9e71-0de25f4604d0" containerID="20aaa39386bf4bfad8806bfbe95e228235dfe7287dd2492b93a9e3dfc6dd2c17" exitCode=0
Mar 20 17:15:40 crc kubenswrapper[4730]: I0320 17:15:40.116843    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxzjx" event={"ID":"791ba4a6-f2b3-4689-9e71-0de25f4604d0","Type":"ContainerDied","Data":"20aaa39386bf4bfad8806bfbe95e228235dfe7287dd2492b93a9e3dfc6dd2c17"}
Mar 20 17:15:41 crc kubenswrapper[4730]: I0320 17:15:41.127999    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxzjx" event={"ID":"791ba4a6-f2b3-4689-9e71-0de25f4604d0","Type":"ContainerStarted","Data":"0b90ef7a58192d8a356acbacd08222c1dfbd6fb9675fd51426cc6e2ebaeb31f1"}
Mar 20 17:15:41 crc kubenswrapper[4730]: I0320 17:15:41.165855    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bxzjx" podStartSLOduration=2.721705126 podStartE2EDuration="6.165836333s" podCreationTimestamp="2026-03-20 17:15:35 +0000 UTC" firstStartedPulling="2026-03-20 17:15:37.079851774 +0000 UTC m=+5796.293223173" lastFinishedPulling="2026-03-20 17:15:40.523982971 +0000 UTC m=+5799.737354380" observedRunningTime="2026-03-20 17:15:41.152573295 +0000 UTC m=+5800.365944674" watchObservedRunningTime="2026-03-20 17:15:41.165836333 +0000 UTC m=+5800.379207712"
Mar 20 17:15:45 crc kubenswrapper[4730]: I0320 17:15:45.492335    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bxzjx"
Mar 20 17:15:45 crc kubenswrapper[4730]: I0320 17:15:45.492974    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bxzjx"
Mar 20 17:15:45 crc kubenswrapper[4730]: I0320 17:15:45.573483    4730 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bxzjx"
Mar 20 17:15:46 crc kubenswrapper[4730]: I0320 17:15:46.258071    4730 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bxzjx"
Mar 20 17:15:46 crc kubenswrapper[4730]: I0320 17:15:46.322368    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bxzjx"]
Mar 20 17:15:48 crc kubenswrapper[4730]: I0320 17:15:48.208088    4730 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bxzjx" podUID="791ba4a6-f2b3-4689-9e71-0de25f4604d0" containerName="registry-server" containerID="cri-o://0b90ef7a58192d8a356acbacd08222c1dfbd6fb9675fd51426cc6e2ebaeb31f1" gracePeriod=2
Mar 20 17:15:48 crc kubenswrapper[4730]: I0320 17:15:48.974877    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bxzjx"
Mar 20 17:15:49 crc kubenswrapper[4730]: I0320 17:15:49.134477    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/791ba4a6-f2b3-4689-9e71-0de25f4604d0-utilities\") pod \"791ba4a6-f2b3-4689-9e71-0de25f4604d0\" (UID: \"791ba4a6-f2b3-4689-9e71-0de25f4604d0\") "
Mar 20 17:15:49 crc kubenswrapper[4730]: I0320 17:15:49.134597    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/791ba4a6-f2b3-4689-9e71-0de25f4604d0-catalog-content\") pod \"791ba4a6-f2b3-4689-9e71-0de25f4604d0\" (UID: \"791ba4a6-f2b3-4689-9e71-0de25f4604d0\") "
Mar 20 17:15:49 crc kubenswrapper[4730]: I0320 17:15:49.134802    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62hrp\" (UniqueName: \"kubernetes.io/projected/791ba4a6-f2b3-4689-9e71-0de25f4604d0-kube-api-access-62hrp\") pod \"791ba4a6-f2b3-4689-9e71-0de25f4604d0\" (UID: \"791ba4a6-f2b3-4689-9e71-0de25f4604d0\") "
Mar 20 17:15:49 crc kubenswrapper[4730]: I0320 17:15:49.135407    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/791ba4a6-f2b3-4689-9e71-0de25f4604d0-utilities" (OuterVolumeSpecName: "utilities") pod "791ba4a6-f2b3-4689-9e71-0de25f4604d0" (UID: "791ba4a6-f2b3-4689-9e71-0de25f4604d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:15:49 crc kubenswrapper[4730]: I0320 17:15:49.135613    4730 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/791ba4a6-f2b3-4689-9e71-0de25f4604d0-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 17:15:49 crc kubenswrapper[4730]: I0320 17:15:49.150595    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/791ba4a6-f2b3-4689-9e71-0de25f4604d0-kube-api-access-62hrp" (OuterVolumeSpecName: "kube-api-access-62hrp") pod "791ba4a6-f2b3-4689-9e71-0de25f4604d0" (UID: "791ba4a6-f2b3-4689-9e71-0de25f4604d0"). InnerVolumeSpecName "kube-api-access-62hrp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:15:49 crc kubenswrapper[4730]: I0320 17:15:49.189834    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/791ba4a6-f2b3-4689-9e71-0de25f4604d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "791ba4a6-f2b3-4689-9e71-0de25f4604d0" (UID: "791ba4a6-f2b3-4689-9e71-0de25f4604d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:15:49 crc kubenswrapper[4730]: I0320 17:15:49.229938    4730 generic.go:334] "Generic (PLEG): container finished" podID="791ba4a6-f2b3-4689-9e71-0de25f4604d0" containerID="0b90ef7a58192d8a356acbacd08222c1dfbd6fb9675fd51426cc6e2ebaeb31f1" exitCode=0
Mar 20 17:15:49 crc kubenswrapper[4730]: I0320 17:15:49.230005    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxzjx" event={"ID":"791ba4a6-f2b3-4689-9e71-0de25f4604d0","Type":"ContainerDied","Data":"0b90ef7a58192d8a356acbacd08222c1dfbd6fb9675fd51426cc6e2ebaeb31f1"}
Mar 20 17:15:49 crc kubenswrapper[4730]: I0320 17:15:49.230043    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxzjx" event={"ID":"791ba4a6-f2b3-4689-9e71-0de25f4604d0","Type":"ContainerDied","Data":"b13f5df7dabac5d1b59a2b83dfc3ee8119163948122309b07d794df7c6658400"}
Mar 20 17:15:49 crc kubenswrapper[4730]: I0320 17:15:49.230070    4730 scope.go:117] "RemoveContainer" containerID="0b90ef7a58192d8a356acbacd08222c1dfbd6fb9675fd51426cc6e2ebaeb31f1"
Mar 20 17:15:49 crc kubenswrapper[4730]: I0320 17:15:49.230418    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bxzjx"
Mar 20 17:15:49 crc kubenswrapper[4730]: I0320 17:15:49.237105    4730 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/791ba4a6-f2b3-4689-9e71-0de25f4604d0-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 17:15:49 crc kubenswrapper[4730]: I0320 17:15:49.237134    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62hrp\" (UniqueName: \"kubernetes.io/projected/791ba4a6-f2b3-4689-9e71-0de25f4604d0-kube-api-access-62hrp\") on node \"crc\" DevicePath \"\""
Mar 20 17:15:49 crc kubenswrapper[4730]: I0320 17:15:49.259522    4730 scope.go:117] "RemoveContainer" containerID="20aaa39386bf4bfad8806bfbe95e228235dfe7287dd2492b93a9e3dfc6dd2c17"
Mar 20 17:15:49 crc kubenswrapper[4730]: I0320 17:15:49.278120    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bxzjx"]
Mar 20 17:15:49 crc kubenswrapper[4730]: I0320 17:15:49.287075    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bxzjx"]
Mar 20 17:15:49 crc kubenswrapper[4730]: I0320 17:15:49.308200    4730 scope.go:117] "RemoveContainer" containerID="04e2ee9ec620bf90cddc552326a5dd3c11ecccc6463da10f015d8dca03fddc41"
Mar 20 17:15:49 crc kubenswrapper[4730]: I0320 17:15:49.342216    4730 scope.go:117] "RemoveContainer" containerID="0b90ef7a58192d8a356acbacd08222c1dfbd6fb9675fd51426cc6e2ebaeb31f1"
Mar 20 17:15:49 crc kubenswrapper[4730]: E0320 17:15:49.347738    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b90ef7a58192d8a356acbacd08222c1dfbd6fb9675fd51426cc6e2ebaeb31f1\": container with ID starting with 0b90ef7a58192d8a356acbacd08222c1dfbd6fb9675fd51426cc6e2ebaeb31f1 not found: ID does not exist" containerID="0b90ef7a58192d8a356acbacd08222c1dfbd6fb9675fd51426cc6e2ebaeb31f1"
Mar 20 17:15:49 crc kubenswrapper[4730]: I0320 17:15:49.347781    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b90ef7a58192d8a356acbacd08222c1dfbd6fb9675fd51426cc6e2ebaeb31f1"} err="failed to get container status \"0b90ef7a58192d8a356acbacd08222c1dfbd6fb9675fd51426cc6e2ebaeb31f1\": rpc error: code = NotFound desc = could not find container \"0b90ef7a58192d8a356acbacd08222c1dfbd6fb9675fd51426cc6e2ebaeb31f1\": container with ID starting with 0b90ef7a58192d8a356acbacd08222c1dfbd6fb9675fd51426cc6e2ebaeb31f1 not found: ID does not exist"
Mar 20 17:15:49 crc kubenswrapper[4730]: I0320 17:15:49.347808    4730 scope.go:117] "RemoveContainer" containerID="20aaa39386bf4bfad8806bfbe95e228235dfe7287dd2492b93a9e3dfc6dd2c17"
Mar 20 17:15:49 crc kubenswrapper[4730]: E0320 17:15:49.349851    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20aaa39386bf4bfad8806bfbe95e228235dfe7287dd2492b93a9e3dfc6dd2c17\": container with ID starting with 20aaa39386bf4bfad8806bfbe95e228235dfe7287dd2492b93a9e3dfc6dd2c17 not found: ID does not exist" containerID="20aaa39386bf4bfad8806bfbe95e228235dfe7287dd2492b93a9e3dfc6dd2c17"
Mar 20 17:15:49 crc kubenswrapper[4730]: I0320 17:15:49.349900    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20aaa39386bf4bfad8806bfbe95e228235dfe7287dd2492b93a9e3dfc6dd2c17"} err="failed to get container status \"20aaa39386bf4bfad8806bfbe95e228235dfe7287dd2492b93a9e3dfc6dd2c17\": rpc error: code = NotFound desc = could not find container \"20aaa39386bf4bfad8806bfbe95e228235dfe7287dd2492b93a9e3dfc6dd2c17\": container with ID starting with 20aaa39386bf4bfad8806bfbe95e228235dfe7287dd2492b93a9e3dfc6dd2c17 not found: ID does not exist"
Mar 20 17:15:49 crc kubenswrapper[4730]: I0320 17:15:49.349927    4730 scope.go:117] "RemoveContainer" containerID="04e2ee9ec620bf90cddc552326a5dd3c11ecccc6463da10f015d8dca03fddc41"
Mar 20 17:15:49 crc kubenswrapper[4730]: E0320 17:15:49.350376    4730 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04e2ee9ec620bf90cddc552326a5dd3c11ecccc6463da10f015d8dca03fddc41\": container with ID starting with 04e2ee9ec620bf90cddc552326a5dd3c11ecccc6463da10f015d8dca03fddc41 not found: ID does not exist" containerID="04e2ee9ec620bf90cddc552326a5dd3c11ecccc6463da10f015d8dca03fddc41"
Mar 20 17:15:49 crc kubenswrapper[4730]: I0320 17:15:49.350429    4730 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04e2ee9ec620bf90cddc552326a5dd3c11ecccc6463da10f015d8dca03fddc41"} err="failed to get container status \"04e2ee9ec620bf90cddc552326a5dd3c11ecccc6463da10f015d8dca03fddc41\": rpc error: code = NotFound desc = could not find container \"04e2ee9ec620bf90cddc552326a5dd3c11ecccc6463da10f015d8dca03fddc41\": container with ID starting with 04e2ee9ec620bf90cddc552326a5dd3c11ecccc6463da10f015d8dca03fddc41 not found: ID does not exist"
Mar 20 17:15:49 crc kubenswrapper[4730]: I0320 17:15:49.557290    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="791ba4a6-f2b3-4689-9e71-0de25f4604d0" path="/var/lib/kubelet/pods/791ba4a6-f2b3-4689-9e71-0de25f4604d0/volumes"
Mar 20 17:15:55 crc kubenswrapper[4730]: I0320 17:15:55.875210    4730 scope.go:117] "RemoveContainer" containerID="dc2c07b3766f06e0423270d40c09a7a028e4cbca82d59a060deedb7b5661816a"
Mar 20 17:16:00 crc kubenswrapper[4730]: I0320 17:16:00.176446    4730 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567116-j4glj"]
Mar 20 17:16:00 crc kubenswrapper[4730]: E0320 17:16:00.178624    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="791ba4a6-f2b3-4689-9e71-0de25f4604d0" containerName="registry-server"
Mar 20 17:16:00 crc kubenswrapper[4730]: I0320 17:16:00.178653    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="791ba4a6-f2b3-4689-9e71-0de25f4604d0" containerName="registry-server"
Mar 20 17:16:00 crc kubenswrapper[4730]: E0320 17:16:00.178704    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="791ba4a6-f2b3-4689-9e71-0de25f4604d0" containerName="extract-utilities"
Mar 20 17:16:00 crc kubenswrapper[4730]: I0320 17:16:00.178713    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="791ba4a6-f2b3-4689-9e71-0de25f4604d0" containerName="extract-utilities"
Mar 20 17:16:00 crc kubenswrapper[4730]: E0320 17:16:00.178728    4730 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="791ba4a6-f2b3-4689-9e71-0de25f4604d0" containerName="extract-content"
Mar 20 17:16:00 crc kubenswrapper[4730]: I0320 17:16:00.178744    4730 state_mem.go:107] "Deleted CPUSet assignment" podUID="791ba4a6-f2b3-4689-9e71-0de25f4604d0" containerName="extract-content"
Mar 20 17:16:00 crc kubenswrapper[4730]: I0320 17:16:00.178994    4730 memory_manager.go:354] "RemoveStaleState removing state" podUID="791ba4a6-f2b3-4689-9e71-0de25f4604d0" containerName="registry-server"
Mar 20 17:16:00 crc kubenswrapper[4730]: I0320 17:16:00.179762    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567116-j4glj"
Mar 20 17:16:00 crc kubenswrapper[4730]: I0320 17:16:00.182303    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 17:16:00 crc kubenswrapper[4730]: I0320 17:16:00.182644    4730 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 17:16:00 crc kubenswrapper[4730]: I0320 17:16:00.182896    4730 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vlqbl"
Mar 20 17:16:00 crc kubenswrapper[4730]: I0320 17:16:00.195724    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567116-j4glj"]
Mar 20 17:16:00 crc kubenswrapper[4730]: I0320 17:16:00.281833    4730 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt4cr\" (UniqueName: \"kubernetes.io/projected/c93b527b-9082-468d-8a7d-9c40023e92f4-kube-api-access-bt4cr\") pod \"auto-csr-approver-29567116-j4glj\" (UID: \"c93b527b-9082-468d-8a7d-9c40023e92f4\") " pod="openshift-infra/auto-csr-approver-29567116-j4glj"
Mar 20 17:16:00 crc kubenswrapper[4730]: I0320 17:16:00.384581    4730 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt4cr\" (UniqueName: \"kubernetes.io/projected/c93b527b-9082-468d-8a7d-9c40023e92f4-kube-api-access-bt4cr\") pod \"auto-csr-approver-29567116-j4glj\" (UID: \"c93b527b-9082-468d-8a7d-9c40023e92f4\") " pod="openshift-infra/auto-csr-approver-29567116-j4glj"
Mar 20 17:16:00 crc kubenswrapper[4730]: I0320 17:16:00.414601    4730 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt4cr\" (UniqueName: \"kubernetes.io/projected/c93b527b-9082-468d-8a7d-9c40023e92f4-kube-api-access-bt4cr\") pod \"auto-csr-approver-29567116-j4glj\" (UID: \"c93b527b-9082-468d-8a7d-9c40023e92f4\") " pod="openshift-infra/auto-csr-approver-29567116-j4glj"
Mar 20 17:16:00 crc kubenswrapper[4730]: I0320 17:16:00.507447    4730 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567116-j4glj"
Mar 20 17:16:00 crc kubenswrapper[4730]: I0320 17:16:00.831236    4730 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567116-j4glj"]
Mar 20 17:16:00 crc kubenswrapper[4730]: W0320 17:16:00.841538    4730 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc93b527b_9082_468d_8a7d_9c40023e92f4.slice/crio-22953afd2d795cdc6d80eed1f4ba1a4fdf6c8530363cff9b77f1fe7cfdc698fc WatchSource:0}: Error finding container 22953afd2d795cdc6d80eed1f4ba1a4fdf6c8530363cff9b77f1fe7cfdc698fc: Status 404 returned error can't find the container with id 22953afd2d795cdc6d80eed1f4ba1a4fdf6c8530363cff9b77f1fe7cfdc698fc
Mar 20 17:16:01 crc kubenswrapper[4730]: I0320 17:16:01.373730    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567116-j4glj" event={"ID":"c93b527b-9082-468d-8a7d-9c40023e92f4","Type":"ContainerStarted","Data":"22953afd2d795cdc6d80eed1f4ba1a4fdf6c8530363cff9b77f1fe7cfdc698fc"}
Mar 20 17:16:02 crc kubenswrapper[4730]: I0320 17:16:02.385685    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567116-j4glj" event={"ID":"c93b527b-9082-468d-8a7d-9c40023e92f4","Type":"ContainerStarted","Data":"4d43c7d4bdb98f9e776981a409e6b423d12e3f46827a144b8d0da0bc996bfc87"}
Mar 20 17:16:02 crc kubenswrapper[4730]: I0320 17:16:02.420774    4730 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567116-j4glj" podStartSLOduration=1.4968243270000001 podStartE2EDuration="2.420752116s" podCreationTimestamp="2026-03-20 17:16:00 +0000 UTC" firstStartedPulling="2026-03-20 17:16:00.845447844 +0000 UTC m=+5820.058819213" lastFinishedPulling="2026-03-20 17:16:01.769375593 +0000 UTC m=+5820.982747002" observedRunningTime="2026-03-20 17:16:02.408158297 +0000 UTC m=+5821.621529696" watchObservedRunningTime="2026-03-20 17:16:02.420752116 +0000 UTC m=+5821.634123485"
Mar 20 17:16:03 crc kubenswrapper[4730]: I0320 17:16:03.397766    4730 generic.go:334] "Generic (PLEG): container finished" podID="c93b527b-9082-468d-8a7d-9c40023e92f4" containerID="4d43c7d4bdb98f9e776981a409e6b423d12e3f46827a144b8d0da0bc996bfc87" exitCode=0
Mar 20 17:16:03 crc kubenswrapper[4730]: I0320 17:16:03.398146    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567116-j4glj" event={"ID":"c93b527b-9082-468d-8a7d-9c40023e92f4","Type":"ContainerDied","Data":"4d43c7d4bdb98f9e776981a409e6b423d12e3f46827a144b8d0da0bc996bfc87"}
Mar 20 17:16:04 crc kubenswrapper[4730]: I0320 17:16:04.891076    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567116-j4glj"
Mar 20 17:16:04 crc kubenswrapper[4730]: I0320 17:16:04.997301    4730 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt4cr\" (UniqueName: \"kubernetes.io/projected/c93b527b-9082-468d-8a7d-9c40023e92f4-kube-api-access-bt4cr\") pod \"c93b527b-9082-468d-8a7d-9c40023e92f4\" (UID: \"c93b527b-9082-468d-8a7d-9c40023e92f4\") "
Mar 20 17:16:05 crc kubenswrapper[4730]: I0320 17:16:05.022785    4730 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c93b527b-9082-468d-8a7d-9c40023e92f4-kube-api-access-bt4cr" (OuterVolumeSpecName: "kube-api-access-bt4cr") pod "c93b527b-9082-468d-8a7d-9c40023e92f4" (UID: "c93b527b-9082-468d-8a7d-9c40023e92f4"). InnerVolumeSpecName "kube-api-access-bt4cr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:16:05 crc kubenswrapper[4730]: I0320 17:16:05.101708    4730 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt4cr\" (UniqueName: \"kubernetes.io/projected/c93b527b-9082-468d-8a7d-9c40023e92f4-kube-api-access-bt4cr\") on node \"crc\" DevicePath \"\""
Mar 20 17:16:05 crc kubenswrapper[4730]: I0320 17:16:05.426042    4730 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567116-j4glj" event={"ID":"c93b527b-9082-468d-8a7d-9c40023e92f4","Type":"ContainerDied","Data":"22953afd2d795cdc6d80eed1f4ba1a4fdf6c8530363cff9b77f1fe7cfdc698fc"}
Mar 20 17:16:05 crc kubenswrapper[4730]: I0320 17:16:05.426094    4730 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22953afd2d795cdc6d80eed1f4ba1a4fdf6c8530363cff9b77f1fe7cfdc698fc"
Mar 20 17:16:05 crc kubenswrapper[4730]: I0320 17:16:05.426116    4730 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567116-j4glj"
Mar 20 17:16:05 crc kubenswrapper[4730]: I0320 17:16:05.977687    4730 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567110-n8lgt"]
Mar 20 17:16:05 crc kubenswrapper[4730]: I0320 17:16:05.994559    4730 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567110-n8lgt"]
Mar 20 17:16:07 crc kubenswrapper[4730]: I0320 17:16:07.553074    4730 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe10df03-e254-4075-9487-78370bdbdf87" path="/var/lib/kubelet/pods/fe10df03-e254-4075-9487-78370bdbdf87/volumes"
Mar 20 17:16:12 crc kubenswrapper[4730]: I0320 17:16:12.880788    4730 patch_prober.go:28] interesting pod/machine-config-daemon-p5qvf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 17:16:12 crc kubenswrapper[4730]: I0320 17:16:12.881614    4730 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p5qvf" podUID="7fcd3db3-55f1-4c23-8fa9-78844495cea3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"